4G, Tablets and Clouds: Welcome to the Third Wave

A telecoms tower
4G is a game changer

Right now tablets are only just usable, and yet they are set to outsell traditional laptops and desktops combined in 2015.

We have seen this before, twice. I have in the past talked about the second phase of the ICT revolution; I see now I was wrong. Actually we are about to enter the third phase. Such phases, or waves of change, are characterised by a new technology being taken up whilst it is still only just usable. This effect is the perfect indicator of massive change. If a technology is just OK then its take-up is progressive as it advances (think of diesel engines in cars), but if a technology is utterly transformational, we see people take it up despite it being very limited initially (think penicillin, aeroplanes and tablets).

Consider this: back in the mid-1940s computers really were only just usable. By the mid-1950s large corporations realised they should have them. By the mid-1960s they were essential. I was part of the micro-computer wave (the second phase). I remember trying to explain to my mother what micro-computers might be useful for; she quite rightly did not get it. They were not really much use for anything, but their potential, and their ability to do the few things they could do, was amazing.

Laptops were not a wave; they grew slowly. The utility of one is amazing, but not transformative. They are too heavy, too complex and just not a revolution. Putting a steam engine on wheels changed the world completely and utterly. Laptops were like high-pressure steam engines, really useful but not a revolution; the iPad is Stephenson's Rocket, and with it we enter a new era. Putting tablets on 4G will be, is being, just as big if not bigger. Steam engines on rails (let us call them trains) meant people could travel at super-animal speeds for the first time. One day we were limited to a galloping horse, the next we were not; suddenly we could travel where we wanted at speeds which made such transport practical. Putting tablets on 4G means we no longer have to travel. With a tablet and good 4G access we can be everywhere at the same time. Yes, we thought video conferencing would kill off travel, but it was too clunky in the computer-based 20th-century world. Video conferencing from your bedroom from a device the size of your hand - that is the true revolution.

The devil is in the detail. If you work in a modern office environment you might well have had the experience of desperately needing a coffee but being stuck on a conference call (of doom). I found the solution two days ago; I used my phablet (phones and tablets are really just the same things now - I call them phablets). I put on my headset, attached it to my iPhone and made a coffee whilst on the call. That meant the 5 minutes I would have spent making the coffee after the call was not unproductive time.

Similarly, I am sat on a train writing this using my larger phablet... Actually, now I am on the tube! Have you ever tried to use a laptop on the London Underground? Don't bother, it is really, stupidly hard. Now I am finishing this paragraph sat by the pool in a finca in Spain. Can you tell?

What we are looking at is a new level of mobility; a situation where phablets, with some more development, will start to shift the way we think about life at a cultural level. We will break down the age-old correlation between location and activity. My wife already does the week's food shopping whilst we sit in bed. At the same time I often do a bit of research and reading. If she is not shopping she is researching, and I am surfing YouTube: all on our phablets. Such a disconnection between location and activity is very new. What meaning has a "reading chair" or a "listening room" when we can listen and read using the same device whilst sitting in a cable car on the way to skiing?

As we disentangle our prejudices over location and activity for leisure, so we will for work. Not only at the level of remote working but also with working times. It does not matter where or when people work if they can still communicate; I communicate with my Chinese team members from the train and those from the US whilst in the bath. From delivery drivers to doctors, people can fit their time and location around their needs and the needs of their end users; no longer shall we all be synchronised to some external rigid clock simply to allow some form of primitive coordination.

Critically, none of this could happen, and would not have started to happen, without HSDPA (a forerunner of 4G) and cloud computing. All that flexibility needs flexible networking and powerful computing. Carrying around enough processing and storage to work hyper-flexibly is not practical or efficient. Apart from anything else, we as users want low latency, not high throughput: we don't want to wait for anything, but we only use compute power sporadically. Connecting a powerful enough machine (the phablet) to a cloud compute back end over 4G makes sense in a way that lugging around a really powerful computer with terabytes of storage does not. But that does require a lot of mobile networking power. In all honesty, in the UK, even where we have 4G, that promise has not quite been met yet. The bandwidth on cell towers does not seem good enough, and there are far too many dead spots; but it will happen, and when it does we will be in a new world which plays by different rules to the one we are used to.


If Skynet Were To Happen Tomorrow - It Would Run On ARM

The Most Important IT Hardware Headquarters In The World

ARM Headquarters - Cambridge UK
Wikicommons: http://en.wikipedia.org/wiki/ARM_Holdings#mediaviewer/File:Cambridge_ARM_building_panorama.jpg
According to the latest punditry, tablets will outsell PCs in 2015. Seriously though - smartphones are already more used and more powerful than PCs were only a few years ago. When you add the two together (after all, tablets are just big smartphones) then PCs are looking like a small player in the user end of the IT market. Macs and Linux machines don't make much difference to the figures. Basically, phones/tablets (let's call them phablets) are taking over the world.

They run on ARM

Whilst AMD and Intel slog it out on the server side, they have precious little traction in the phablet market - only a few percent. Nothing is going to change this because... why should it? Intel is all about the x86 and x86_64 architectures, as is AMD. This architecture family is hopelessly bad at low power. Lots of work can make it a bit better - but it is always going to be playing catch-up with ARM.

ARM don't make chips - they license the intellectual property; but so what? The point is that the world is already owned by ARM technology. If Skynet were to happen tomorrow, it would run on ARM.

Now - I spend every working day squeezing nanoseconds from Intel-based instruction sets. It is fun and it will continue to be critical, because ARM has had little impact on the server side. The ARM architecture is not great for the server. New tech like their silicon photonics and super-fast SSDs will all help Intel to consolidate their position on the server side. I hope AMD will do similar things as well. But - and it is a big BUT - that little office in Cambridge is the headquarters of the most important IT hardware company in the world right now.

Musings On A Year Of Living C++

Time to take to the high ground and get some perspective.
Copyright Dr Alexander J Turner all rights reserved.
It was just over a year ago that I started my first 100% pure C++11 project. What has it been like?

I had played with C++11 and with versions of compilers which had many of the features of C++11. I had worked in C++ for years. But I had not had the amazing chance to write a complex system in C++11 from the ground up until this year. Now that is a very, very different proposition to patching existing code. A clean-slate project allows new coding paradigms to form rather than inevitably evolving from the existing ones. The final piece of the picture fell into place just these last two weeks when I introduced a young programmer to C++ for the first time (in a commercial setting). He was introduced to C++11 and the effect was quite astounding. Coming from a more Java background, he did not just throw up his hands in despair at all the pointers and edge cases; no, he found a safe, clean C++11 style and was productive immediately.

C++11 is a superset of previous C++. But thinking of it this way completely misses the point. It is much better to think of C++11 as a cut-down and cleaned-up version of C++03. What! Yes, I really do mean that. We don't have to use as much of the language to achieve the same things. In older styles of C++ we had to jump through hoops to do relatively simple things.

Consider functors, for example. To customise a standard sort one had to define a functor, which involved writing a whole class and overloading its () operator. All that has gone and we can simply define a lambda in line.
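To make that concrete, here is a minimal sketch of my own (the ByLength functor and the sort-by-length criterion are invented purely for illustration):

#include <algorithm>
#include <string>
#include <vector>

// The old way: a whole class whose only job is to customise the comparison.
struct ByLength {
    bool operator()(const std::string& a, const std::string& b) const {
        return a.size() < b.size();
    }
};

void sort_old(std::vector<std::string>& words) {
    std::sort(words.begin(), words.end(), ByLength());
}

// The C++11 way: the comparison is defined in line as a lambda.
void sort_new(std::vector<std::string>& words) {
    std::sort(words.begin(), words.end(),
              [](const std::string& a, const std::string& b) {
                  return a.size() < b.size();
              });
}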

Similarly with move semantics (std::move and friends). Now, it is a bit complex to get one's head around how to implement move constructors and the like. However, most of the time we don't need to do that; we can just use them. The point being that in previous C++ versions we might have had to jump through hoops to avoid making copies of objects. Now we do not need to worry. Say we are pushing strings onto a vector of strings: we can use push_back with an rvalue, or emplace_back with the appropriate constructor arguments. No copying will happen; moving will happen instead. In other words, the language directly expresses the intent of the programmer.

my_vector.emplace_back("Hello");

It is completely obvious what the programmer intended here. It is efficient and does not require a lot of superstructure to achieve that efficiency.
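Here is a slightly fuller sketch of the same idea (the variable names are mine, purely for illustration); in every case the string data is moved or constructed in place, never copied:

#include <string>
#include <utility>
#include <vector>

int main() {
    std::vector<std::string> my_vector;

    // A temporary is an rvalue, so push_back moves it into the vector.
    my_vector.push_back(std::string("Hello"));

    // std::move turns an existing object into an rvalue; after this
    // line, greeting should be treated as hollowed out and not reused.
    std::string greeting("World");
    my_vector.push_back(std::move(greeting));

    // emplace_back constructs the string in place from its arguments.
    my_vector.emplace_back("Hello again");
}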

Which leads us to pointers. Well, most of the time we do not need them any more. They can be nice for some stuff. For example, they are a really handy way of keying things into maps. But in general, they are no longer required as a high-speed technique in the way they once were. I really wish the std::reference_wrapper stuff in C++11 were just a little easier to use and a little less clunky; if that were the case then we would almost never need to use pointers explicitly (implicit stuff like std::string("This is actually represented by a pointer to const char") does not really matter).
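For what it is worth, this is the sort of thing I mean (a tiny sketch, with invented names); std::reference_wrapper does the job, it is just clunkier than it could be:

#include <functional>
#include <iostream>
#include <string>
#include <vector>

int main() {
    std::string a("alpha");
    std::string b("beta");

    // A container of references, with no raw pointers in sight.
    std::vector<std::reference_wrapper<std::string>> refs{a, b};

    refs[0].get() += "!";         // the .get() calls are the clunky part
    std::cout << a << std::endl;  // prints "alpha!"
}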

So back to showing someone with Java experience how to write something in C++11. I gave a few rules to make life easier and off he went (a small sketch applying them follows the list):

1) Use const as much as possible
2) Use for(const auto& x: container) as a pattern wherever possible for iteration
3) Pass everything as const reference or reference unless there is a very, very good reason not to. Copy parameters and pointers should be avoided.
4) Use auto as much as you can for variable definitions.
5) Write as many functions as you need; feel free to break up code into lots of little functions.
6) But if you go the route of 5, use an anonymous namespace to hide all the little functions from the global namespace.
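
To show what those rules add up to, here is a tiny sketch of my own (the word-counting task is invented purely to exercise the rules):

#include <iostream>
#include <string>
#include <vector>

// Rule 6: little helper functions hidden in an anonymous namespace.
namespace {

// Rule 3: parameters passed by const reference.
bool is_long(const std::string& word) {
    return word.size() > 4;
}

int count_long_words(const std::vector<std::string>& words) {
    // Rule 4: auto for variable definitions.
    auto count = 0;
    // Rule 2: range-based for with const auto&.
    for (const auto& word : words) {
        if (is_long(word)) {
            ++count;
        }
    }
    return count;
}

} // anonymous namespace

int main() {
    // Rule 1: const as much as possible.
    const std::vector<std::string> words{"tiny", "enormous", "modest"};
    std::cout << count_long_words(words) << std::endl; // prints 2
}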

Closing Thoughts
Well, this is not some detailed discussion; it is just musings. However, I do have some fairly clear thoughts. C++11 is a new language. Yes, it contains the older C++ flavours, but it does not need to be written that way. It can be written in a much cleaner and higher-level fashion. What is more, it is more productive, to the point where its productivity rivals that of Java (and exceeds it in many ways).

C++11 is also well on its way to platform independence. I know that sounds daft, but because it now lends itself to higher-level programming, it is much easier to avoid platform-dependent code in the main logic. It can all be hidden in libraries. For example, standard features like std::atomic and high-resolution clocks make these two classic areas of platform dependence go away.
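By way of a small sketch (of my own, not from any particular project), both of the following once needed platform-specific code and are now simply standard:

#include <atomic>
#include <chrono>
#include <iostream>
#include <thread>

int main() {
    // std::atomic replaces platform-specific interlocked/builtin calls.
    std::atomic<long> counter(0);

    const auto start = std::chrono::high_resolution_clock::now();

    std::thread worker([&counter] {
        for (long i = 0; i < 1000000; ++i) {
            ++counter;
        }
    });
    worker.join();

    const auto taken = std::chrono::high_resolution_clock::now() - start;
    std::cout << counter << " increments in "
              << std::chrono::duration_cast<std::chrono::microseconds>(taken).count()
              << " microseconds" << std::endl;
}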

C++11 marks the end of the 'inevitable decline' of C++. The language has proven itself to be an ongoing, competitive development system. With the end of ever-increasing clock speeds, and with some large problems being seen with ever-larger memory consumption (bigger memory means more cache pressure, which reduces performance), the basic advantage of C++ over something like Java - the ability to tune the heck out of it - is rightfully causing something of a resurgence of interest.