Embedded = IT + 15 years?
The cryptic title refers to an observation many people have about the embedded industry: that it trails the IT industry by a number of years, and that most trends in the IT world eventually find their way to the embedded world. This is especially true as we find our way to the world of DSO. See John's blog about DSO.
That raises two questions:
- Why is it that Embedded is so "behind"?
- Do all IT trends come to Embedded?
I believe the answer to these questions lies somewhere in understanding what is different between the deceptively similar worlds of IT and Embedded. I will focus on the technical comparison between these worlds, even though there could be a similar discussion on the business side.
There are a lot of similarities; after all, both are about running software on processors. So trends around programming efficiency (higher-level languages, programming models, higher abstraction levels) and better hardware (faster, cheaper, fewer variants) should apply to both, right?
Let's look at some of these trends:
In the IT world the trend has certainly been very clear. When it comes to programming languages we have moved from binary to assembly, to FORTRAN/COBOL/C, to Java/C++/C#, and sometimes up to modeling languages. With each of these higher-level languages we also typically get a higher abstraction level: we need to know less about the hardware we run on, and we get tons of standardized libraries that do a lot of the work we used to have to program ourselves. This makes it possible to reuse the code in the underlying platform and focus on what is new on top. Programmers become more efficient.
This added efficiency comes at a cost: we use more processing power, more memory, etc. for any given task than we would if we wrote the same application from the ground up. But it is worth it, since it is cheaper to throw hardware at the problem than to add programmers.
In a device the economics are sometimes different. A more powerful processor not only costs more, it also uses more power, draining batteries faster, and so on. For cheap, mass-produced devices, the extra cost of a faster processor and more memory can price the product out of the market.
Java is a great example of how the trade-offs are different for devices. Michael Scharf would argue that Java is as efficient as C, but of course he is wrong ;-). The way Java gets decent performance on a workstation is by using a technique called just-in-time (JIT) compilation: as you execute the code, the runtime converts the Java byte-code into native code that the processor can run directly. A JIT compiler uses a lot of memory, since it needs a smart compiler on the target and it needs to cache the compiled code. That is a problem on a device, where you want fast execution and a small footprint.
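You can see the JIT effect for yourself on a desktop JVM with a toy experiment like the one below. It times the same small method once cold (likely interpreted) and once after thousands of warm-up calls (likely JIT-compiled to native code). The class and method names are my own for illustration, and the iteration counts are arbitrary; this is a rough sketch, not a proper benchmark.

```java
// WarmupDemo: a rough sketch of observing JIT warm-up on a desktop JVM.
// Names and iteration counts are illustrative assumptions, not a real benchmark.
public class WarmupDemo {
    // A small hot method that the JVM will eventually compile to native code.
    static long sumOfSquares(int n) {
        long s = 0;
        for (int i = 0; i < n; i++) {
            s += (long) i * i;
        }
        return s;
    }

    // Time one invocation in nanoseconds.
    static long timeOnce(int n) {
        long t0 = System.nanoTime();
        sumOfSquares(n);
        return System.nanoTime() - t0;
    }

    public static void main(String[] args) {
        long cold = timeOnce(10_000);           // first call: likely interpreted
        for (int i = 0; i < 20_000; i++) {      // hammer the method so the JIT kicks in
            sumOfSquares(10_000);
        }
        long warm = timeOnce(10_000);           // same call after warm-up
        System.out.println("cold: " + cold + " ns, warm: " + warm + " ns");
    }
}
```

On a typical desktop JVM the warm number ends up much smaller than the cold one, and that speed-up is exactly what costs memory: the compiler and the cached native code live alongside your application.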
Another problem with higher-level languages such as Java is that the more you abstract from the hardware, the harder it is to control the timing of execution. Many devices have real-time requirements, meaning that you have to guarantee execution of code within a certain time window. A classic example is an airbag: you don't want the Java garbage collector to kick in during exactly the millisecond when the airbag is supposed to inflate.
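The sketch below is a toy illustration of the point: it allocates some garbage and then measures how long an explicitly requested collection takes. The class and method names are my own, the allocation sizes are arbitrary, and `System.gc()` is only a hint to the JVM; the point is that however long the pause turns out to be, its length is decided by the runtime, not by your application.

```java
// GcPauseDemo: a toy illustration (not a real-time test) of why GC pauses
// worry device developers: the pause length is outside the application's control.
// Names and allocation sizes are illustrative assumptions.
import java.util.ArrayList;
import java.util.List;

public class GcPauseDemo {
    // Create some garbage, then time an explicitly requested collection.
    public static long measureGcPauseNanos() {
        List<byte[]> junk = new ArrayList<>();
        for (int i = 0; i < 1_000; i++) {
            junk.add(new byte[10_240]);   // ~10 MB of short-lived objects
        }
        junk.clear();                     // make it all unreachable
        long t0 = System.nanoTime();
        System.gc();                      // request a collection (a hint, not a guarantee)
        return System.nanoTime() - t0;
    }

    public static void main(String[] args) {
        long pause = measureGcPauseNanos();
        System.out.println("GC request took " + pause + " ns");
    }
}
```

A hard real-time system has to bound its worst case, and a pause whose duration you can neither predict nor bound is exactly what you cannot afford in the airbag scenario.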
Many hardware trends move quickly from the desktop to embedded. Take power-efficient processors, multicore, and cheaper memory: all of these quickly find their way into devices.
One thing that has not happened, at least not yet, is the consolidation to a smaller number of processor architectures and variants. All the various mini-computers of the 70s and 80s have been replaced with just a couple of architectures, x86/IA32 being the dominant one. In embedded we still have a lot of architectures, and even within one family there are endless variants. Individual PowerPC processors, for example, differ in areas such as hardware floating point, which comes in multiple flavors (no FP, standard PPC, Altivec, E500 v1 and v2, ...). This means that OS and tools vendors have to support a big matrix of variants. Contrast that with x86, where the same tools and OS run on all variants from multiple vendors. Why is that?
As with everything there are a number of reasons, including:
- The need for backwards compatibility has been much greater in the IT space
- The drive towards standards has been faster in the IT space
- Workstations/servers/desktops have more similarities than say a phone and a network router
- Devices do more specialized tasks
The standardization that is happening in the embedded space is very vertical; for example, most advanced phones use ARM-based processors these days.
So, did we answer the questions of why embedded is behind and whether it will always be behind IT? Not really ;-).
But I am interested in your opinions!
In reality there are probably some things that will always be different (such as the economics of using higher-level languages) and some things that could be much more similar. In particular, I believe we need to change our thinking when it comes to driving standards. We need better standards in this space to become more efficient.
This is really what DSO is all about: Think differently and drive the industry to a higher level of efficiency.