Embedded = IT + 15 years?

The cryptic title refers to an observation many people make about the embedded industry: that it trails the IT industry by a number of years, and that most trends in the IT world will eventually find their way to the embedded world. This is especially true as we find our way to the world of DSO. See John’s blog about DSO.

That raises two questions:

  1. Why is it that Embedded is so "behind"?
  2. Do all IT trends come to Embedded?

I believe the answer to these questions lies somewhere in understanding what is different between the deceptively similar worlds of IT and Embedded. I will focus on the technical comparison between these worlds, even though a similar discussion could be had on the business side.

There are a lot of similarities; after all, both are about running software on processors. So trends around programming efficiency (higher-level languages, programming models, higher abstraction levels) and better hardware (faster, cheaper, fewer variants) should apply to both, right?

Let’s look at some of these trends:

Programming efficiency

In the IT world the trend has certainly been very clear. When it comes to programming languages we have moved from binary to assembly, to FORTRAN/COBOL/C, to Java/C++/C#, and sometimes up to modeling languages. With each of these higher-level languages we also typically get a higher abstraction level: we need to know less about the hardware we run on, and we get tons of standardized libraries that do a lot of the work we used to have to program ourselves. This makes it possible to reuse the code in the underlying platform and focus on what is new on top. Programmers become more efficient.

This added efficiency comes at a cost: we use more processing power, more memory, etc. for any given task than we would if we wrote the same application from the ground up. But it is worth it, since it is cheaper to throw hardware at the problem than to add programmers.

In a device the economics are sometimes different. A more powerful processor not only costs more, it also uses a lot more power, drains batteries faster, etc. For mass-produced cheap devices, the extra cost of a faster processor and more memory might make them too expensive.

Java is a great example of how the trade-offs are different for devices. Michael Scharf would argue that Java is as efficient as C, but of course he is wrong ;-) . The way Java gets decent performance on a workstation is through a technique called just-in-time (JIT) compilation: as you execute the code, you convert the Java bytecode to native code that the processor can run directly. A JIT compiler uses a lot of memory, since it needs a smart compiler on the target and it needs to cache the compiled code. This is of course a problem on a device, where you want both fast execution and a small footprint.

Another problem with higher-level languages such as Java is that the more you abstract from the hardware, the harder it is to control the timing of execution. Many devices have real-time requirements, meaning that you have to guarantee execution of code within a certain time window. A classic example is an airbag: you don’t want the Java garbage collector to kick in during exactly the millisecond when the airbag is supposed to inflate.

Hardware trends

Many hardware trends move quickly from the desktop to embedded. Take things like power-efficient processors, multicore, and cheaper memory: all of these trends quickly find their way into devices.

One thing that has not happened, at least not yet, is consolidation to a smaller number of processor architectures and variants. All the various minicomputers of the 70s and 80s have been replaced with just a couple of architectures, x86/IA32 being the dominant one. In embedded we still have many architectures, and even within one family there are endless variants. Individual PowerPC processors, for example, differ in things like hardware floating point, which comes in multiple flavors (no FP, standard PPC, AltiVec, e500 v1 and v2, …). This means that OS and tools vendors have to support a big matrix of variants. Contrast that with x86, where the same tools and OS run on all variants from multiple vendors. Why is that?

As with everything there are a number of reasons, including:

  • The need for backwards compatibility has been much greater in the IT space
  • The drive towards standards has been faster in the IT space
  • Workstations/servers/desktops have more similarities than say a phone and a network router
  • Devices do more specialized tasks

The standardization that is happening in the embedded space is very vertical; for example, most advanced phones use ARM-based processors these days.

The questions

So, did we answer the questions of why embedded is behind IT and whether it always will be? Not really ;-) .

But I am interested in your opinions!

In reality there are probably some things that will always be different (such as the economics of using higher level languages) and some things that could be much more similar. In particular I believe we need to change our thinking when it comes to driving standards. We need better standards in this space to become more efficient.

This is really what DSO is all about: Think differently and drive the industry to a higher level of efficiency.


  1. Michael

    I dunno, but the answer to the author’s question seems a bit obvious to me. Performance is everything in embedded!
    Response time, physical size, battery life, and most importantly COST are not something you can “just throw hardware at”. Developers are cheap (in the grand scheme of things)!
    Do you think the military cares if their hand-held satellite communications device uses open standards, or if it has a long battery life?
    Do you think most consumers care that their iPod is efficiently coded, or that it doesn’t skip when playing video?
    Do you care that your bluetooth headset is using a standard hardware platform, or that it costs $5 less than the competition?
    Most industries will jump through (many) hoops just to get the 5-10% edge over the competition. Abstraction is great for computer scientists, but not necessarily for engineers.
    In embedded, performance is everything!

  2. Rich Dubielzig

    I don’t think it’s accurate to say that the embedded industry is trailing IT so much as to accept that it has a different focus. In terms of power management, I think embedded development is actually ahead of general IT. The different processor architectures can be seen as a reflection of power management: the right one is chosen on the basis of its ability to deliver what we need and do as little else as possible.
    Real-time software is another area where embedded computing leads, in my obviously unbiased opinion. The only dominant IT OS that attempts real-time is Linux, and it wasn’t even attempted until long after the OS had been accepted on the desktop.
    I prefer to see it this way: Where embedded follows IT, it does it by taking the best of that industry, and missing the fads.

  3. Tomas

    I agree that embedded is actually ahead in many areas and it is certainly true that the trade-offs are very different.
    Perhaps it is only people coming into embedded from the IT space that believe we are behind ;-)
    Besides pointing out some very real differences that I believe will continue to make the two worlds different, I also believe that we can learn some things from IT. For example, why is it that we have to care about (and port our code to) tens of different timers? Surely there are things where we could get together and say “let’s do this the same way across the board”?

  4. Steve

    The standardisation on the desktop has come about because two companies (usually in small, but occasionally in large, increments) have added increasing value to the users (and consumers) of that desktop technology, resulting in a shut-out for everyone else. Microsoft and Intel have their competitors, sure, but these are relatively niche (or, sensibly, to win big, completely plug-compatible). Even in the open source world, applications are made “compatible” with Microsoft…
    If we could deliver affordable cost and a consistent set of needed features in the device world then this too would accelerate standardisation. The challenge though is what should those features be. Every device is different in terms of form, function and cost so finding the common denominator is proving to be very difficult.
    It is only by companies consolidating their technology portfolios, or applications actively standardising around one type of technology (such as Tomas’s ARM example in mobile telephony), that standardisation really becomes possible for the device developer. Till then we have to move toward this very desirable goal, but a small step at a time. I think DSO through standardisation will come about, but it will be small, incremental and steady progress rather than one giant leap.

Comments are closed.