Is footprint still important for desktop applications?
I just read Tomas' blog entry titled Is footprint still important for embedded devices? His answer is a clear "Yes." Well, what about desktop applications? Why the heck are desktop applications so slow and memory-hungry?
Like Tomas, I bought my first computer in the late Seventies. I worked a vacation job for a month and bought the computer with a friend (50/50), because it was so expensive. It was a Commodore PET 2001 with 8 KB of RAM. Unlike Tomas, I programmed it in BASIC (assembly did not attract me). One of the first programs I got running was Joseph Weizenbaum's ELIZA: I got hold of a listing, typed it in, and modified it a bit. From then on I was fascinated by the art of programming: Modula-2, C++, Python, Java, showing a trend toward higher-level, "more expensive" languages and systems.
OK, back to the question. Let's look at IDEs as an example. For some reason, IDEs have always stressed desktop systems. Twenty years ago I used EMACS. At that time people joked that EMACS was an acronym for Extended Memory And Constantly Swapping. With SNiFF we had the same problem. And today, Eclipse stresses our desktops.
Yes, careful design and better algorithms and data structures can improve performance and footprint, but don't assume desktop application developers aren't already doing this. In all of those systems, considerable effort went into performance and footprint improvements. The real problem is somewhere else: identifying and fixing real hotspots is easy, but it is incredibly hard to optimize complex systems that have no real hotspots. If you run a memory analyzer or a profiler on such a system, you will hardly find a hotspot that contributes more than 5-10% of the overall footprint. And during development there are usually higher-priority tasks than optimizing a 5% problem.
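To make the "no real hotspots" point concrete, here is a minimal sketch (all function names and cost numbers are invented for illustration): a profile with one dominant function rewards optimization, while a "flat" profile smeared across many subsystems offers no single place where a big win is available.

```python
def share_of_top_function(profile):
    """Return the fraction of total cost attributed to the costliest entry.

    `profile` is a hypothetical mapping of function name -> cost
    (e.g. milliseconds or kilobytes), as a profiler might report it.
    """
    total = sum(profile.values())
    return max(profile.values()) / total

# A system with a real hotspot: optimizing parse() clearly pays off.
spiky = {"parse": 60, "render": 10, "layout": 10, "io": 20}

# A mature desktop application: cost is spread across many subsystems,
# none of which contributes more than a few percent.
flat = {f"subsystem_{i}": 5 for i in range(20)}

print(share_of_top_function(spiky))  # 0.6  -> one target worth optimizing
print(share_of_top_function(flat))   # 0.05 -> no single fix helps much
```

In the flat case, even eliminating the worst offender entirely saves only 5% of the total, which is exactly why such work keeps losing out to higher-priority tasks.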
Let's look at Eclipse. Where does this "distributed footprint problem" come from? Is it Java? No, Java is much more efficient than most people think. Unlike C++, Java managed to provide an environment where class libraries, toolkits and frameworks from different sources work well together. Why reinvent the wheel when someone has already done the work for you? You can focus on the application you want to create instead of getting lost in infrastructure. The price you pay is that the components you use are usually not tailored to your use case but designed for a wider range of applications. Such general libraries add overhead in performance and footprint, but they can drastically reduce the development effort.
Could we rewrite Eclipse (or Workbench) with a better footprint? We probably could. In some cases we could use a better overall design, better algorithms or data structures. Preserving the functionality Eclipse has today, we could maybe gain an overall factor of 2 or 3 if we are lucky (or good). But then it would probably be an island solution: we would end up running more separate applications in parallel, which would add to the overall footprint again.
What can you do to speed up your desktop applications today? In most cases, adding memory is the solution. Don't let your computer swap to the hard disk. Just go to the nearest computer shop, buy 2 GB of memory and plug it into your computer. At least this is what helped me.