The Power – And Tyranny – of Abstractions

One of my favorite toolkits is Qt from Nokia. Qt is a toolkit for developing cross-platform GUIs that is available under both an open source and a commercial license. It was first developed in 1991 and led to the formation of Trolltech in 1994. In 2008, Nokia bought Trolltech, which caused quite a stir in the Qt community (see the comments on this Qt Labs announcement of the acquisition).

Trolltech had made a considerable investment in mobile GUI and platform development for Qt. The fear was that Nokia would close-source Qt and shut down development of the desktop side. To their credit, Nokia promised the community they would not do this – and they've been true to their word. Ars Technica (which always has the best coverage of Qt) recently covered the new features coming in Qt 5.

Qt started off as a pure C++ toolkit, and with Qt 5 Nokia has made it clear that the future of Qt is now their Qt Quick technology – a declarative GUI syntax with a JavaScript runtime. (Sound familiar?)

Before getting into Qt Quick, it's worth stepping back and looking at the big picture – the picture of how Qt relates to its surroundings.

Abstractions and Layers

Qt's original reason for existence was to create a new GUI toolkit abstraction on top of multiple operating systems, each with its own native GUI toolkit. For instance, on Windows, you would write an application in C++ referencing the Qt QWidget family of controls, which were themselves implemented on top of the Windows GDI.

Qt's appeal was that if you then wanted to run that same application on Mac OS X or Linux, you could take your C++ source code and, with just minimal changes, recompile and run your application on those platforms. The QWidget control implementations on the other platforms would translate the Qt calls into Cocoa or X11 calls as needed.
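To make that concrete, here is a minimal sketch of the kind of code being described. The classes are standard Qt (QApplication, QPushButton), but the program itself is purely illustrative and not taken from any particular project; the same source is meant to compile unchanged against the Windows, Mac OS X and Linux builds of Qt.

    // main.cpp - a minimal Qt widget application (illustrative sketch)
    // The same C++ source recompiles unchanged on Windows, Mac OS X and Linux;
    // Qt maps the QWidget-based controls onto GDI, Cocoa or X11 underneath.
    #include <QApplication>
    #include <QPushButton>

    int main(int argc, char *argv[])
    {
        QApplication app(argc, argv);         // sets up the platform-specific GUI plumbing
        QPushButton button("Hello from Qt");  // a QWidget-derived control
        button.resize(200, 60);
        button.show();                        // drawn via the platform's native layer
        return app.exec();                    // enters the platform's event loop
    }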

In essence, what Qt did was create a new windowing abstraction in their toolkit on top of each platform's native GUI toolkit. This new abstraction was very powerful for developers who needed to build cross-platform applications and didn't want to write a completely separate application for each platform.

Everything about this scenario sounds great on paper. The clouds only start to gather when you actually get down into the details. And that's when the abstraction begins to bear down upon you.

The Tyranny of Abstraction

Steve Jobs famously posted his open letter "Thoughts on Flash" during his battles with Adobe. In it, he focuses on the issues of Flash on iOS – but he might as well have been talking about Qt:

We know from painful experience that letting a third party layer of software come between the platform and the developer ultimately results in sub-standard apps and hinders the enhancement and progress of the platform. If developers grow dependent on third party development libraries and tools, they can only take advantage of platform enhancements if and when the third party chooses to adopt the new features. We cannot be at the mercy of a third party deciding if and when they will make our enhancements available to our developers.

This becomes even worse if the third party is supplying a cross platform development tool. The third party may not adopt enhancements from one platform unless they are available on all of their supported platforms. Hence developers only have access to the lowest common denominator set of features. Again, we cannot accept an outcome where developers are blocked from using our innovations and enhancements because they are not available on our competitor’s platforms.

What happens with technology like Qt is you are locked into their abstraction and as a result locked out of the platform's native abstractions – both for good and bad. The good is that you don't have to deal with the platform differences. The bad is that you don't get to take advantage of the platform differences.

The Chains That Bind You

Now don't get me wrong; I'm a big fan of Qt. But you have to be realistic about the tradeoffs that its abstractions impose on you and your project. For comparison's sake, take PyQt as an example.

It would be easy to take potshots at C++, but let's just say we prefer Python instead. PyQt layers a Python abstraction over Qt so you can use it from that language. The difference between this layer (Python over the C++ API) and the layer Qt places over the native platform's GUI is its thickness. PyQt is a very thin abstraction, really more of a "translation" than a new abstraction. There is very little of Qt you cannot access and use through PyQt.
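As a rough sketch of what that thinness looks like, here is the same kind of minimal program written against PyQt (PyQt4-style imports; treat it as illustrative, since module layout varies between releases). The class and method names map almost one-to-one onto their C++ counterparts.

    # A minimal PyQt program (illustrative sketch, PyQt4-style imports).
    # The API is a near word-for-word translation of the C++ version:
    # same classes, same methods, just Python syntax.
    import sys
    from PyQt4.QtGui import QApplication, QPushButton

    app = QApplication(sys.argv)
    button = QPushButton("Hello from PyQt")
    button.resize(200, 60)
    button.show()
    sys.exit(app.exec_())  # exec_() because 'exec' is a reserved word in Python 2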

This concept is what takes us back to Qt Quick. Qt has traditionally been regarded as a "cross-platform application framework". With this new emphasis on Qt Quick, I believe Nokia is taking a serious risk. Qt Quick is not a thin abstraction like PyQt on top of the C++ base. It consists of new concepts, behaviors, syntax and languages. In other words, it is another thick abstraction on top of the already thick abstraction of the C++ framework.
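For a sense of how different that layer is, here is a small Qt Quick snippet (again purely illustrative): the interface is declared in QML rather than constructed in C++, and the behavior is an inline JavaScript expression.

    // Hello.qml - a small Qt Quick scene (illustrative sketch)
    // The structure is declared rather than built imperatively, and the
    // onClicked handler is JavaScript evaluated by the QML runtime.
    import QtQuick 1.1

    Rectangle {
        width: 200; height: 60
        color: "lightsteelblue"

        Text {
            anchors.centerIn: parent
            text: "Hello from Qt Quick"
        }

        MouseArea {
            anchors.fill: parent
            onClicked: parent.color = "steelblue"  // inline JavaScript, not C++
        }
    }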

No-Win Scenario?

If Nokia doesn't manage this transition well, I believe they risk turning Jobs' lament about Flash on iOS into a prophetic prediction of Qt's demise. If Qt Quick becomes another thick abstraction on top of an already thick C++ abstraction, none of the negative effects Jobs points out will be alleviated. They will only be compounded – and they are already being felt.

On the other hand, let's say Nokia pulls this off and the result is a single, coherent Qt Quick abstraction between your application and the native platform. An abstraction that includes a declarative GUI language. And a JavaScript runtime. Sounds an awful lot like a web browser.

After all of this work, will they end up with something the market regards as nothing more than a proprietary web browser? And at the same time will they expose themselves to new competitors with a non-standard solution in a market dominated by standards-based solutions?

Only time will tell.

Mobile Flash is Dead - The Battle of the Web Terminals is Over

Well, get ready for the gang pile - Mobile Flash is Dead.

There will be plenty said about this topic from business and marketing perspectives. But what does this represent from a technology perspective? More than anything, I think it's just another victim in a long-standing battle amongst various approaches to Web Terminals.

Some Brief History

If you take a look at the IEEE Computer Society's Timeline of Computing History you can piece together some interesting highlights in computing:

  • The first computer "mainframes" in the 1950s were giant mechanisms with no remote access.
  • The 1960s saw the development of "terminals", which allowed remote access to a mainframe.
  • The 1970s up to the mid-1980s marked the rise of "personal computers" - smaller systems dedicated to an individual user.
  • Beginning in the mid-1980s, "network computing" became the popular concept. "Client" software on personal computers would talk to mainframe "server" software accessible via computer networks.
  • In 1990, ARPANET was decommissioned and the Internet was officially commercialized. With the introduction of the web browser in 1993, the "web" was born.
  • The 2000s saw a tremendous boom in web-based applications and services.
  • In 2008, the Apple App Store opened and mobile applications exploded in popularity.

If you step back from that summary for a minute and look at it at a high level, these are the trends at work:

  • large computing systems with no network connectivity
  • large computing systems with remote terminals
  • small computing systems with no network connectivity
  • small computing systems with network connectivity to large computing systems via applications
  • small computing systems with network connectivity to large computing systems via browsers
  • small computing systems with network connectivity to large computing systems via applications

Somehow in the 2000s, the entire industry convinced itself that the web was going to be the solution to all of computing's problems. I believe the reversal of this trend in the late 2000s is an indication of the mistake the industry made in trying to force everything onto the web.

Web Terminals

Based on that view of computing history, it's easy to see the web browser as just a throwback to the terminals of the early mainframe days. Client/server applications were considered difficult to build - the environment on the personal computer was just too complex and difficult to deal with (i.e. Windows). Wouldn't it be easier to have a dumb terminal (a web browser) and put the energy into building server applications in a more stable environment?

For many applications, this proved good enough. Those early browsers were pretty dumb terminals with many limitations, but many developers saw concrete benefits to that approach, at least for simple applications.

Then in 1995, along came Java. It was supposed to be the answer to all of our dreams - it would allow full use of all of the computing resources on the personal computer while delivering ease of development and deployment. It would start off as a browser plugin but would surely someday become the ultimate smart web terminal. Yet, despite all of that, Java never fulfilled that vision - and in fact it is rarely used in browsers today.

The Rise of Flash

Why did Java fail to take over all of client/server computing and the web? In my opinion, for one simple reason: a lack of sex appeal. What was really driving the growth of the Internet at that time was commercialization. And that commercialization was being driven by average people beginning to discover the Internet. Java was a visually ugly system with little ability to satisfy the sizzle and style needs of the marketers and promoters who needed to connect with those new users.

The arrival of the Flash Player in 1997 (at a time when the web browser was still a very dumb terminal) met this need and then some. Java was languishing as the ugly stepchild, and the web browser's IQ had flatlined. All of a sudden, the Flash Player was the ultimate smart terminal. It opened up tremendous new opportunities for rich content on the web.

This state of affairs stayed relatively stable until about the mid-2000s. Java had long given up on conquering the web, having conquered the back office instead, but the web browser was starting to learn some new tricks - though still not quite enough at that time.

The Fall of Flash

Apple infamously opened the subject of iPhone application development by telling all of their developers that apps would be web apps - that was the future. This was met with derision, and in 2008 native application development was introduced. For the first time in almost 15 years, a platform began driving the growth of traditional client/server application development again. This was the first sign of problems for Flash, the King of Web Terminals.

The next problem for Flash was that the web browser was finally starting to smarten up. Between Apple and Google, innovation in the web browser arena began to explode. Suddenly, you could start doing in open, standards-based browsers the same things that had traditionally only been possible in Flash - a proprietary environment with expensive commercial tools.

Thus, today's news is big, but pretty inevitable given the trends. Flash was the smartest web terminal on the block for many years. But once Apple and Google threw their resources behind the browser, it was just a matter of time before the browser killed it.

What Next?

When viewed as web terminal technology, the battle is pretty clear. However, what about the larger conflict? What is to become of native client application development versus web terminals?

Perhaps the reversal in trend back to more native client application development is not a new trend away from web terminals but more of a correction of an earlier overreach that set client/server computing development back many years.

Maybe what we are going to see is a new norm develop where web terminals (the browser) are used for simpler applications that don't need native access but do need wide platform and operating system support. At the same time, perhaps we will witness a renaissance in the development of native applications where performance, native API access and polished user interfaces are the highest priority.

If that's how it really does play out, adding Objective-C (iOS) and Java (Android) to your resume is probably a good career move.