Mobile Flash is Dead - The Battle of the Web Terminals is Over

Well, get ready for the gang pile - Mobile Flash is Dead.

There will be plenty said about this topic from business and marketing perspectives. But what does this represent from a technology perspective? More than anything, I think it's just another casualty in a long-standing battle among various approaches to Web Terminals.

Some Brief History

If you take a look at the IEEE Computer Society's Timeline of Computing History you can piece together some interesting highlights in computing:

  • The first computer "mainframes" in the 1950s were giant machines with no remote access.
  • The 1960s saw the development of "terminals" which allowed remote access to a mainframe.
  • The 1970s up to the mid-1980s marked the rise of "personal computers" - smaller systems dedicated to an individual user.
  • Beginning in the mid-1980s, "network computing" became the popular concept. "Client" software on personal computers would talk to mainframe "server" software accessible via computer networks.
  • In 1990, ARPANET is decommissioned and the Internet is officially commercialized. With the introduction of the web browser in 1993, the "web" is born.
  • The 2000s see a tremendous boom in web based applications and services.
  • In 2008, the Apple App Store is opened and mobile applications explode in popularity.

If you step back from that summary for a minute and look at it at a high level, these are the trends at work:

  • large computing systems with no network connectivity
  • large computing systems with remote terminals
  • small computing systems with no network connectivity
  • small computing systems with network connectivity to large computing systems via applications
  • small computing systems with network connectivity to large computing systems via browsers
  • small computing systems with network connectivity to large computing systems via applications

Somehow in the 2000s, the entire industry convinced itself that the web was going to solve all of computing's problems. I believe the reversal of this in the late 2000s is an indication of the mistake the industry made in trying to force everything onto the web.

Web Terminals

Based on that view of computing history, it's easy to see the web browser as just a throwback to the terminals of the early mainframe days. Client/server applications were considered difficult to build - the environment on the personal computer was just too complex and difficult to deal with (i.e. Windows). Wouldn't it be easier to have a dumb terminal (a web browser) and put the energy into building server applications in a more stable environment?

For many applications, this proved good enough. Those early browsers were pretty dumb terminals with plenty of limitations, but developers saw concrete benefits to that approach, at least for simple applications.

Then in 1995, along came Java. It was supposed to be the answer to all of our dreams - it would allow full use of the computing resources on the personal computer while delivering ease of development and deployment. It would start off as a browser plugin but would surely someday become the ultimate web smart terminal. Yet despite all of that, Java never fulfilled that vision - and in fact it is rarely used in browsers today.

The Rise of Flash

Why did Java fail to take over all of client/server computing and the web? In my opinion, one simple reason: sex appeal. What was really driving the growth of the Internet at that time was commercialization. And that commercialization was being driven by average people beginning to discover the Internet. Java was a visually ugly system with little ability to satisfy the sizzle and style needs of marketers and promoters who needed to connect with those new users.

The arrival of the Flash Player in 1997 (at a time when the web browser was still a very dumb terminal) met this need and then some. Java was languishing as the ugly stepchild, and the web browser's IQ had flatlined. All of a sudden, the Flash Player was the ultimate smart terminal. It opened up tremendous new opportunities for rich content on the web.

This state of affairs stayed relatively stable until about the mid-2000s. Java had long since given up on conquering the web, having conquered the back office instead. The web browser, meanwhile, was starting to learn some new tricks - though still not quite enough at that time.

The Fall of Flash

Apple infamously opened the topic of iPhone application development by telling its developers that all apps would be web apps - that this was the future. The idea was met with derision, and in 2008 native application development was introduced. For the first time in almost 15 years, a platform was driving growth in traditional client/server application development again. This was the first sign of trouble for Flash, the King of Web Terminals.

The next problem for Flash was that the web browser was finally starting to smarten up. Between Apple and Google, innovation in the web browser arena began to explode. Suddenly you could do the same things in open-standards browsers that had traditionally only been possible in Flash - a proprietary environment with expensive commercial tools.

Thus, today's news is big but, given the trends, pretty inevitable. Flash was the smartest web terminal on the block for many years. Once Apple and Google threw their resources behind the browser, however, it was only a matter of time before the browser killed Flash.

What Next?

When viewed as web terminal technology, the battle is pretty clear. However, what about the larger conflict? What is to become of native client application development versus web terminals?

Perhaps the reversal in trend back to native client applications is not a new trend away from web terminals but a correction of an earlier overreach, one that set back client/server computing development by many years.

Maybe what we are going to see is a new norm where web terminals (the browser) are used for simpler applications that don't need native access but do need wide platform and operating system support. At the same time, perhaps we will witness a renaissance in the development of native applications where performance, native API access and polished user interfaces are the highest priority.

If that's how it really does play out, adding Objective-C (iOS) and Java (Android) to your resume is probably a good career move.