In this month's column, let's briefly review the many changes we've seen in applications over the past decade.
By 2000, the client-server applications of the 1990s were giving way to three-tier, web-based applications. However, these early web applications were characterized by static content and relatively primitive user interfaces. Web applications improved over the decade, with increasingly dynamic and customizable content. Ajax and related technologies revolutionized the richness and flexibility of web-based applications by allowing JavaScript programs running in the browser to interact with the user while exchanging data asynchronously with the back-end server. In 2000, few believed that serious desktop applications such as spreadsheets and word processors could be web-based. Today, however, web applications represent a credible alternative to traditional desktop applications.
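To make the Ajax pattern concrete, here is a minimal sketch of the kind of asynchronous request that underpinned these richer interfaces. The endpoint URL, function names, and page element are hypothetical, invented for illustration; applications of the era typically wrapped this boilerplate in a JavaScript library.

    // A minimal Ajax-style request, assuming a hypothetical /api/stock-quote
    // endpoint that returns plain text. The classic XMLHttpRequest API lets
    // the page update itself without a full reload.
    function fetchQuote(symbol: string, onResult: (quote: string) => void): void {
      const xhr = new XMLHttpRequest();
      // The third argument, true, makes the request asynchronous: the user
      // interface stays responsive while the server does its work.
      xhr.open("GET", "/api/stock-quote?symbol=" + encodeURIComponent(symbol), true);
      xhr.onreadystatechange = () => {
        if (xhr.readyState === XMLHttpRequest.DONE && xhr.status === 200) {
          onResult(xhr.responseText); // hand the server's reply to the UI
        }
      };
      xhr.send();
    }

    // Usage: update one part of the page when the response arrives.
    fetchQuote("ORCL", (quote) => {
      document.getElementById("quote")!.textContent = quote;
    });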
Web applications in which the content is generated through collaboration and user contribution - together with the richer Ajax interfaces - became known as "web 2.0." These web 2.0 applications, including Facebook, Wikipedia, Twitter, and the like, now represent some of the largest applications in existence in terms of data volumes and user populations.
As web technologies became more powerful, delivery of business and corporate applications across the internet became feasible. Salesforce.com, with its web-based CRM application, pioneered this software as a service (SaaS) approach, while Google took a dominant role in deploying desktop applications over the internet.
In terms of application development platforms, we saw Java go from being the new kid on the block to achieving an almost COBOL-like dominance in business application development. Microsoft .NET, announced early in the decade, emerged as a credible alternative to Java on Microsoft platforms, and C# rapidly replaced Visual Basic as the language of choice for Microsoft development. Open source languages such as Perl and PHP continued to enjoy widespread use, while newer open source languages such as Ruby and Python gained increasing mindshare.
Open source applications were somewhat niche at the beginning of the decade but are now clearly mainstream. Credible open source alternatives exist for almost every category of application, as well as for every component of the application stack.
Serious applications running on mobile phones or PDAs had long been anticipated, and there were sporadic attempts to create a software market around the Palm and Windows Mobile platforms. However, it was the Apple iPhone and the Apple App Store that really triggered an explosion of applications for mobile devices. The Apple App Store represented a renaissance for the small "micro-ISV" software developer, opening up a new market with millions of consumers.
Web services and service-oriented architecture generated a lot of enthusiasm in the early part of the decade. Application functionality exposed as standard APIs across the internet promised to enable composite applications that combined capabilities in new and useful ways. Standards-based web services failed to deliver on these early promises, although simpler approaches, such as "mashups," formed the basis for many interesting web applications.
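As an illustration of the mashup idea, the sketch below joins two JSON feeds in the browser to produce a combined view. Both endpoints, and the shapes of their responses, are assumptions invented for this example rather than any real service's API.

    // A toy mashup: annotate store locations (one hypothetical feed) with
    // weather data (another hypothetical feed), joined on city name.
    interface Store { name: string; city: string; }
    interface Weather { city: string; tempC: number; }

    // Small helper that fetches and parses a JSON document asynchronously.
    function getJson<T>(url: string, onResult: (data: T) => void): void {
      const xhr = new XMLHttpRequest();
      xhr.open("GET", url, true);
      xhr.onreadystatechange = () => {
        if (xhr.readyState === XMLHttpRequest.DONE && xhr.status === 200) {
          onResult(JSON.parse(xhr.responseText) as T);
        }
      };
      xhr.send();
    }

    // Fetch both feeds, then merge them into a single combined listing.
    getJson<Store[]>("/api/stores", (stores) => {
      getJson<Weather[]>("/api/weather", (weather) => {
        const byCity = new Map(weather.map((w) => [w.city, w.tempC] as const));
        for (const store of stores) {
          console.log(`${store.name} (${store.city}): ${byCity.get(store.city) ?? "?"}°C`);
        }
      });
    });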
Virtualization has played a part in mainframe and midrange computing for many years, but its widespread use really took off in the second half of the decade. Products such as VMware ESX made virtualization of production servers practical, providing compelling reductions in hardware, power, and management overhead.
Toward the end of the decade, virtualization technologies, together with SaaS applications and increasingly ubiquitous broadband internet connectivity, suggested that a new revolution, "cloud computing," was upon us. While many aspects of cloud computing are undoubtedly overhyped, most observers agree that a significant paradigm shift is in progress. It seems highly likely that over the next decade, our application data and logic will increasingly reside in the "cloud."