As we continue to shovel away the debris from the Y2K and Internet buying binges of a few years ago, CIOs in a variety of industries are discovering something remarkable. Beneath the rubble, a dramatically different model for business computing is emerging, as different from today's architecture as client/server was from the mainframes it displaced in the 1980s. Whereas PCs made it possible to distribute both applications and data closer to their users, the next-generation architecture will distribute even smaller units of software over the Internet, not just to distant users but to destinations such as equipment on the factory floor and packages on store shelves. That capability will create a new class of information products and services that interact with each other across organisational boundaries using sophisticated messaging and security protocols. Data processing will become even more tightly connected to business processes: designed to scale up or down quickly as conditions require, and supported by new kinds of outsourcing relationships with hardware, software and communications vendors. Sports fans might think of it as "extreme computing".
The individual trends that fuel this next-generation architecture have been percolating for some time. But it's their coming together within the next two to five years that will create the profound new landscape in which companies will do business. Organisations adopting next-generation architecture will realise substantial reductions in development and maintenance costs, and will participate in new forms of information exchange up and down the supply chain that create new sources of value.
The good news is that much of the investment you've made in the past five years in networks, hardware and applications will turn out to yield unexpected dividends as part of this new architecture - not just productivity savings but, more important, competitive advantage and perhaps even new products and services.
The bad news is that, like earlier large-scale shifts in computing architecture, the transition will be accompanied by plenty of trauma. To take maximum advantage, your company will need to make more investments in infrastructure, rethink some applications and gear up for a major transformation of your IT skills.
The next-generation architecture emerges from the intersection of three important trends in business computing:
Cheap computing. Following Moore's Law, computing power keeps getting dramatically cheaper, while the hardware that delivers it keeps getting faster and smaller. Not only can computers do more things; more things can be turned into computers, including consumer electronics, appliances and, increasingly, individual products and packaging.
Distributed processing. Applications are being distributed closer and closer to the business functions they support, another long-standing trend. In the coming wave of distribution, processing will move away from human users to the objects in a commercial transaction. It won't be the driver of the truck who scans the package and uploads the data, in other words, but the package itself, which will send data continuously throughout its lifetime. A new set of business relationships will run parallel to the movement of goods and services, collecting and communicating information about every stage of every transaction - an important by-product of the next-generation architecture I have called the "information supply chain".
Openness. Turbo-charging these dramatic changes is one unexpected development of the past decade: the unprecedented spread of TCP/IP, XML, MP3, and other non-proprietary networking and data communications standards known collectively as the Internet. Despite the dotcom meltdown, open data communications standards continue to evolve and take hold, creating the foundation for some of the most exciting new applications of next-generation computing.
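To make the idea of open standards carrying supply-chain data a little more concrete, here is a purely illustrative sketch, using Python's standard XML library, of the kind of status event a tagged package might emit as it moves through the information supply chain. The element names and values are assumptions for illustration only; they are not drawn from any standard mentioned in the article.

```python
import xml.etree.ElementTree as ET


def package_status_message(package_id: str, location: str, status: str) -> str:
    """Build a hypothetical XML status event, of the kind a tagged
    package might broadcast over open Internet protocols."""
    event = ET.Element("statusEvent")
    ET.SubElement(event, "packageId").text = package_id
    ET.SubElement(event, "location").text = location
    ET.SubElement(event, "status").text = status
    # Serialise to a plain-text XML string any trading partner can parse.
    return ET.tostring(event, encoding="unicode")


msg = package_status_message("PKG-0001", "Sydney DC", "IN_TRANSIT")
print(msg)
```

Because the format is an open standard rather than a proprietary one, any party up or down the supply chain can consume such a message without special software from the sender's vendor, which is precisely the point of the openness trend described above.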