Everywhere you look, software is over budget, behind schedule, insecure, unreliable, and hard to use. Any time an organization attempts to introduce a new system or upgrade an old one, it takes a colossal risk; today, large information-technology projects are technological tar pits that immobilize institutions. Studies regularly report that two-thirds of such projects suffer major delays, significant cost overruns, or both. The U.S. government has found it nearly impossible to introduce or upgrade large-scale software systems: decade-long efforts at the Federal Aviation Administration and the FBI have collapsed in chaos.
Businesses have fared no better. To take a single example, McDonald's executives dreamed of a Web-based management system they called Innovate that would track the real-time flow of burgers, fries, and chicken nuggets through every one of the company's restaurants around the world. By the time McDonald's gave up and canceled the project, it had written off $170 million of the system's estimated $1 billion total cost.
Such failures add up. According to a 2002 study by the National Institute of Standards and Technology, software failures cost the U.S. economy $59.5 billion every year. But the price of bad software can also be measured in human misery, and even in lives lost. During the 1991 Gulf War, a software timing error kept a Patriot missile battery in Dhahran, Saudi Arabia, from intercepting an incoming Scud; the missile's direct hit on a U.S. Army barracks killed 28 soldiers.
The past half-century of computing has seen wonderful progress. Programmers have abandoned punch cards and teletypes. They have given us a computer on every desktop, tools for work, toys for play, and a network that links homes and businesses to form a teeming global pool of information and entertainment. This progress has been fueled by the exponential curve of Moore's Law, Intel cofounder Gordon Moore's prediction that the power of microchips would double (or their cost would halve) every one to two years.
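The arithmetic behind that exponential curve is easy to check. Here is a minimal sketch in Python; the 18-month doubling period is an illustrative assumption, picked from the middle of the one-to-two-year range Moore gave, not a figure from this article:

```python
# Illustrative only: compound growth under Moore's Law.
# Assumes an 18-month doubling period (midpoint of the
# one-to-two-year range cited above).

DOUBLING_PERIOD_YEARS = 1.5  # assumed, not exact

def moores_law_factor(years: float) -> float:
    """Return how many times chip power multiplies over `years`."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

for years in (1.5, 5, 10, 20):
    print(f"{years:>4} years -> {moores_law_factor(years):,.0f}x")
```

Under that assumption, power grows about a hundredfold in a decade and roughly ten-thousandfold in two, which is why hardware's steady climb so thoroughly outpaces everything else in computing.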
But even as Moore's Law has made each year's new computers faster and cheaper, the flexibility and utility of our computer systems have been limited by the slower, uneven evolution of software. One formulation of this problem is known as Wirth's Law, after the Swiss computer scientist Niklaus Wirth: "Software gets slower faster than hardware gets faster."