42 years later, is Moore’s Law still relevant?
In 1965, Intel co-founder Gordon Moore famously predicted that the number of transistors on a chip would double every two years.
Now known as Moore's Law, this observation is often interpreted to mean that the processing power of a computer doubles every 24 months, with cost remaining constant.
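Moore's Law is simple exponential growth, and the arithmetic is easy to sketch. The following snippet (the function names are ours, purely for illustration) shows what doubling every two years compounds to over the 42 years since Moore's prediction:

```python
def doublings(years, period=2):
    """Number of complete doubling periods in a span of years."""
    return years // period

def growth_factor(years, period=2):
    """Overall multiplier after `years` of doubling every `period` years."""
    return 2 ** doublings(years, period)

# 42 years at one doubling every 2 years = 21 doublings,
# a growth factor of 2**21 = 2,097,152 -- more than two-million-fold.
print(doublings(42))      # 21
print(growth_factor(42))  # 2097152
```

Whether you start counting from Moore's 1965 paper or his 1975 revision, the striking point is the same: a steady two-year doubling turns modest beginnings into a multi-million-fold increase within a working lifetime.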
Of course, this prediction is famous because history has proven it to be true. Raw computing power has increased along these lines. But perhaps more relevant to the masses is what we'll call the "utility corollary" to Moore's Law. That is, the usefulness of computers must keep up with their processing power. Is the utility corollary true?
The utility corollary is certainly much more subjective, and thus more difficult to accurately calculate, than Moore's Law. But it is easily argued that, in the first two decades of personal computing, the utility of personal computers doubled at least every two years, if not more frequently.
When personal computers were first introduced, software was limited to word processing, spreadsheets and, if you were lucky enough to have an Apple, some very rudimentary graphics. You might also have had a calculator program that was harder to use, less functional and far less convenient than the old-fashioned TI or HP on your desk.
These programs became much more useful as they matured, thanks at least in part to the processing power of the hardware. As the computer was able to work faster and do more, so, too, could the software. A simple example is the inclusion of graphics in a spreadsheet or word-processing document. In the old days, this was extremely difficult and time-consuming; people would wait minutes just for the computer to finish. Nowadays, due primarily to the increase in processing power, such a task is virtually effortless.
Furthermore, as processing power increased, a wider variety of software became available, from niche applications such as video and music editing to now-ubiquitous presentation and personal-finance programs. Obviously, a real spike in utility occurred with the popularization of the Internet and e-mail.
But in the last 10 years or so, advances in usefulness have been few and far between. Delivery of video over the Internet? Voice over IP? Great, but these aren't really breakthroughs. In fact, cable TV and plain old telephone service are still much more stable technologies with equal or better price-quality ratios.
So what do we have to look forward to? Can computers be made more useful? We believe the answer is yes, and in our next installment, we'll paint a picture of the future computing landscape.
is president of ISDI Technologies Inc., a Honolulu-based IT consultancy. Call him at 944-8742 or e-mail email@example.com