Whispering in Public

Hot Potatoes


Pedal to the Metal


Fast, faster, faster still: the march of processor speeds goes on. You cannot be too rich or too thin, and processors can never be too fast. But how fast can processors go? Physically, we know that nothing can travel faster than the speed of light. In the case of computers, even Dr. Moore does not quite seem to know.


Even before the invention of the microprocessor, which sowed the seeds of the computer revolution, computers had been getting faster all the time, and cheaper too. In 1965, a mere 4 years after the integrated circuit was invented, Dr. Gordon Moore drew a graph of the number of transistors in a single chip of memory plotted against the year of manufacture of the chip. He found that over the previous 4 years (starting in 1961) the number of transistors had doubled every 18 to 24 months, reaching about 50 transistors per chip in 1965. Dr. Moore speculated that this trend would continue until 1975. Plugging the numbers into Dr. Moore's formula shows that the number of transistors in 1975 would be between 1,600 (doubling every 24 months) and 5,000 (doubling every 18 months).
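The extrapolation is simple compound doubling. A quick sketch, using the starting count and doubling periods quoted above:

```python
def projected_transistors(start_count, start_year, end_year, doubling_months):
    """Compound doubling over an interval, as in Moore's 1965 extrapolation."""
    months = (end_year - start_year) * 12
    return start_count * 2 ** (months / doubling_months)

# From about 50 transistors per chip in 1965, projected out to 1975:
slow = projected_transistors(50, 1965, 1975, 24)  # doubling every 24 months
fast = projected_transistors(50, 1965, 1975, 18)  # doubling every 18 months
print(round(slow), round(fast))  # roughly 1,600 and 5,000
```

Ten years at 24-month doubling is exactly five doublings (a factor of 32); at 18-month doubling it is six and two-thirds doublings, which is where the 5,000 comes from.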


In 1968 Dr. Moore co-founded the Intel Corporation, one of the powerhouses of processor manufacturing today. His 1965 prediction became very famous and was dubbed Moore's Law. Moore's law predicts the density of transistors per chip of memory, but it also applies to transistors in a CPU, as both use roughly similar manufacturing processes. If a processor has more transistors, it is capable of doing more. Hence it achieves more in the same period of time, and hence is faster. Also, since the size of the chip does not increase appreciably, the doubling of the transistors means a shrinking of the physical size of each of them. As the size of a transistor shrinks, so does its switching time, and hence it can be made to run faster. The speed at which a transistor switches depends on its capacity to store electrons: the smaller the capacity, the faster it switches.


The shrinking transistor, coupled with its population explosion, is the primary cause of the increasing speeds of processors. However, Moore's law seemed to have a fatal flaw: it predicts the future in terms of sustained exponential growth. Exponential growth is scary, explosive and unsustainable. Dr. Moore knew that, and limited the time frame of his prediction to 1975. By then, he reasoned, growth would top out or at least slow down.


Consider an old tale. A king played chess so well that he decided that if anyone could beat him at chess, he would grant the person any reasonable, if lavish, wish. No one succeeded, until a poor old man showed up and beat the king. The poor man asked for nothing much: he wanted a grain of rice for the first square of the chess board, two grains for the second square, four grains for the third square and so on, each square doubling the previous one. The king thought it was a measly request, and granted it immediately.


If the man had asked for 1 grain for the first square, 2 for the second, 3 for the third, and so on, it would have been linear growth. He would have received a total of 2,080 grains of rice for the 64 squares of the board (about 70 grams). But the king had not taken Mathematics 101, did not comprehend exponential growth, and soon had to declare bankruptcy. If we do the counting, we find that the 10th square needs a measly 512 grains and the 15th square needs 16,384 grains, about half a kilo. Then the rice supply starts falling apart: the 22nd square needs 69 kilos and, oh my, the 30th square consumes 17,700 kilos. There is no point calculating what is needed for all 64 squares; that number exceeds all the rice ever grown on planet earth, roughly a thousand times the world's annual rice harvest.
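The chessboard arithmetic is easy to check. A short sketch, assuming about 0.033 grams per grain of rice (the per-grain weight is my assumption, chosen to match the kilo figures in the tale):

```python
GRAMS_PER_GRAIN = 0.033  # assumed weight of one grain of rice

def grains_on_square(n):
    """Grains on square n: 1 on the first square, doubling on each square after."""
    return 2 ** (n - 1)

total = sum(grains_on_square(n) for n in range(1, 65))      # == 2**64 - 1
print(grains_on_square(10))                                  # 512 grains
print(grains_on_square(15) * GRAMS_PER_GRAIN / 1000)         # about half a kilo
print(round(grains_on_square(30) * GRAMS_PER_GRAIN / 1000))  # about 17,700 kilos
print(total)                                                 # 18,446,744,073,709,551,615
```

The grand total, 2 to the 64th power minus one, is a twenty-digit number; that is the bankruptcy.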


When, in 1975, Moore's law lived up to its billing, the computing community was quite impressed with Dr. Moore. Today, 36 years later, Moore's law is still right on the money, an absolutely amazing feat. The first microprocessor used in personal computers was the Intel 8080 chip, released in 1974, which had a whopping 6,000 transistors and ran at a blazing clock speed of 2 MHz. Today, Intel sells the Pentium-4, which has 42,000,000 transistors and runs at 2,000 MHz.


What determines the actual speed of a computer? There is no clear-cut answer; it is a rather complex issue. The speed of the CPU is definitely one of the major contributors to the overall speed of the computer. One measure of speed, used mainly for its convenience, is the "clock speed": how fast the heartbeat of the CPU (also called the clock) runs. Faster clock speeds generally translate to faster processing, but clock speeds do not tell the whole story of speed.


The clock speed of CPUs of similar types has been following Moore's law rather well. Intel's microprocessor in 1985 was the 386, which ran at 16 MHz; the next generation, the 486, was introduced in 1989 with a clock speed of 25 MHz. The next big jump was the Pentium, which debuted in 1993 at 60 MHz and was soon raised to 90 MHz. The Pentium-2 started at 233 MHz (1997), followed by the Pentium-3 at 450 MHz (1999) and the Pentium-4 at 1,400 MHz (2000). Today the fastest processor from Intel runs at 2,000 MHz, and can be construed to have slightly outpaced Moore's law.
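One way to sanity-check the "slightly outpaced" claim is to compute the doubling period implied by any two of these data points. A rough sketch, using the years and clock speeds quoted above and ignoring mid-year release dates:

```python
import math

def doubling_months(mhz1, year1, mhz2, year2):
    """Implied clock-speed doubling period, in months, between two data points."""
    doublings = math.log2(mhz2 / mhz1)
    return (year2 - year1) * 12 / doublings

print(round(doubling_months(60, 1993, 2000, 2001)))  # Pentium 60 MHz to 2,000 MHz: ~19
print(round(doubling_months(16, 1985, 2000, 2001)))  # 386 16 MHz to 2,000 MHz: ~28
```

Depending on the endpoints chosen, the implied period lands inside or just outside Moore's 18-to-24-month window, which is why the "outpaced" judgment depends on where you start counting.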


Does clock speed really tell the speed of the processor? Absolutely not, but it is an indicator. The speed of a processor is actually quite elusive. It refers to the capacity of the processor to perform a certain number of computations in a unit of time (higher is faster). Suppose I need to sort a million numbers: a computer that does it in one minute is definitely twice as fast as a computer that takes two minutes. However, a computer that sorts twice as fast may not multiply twice as fast. This muddies the speed issue quite a bit. Complicating the issue even more is the fact that when the clock speed doubles, computing power sometimes more than doubles and sometimes less than doubles.


AMD is a competitor of Intel, and makes the Athlon series of processors. AMD claims, quite rightly, that the Athlon running at 1,000 MHz routinely beats a Pentium running at 1,400 MHz. Also, it is quite apparent that the 2,000 MHz Pentium is tweaked such that although it runs twice as fast as the 1,000 MHz Pentium, it does not do twice as much. Intense competition in the "clock speed" market made Intel raise clock speeds using tricks that do not translate to raw power. However, nitpicks aside, a 2,000 MHz processor is a very fast processor.


Programs called benchmarks measure the speed of a processor. A benchmark exercises the processor using simulations of various tasks: adding, multiplying, sorting, branching and so on. The number of seconds a processor takes to complete a benchmark determines its speed. Often benchmarks include the effects of the relative speeds of the peripherals that go along with the CPU, that is, the memory, the bus, the disks and the video system. As is obvious, the speed of the computer is a complex sum of the speed of the CPU and the associated peripherals. As the speed of the CPU has risen, so have the speeds of most of the peripherals, albeit not as much. No one seems to pay much attention to benchmarks, as the details and results get too confusing. Clock speeds are easier to remember and compare.
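Real benchmark suites are elaborate, but the core idea fits in a few lines. A toy sketch that times a single representative task (sorting), standing in for the mixed workloads real benchmarks run:

```python
import random
import time

def benchmark_sort(n=1_000_000, seed=42):
    """Time one representative task: sorting n random numbers.

    A lower elapsed time means a faster machine, for this task.
    """
    random.seed(seed)
    data = [random.random() for _ in range(n)]
    start = time.perf_counter()
    data.sort()
    return time.perf_counter() - start

print(f"sort took {benchmark_sort():.3f} seconds")
```

A real benchmark mixes many such tasks and weights the results, which is exactly where the confusing details come from.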


The price of a computer is another fascinating story. In 1984 the price of a basic personal computer was about $4,000. In 1993 a basic system could be had for about $2,500. Today it is possible to get one for just over $500. Of course, note that $4,000 in 1984 is worth about $6,700 today (inflation). Also, the machines of 1984 and today are incomparable; today's $500 systems have clock speeds 150 times higher (800 MHz vs. 4.7 MHz), memories that are 128 times larger (128 MB vs. 1 MB) and disks that are 2,000 times bigger (20,000 MB vs. 10 MB).
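The inflation figure implies an average annual rate, which is easy to back out. A one-line sketch, using only the dollar figures above and taking "today" to be 2001:

```python
# $4,000 in 1984 is said to be worth about $6,700 today (2001).
# The implied average annual inflation rate over those 17 years:
rate = (6700 / 4000) ** (1 / (2001 - 1984)) - 1
print(f"about {rate:.1%} per year")  # about 3.1% per year
```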


If you read the above paragraph, you should have noted the sharp run-up in disk capacity. No one really managed to predict the astonishing climb in capacity and the corresponding decline in price. In 1984, a large disk was something attached to large minicomputers. The disk unit was as large as a washing machine, had to be placed in a clean, climate-controlled room, held about 700 MB of data and cost well over $10,000. Today a medium disk unit is about the size of two decks of cards, needs no special treatment, holds 40,000 MB and costs about $150. Prices of consumer electronics always seem to fall with time. However, nothing has kept pace with the downward spiral of prices of computing equipment. Memory sold for close to $1,000/MB in 1983, came down to about $100/MB in 1989, then to about $10/MB in 1995, and today it is under $1/MB. Maybe soon, computers will be given away for free.
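The memory prices quoted above fall by a factor of ten every six years, which works out to a halving period remarkably close to Moore's doubling period. A quick check, taking "today" to be 2001:

```python
import math

def halving_months(price1, year1, price2, year2):
    """Implied price-halving period, in months, between two price points."""
    halvings = math.log2(price1 / price2)
    return (year2 - year1) * 12 / halvings

# Dollars per megabyte of memory, from the figures above.
print(round(halving_months(1000, 1983, 100, 1989)))  # ~22 months
print(round(halving_months(1000, 1983, 1, 2001)))    # ~22 months
```

A halving every 22 months or so sits squarely in Moore's 18-to-24-month window, which is no coincidence: memory chips are exactly what Moore was plotting.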


Dr. Moore is as surprised as anyone else. In 1997 he spoke again, saying, "To be honest, I did not expect this law to still be true some 30 years later." However, this time he has gone out on a limb and extended his exponential growth prediction by quite a bit, with the dramatic statement, "I am now confident that it will be true for another 20 years." Fifty years of exponential growth simply boggles the mind. It is a lot of grains of rice.


Partha Dasgupta is on the faculty of the Computer Science and Engineering Department at Arizona State University in Tempe. His specializations are in the areas of Operating Systems, Cryptography and Networking. His homepage is at http://cactus.eas.asu.edu/partha

