The 8080 chip at 40: What's next for the mighty microprocessor
Now, some 40 years after the debut of the Intel 8080 microprocessor, the industry can point to direct descendants of the chip that are astronomically more powerful (see sidebar, below). So what's in store for the next four decades?
For those who were involved with, or watched, the birth of the 8080 and know about the resulting PC industry and today's digital environment, escalating hardware specs aren't the concern. These industry watchers are more concerned with the decisions that the computer industry, and humanity as a whole, will face in the coming decades.
The 8080's start
While at Intel, Italian immigrant Federico Faggin designed the 8080 as an enhancement of Intel's 8008 chip -- the first eight-bit microprocessor, which had debuted two years earlier. The 8008, in turn, had been a single-chip emulation of the processor in the Datapoint 2200, a desktop computer introduced by the Computer Terminal Corp. of Texas in late 1970.
Chief among the Intel 8080's many improvements was its use of 40 connector pins, as opposed to 18 on the 8008. With only 18 pins, some of the 8008's I/O lines had to share pins, which forced designers to add several dozen support chips to multiplex those lines and made the chip impractical for many uses, especially for hobbyists.
"The 8080 opened the market suggested by the 8008," says Faggin.
As for the future, he says he hopes to see development that doesn't resemble the past. "Today's computers are no different in concept from the ones used in the early 1950s, with a processor and memory and algorithms executed in sequence," Faggin laments, and he'd like to see that change.
He holds out some hope for the work done to mimic other processes, particularly those in biology. "The way information processing is done inside a living cell is completely different from conventional computing. In living cells it's done by non-linear dynamic systems whose complexity defies the imagination -- billions of parts exhibiting near-chaotic behavior. But imagine the big win when we understand the process.
"Forty years from now we will have begun to crack the nut -- it will take huge computers just to do the simulations of structures with that kind of dynamic behavior," Faggin says. "Meanwhile, progress in computation will continue using the strategies we have developed."
Nick Tredennick, who in the late 1970s was a designer for the Motorola 68000 processor later used in the original Apple Macintosh, agrees. "The big advances I see coming in the next four decades would be our understanding of what I call bio-informatics, based on biological systems," he says. "We will start to understand and copy the solutions that nature has already evolved."
Carl Helmers, who founded Byte magazine for the PC industry in 1975, adds, "With all our modern silicon technology, we are still only implementing specific realizations of universal Turing machines, building on the now nearly 70-year-old concept of the Von Neumann architecture."
Human-digital synthesis
How we will interface with computers in the future is of more concern to most experts than is the nature of the computers themselves.
"The last four decades were about creating the technical environment, while the next four will be about merging the human and the digital domains, merging the decision-making of the human being with the number-crunching of a machine," says Rob Enderle, an industry analyst for the past three decades.
This merging will involve people learning how to perform direct brain control of machines, much as they now learn to play musical instruments, predicts Lee Felsenstein. He helped design the Sol-20 (one of the first 8080-based hobbyist machines) and the Osborne 1, the first mass-market portable computer.
"I learned to play the recorder and could make sounds without thinking about it -- a normal process that takes a period of time," he notes. Learning a computer-brain interface will likewise be a highly interactive process starting in about middle school, using systems that are initially indistinguishable from toys, he adds.
"A synthesis of people and machines will come out of it, and the results will not be governed by the machines nor by the designers of the machines. Every person and his machine will turn out a little different, and we will have to put up with that -- it won't be a Big Brother, one-size-fits-all environment," Felsenstein predicts.
"An effortless interface is the way to go," counters Aaron Goldberg, who heads Content 4 IT and has been following the technology industry as an analyst since 1977. "Ideally it would understand what you are thinking and require no training," considering the computational power that should be available, he adds.
"Interaction with these devices will be less tactile and more verbal," says Andrew Seybold, also a long-time industry analyst. "We will talk to them more and they will talk back more and make more sense. That's either a good thing or a scary thing."
The dark side
Some observers believe increasingly powerful computers could bring problems.
"In the next four decades the biggest issue is what happens when devices become smarter, more capable and more knowledgeable than we are," says Goldberg. "If you follow the curve we will clearly be subordinate to the technology. The results could be terrifying, or empowering. There may always be tension between the two. Much as it has been thrilling to live in this generation, the next should be really exciting -- but the problems will also be much bigger."
"There's a lot of concern that we are developing the race that will replace us," adds Enderle, fears that have been articulated by scientists and others, from tech entrepreneur Elon Musk to renowned physicist Stephen Hawking. "We could create something so smart that it could think that it would be better off without us," Enderle adds. "It would see that we're not always rational and fix the problem, either by migrating off-planet as many hope, or by wiping out the human race."
Not everyone agrees with the doomsday scenario. "I am a meliora conservative regarding computer technology's future," counters Byte magazine's Helmers. (Meliora is Latin for "ever-better," and it is the motto of Helmers' alma mater, the University of Rochester.) "Given another 40 years of creative engineering minds building on the vast past achievements of prior creative minds, our technology will be meliora to the nth degree."
Either way, "The CPU is only a small part of the problem these days; it's what we do with it that's the problem," adds Bob Frankston, who co-invented VisiCalc, the first PC "killer app," in 1978.
"You will have the equivalent of Watson in your wristwatch or embedded inside you -- what will you then want to do" wonders Jonathan Schmidt, one of the designers of the Datapoint 2200. Watson is the name of the IBM artificial intelligence entity famous for winning the TV quiz show "Jeopardy!" against two human champions in 2011.
Ted Nelson, who invented the term hypertext in the 1960s and whose still-unrealized Project Xanadu has many features in common with the later World Wide Web, pretty much rejects both the past and the future. "Advances? It has all turned to crap and imprisonment," he says. As for the next four decades, "More crap, worse imprisonment." (Nelson's Xanadu would give all users file access, including editing privileges. Users on today's Web can do only what a specific site lets them do.)
Specific advances
Some experts have predicted, or called for, specific advances. For example, Stan Mazor, who as a chip designer at Intel was involved in the 8008, says machine vision may be the next frontier.
"When computers can see, we will have a large leap forward in compelling computer applications," says Mazor. "Although typical multiprocessors working on a single task saturate at around 16 CPUs, if a task can be partitioned, then we might see 100,000 CPUs on a chip. Vision's scene analysis might be one of those problems suitable for large-scale parallelism."
"Why can't we use the computing power we have available today to make computers communicate with humans more efficiently, without the need for programming languages, operating systems, etc" asks Marcian E. "Ted" Hoff, who was Mazor's boss at Intel during the 8008 project. "There has been insufficient progress in natural language processing, a disappointment I hope will be remedied."
"I am a bit concerned about the whole cloud thing," Hoff adds. "Consider that the delay due to a few inches of wire between a CPU chip and its memory now corresponds to several [CPU] instructions. Local storage has never been cheaper. And yet we are planning to move our data miles and miles away, where its security is questionable, and the time to reach it is several orders of magnitude longer than with local storage. And consider the bandwidth requirements."
Tredennick, who also co-founded chip-maker NexGen, calls for hardware that is configurable according to the needs of the software. "We need to make the hardware accessible to programmers, not just logic designers. I predicted that years ago, but it did not happen." He foresees micro-electromechanical systems (MEMS) combined with the Internet of Things leading to buildings and bridges that can report when they are stressed or otherwise need maintenance.
As for the cumulative future result of the rising tide of computer power, "I don't think it will be world peace, or a human lifespan of 300 years," says Frankston. "Whatever it is, it will be the new normal, and people will complain about 'kids these days.'"