History of the Microprocessor

The History Behind the Microprocessor


Jack S. Kilby created the first integrated circuit (IC) back in 1958 while working for Texas Instruments. The company made transistors, and Kilby’s first task was to design micro-modules for the military, in which he connected various wafers of discrete components by stacking them on top of one another.

Seeing this process as far too complex, he realized that a properly engineered piece of germanium could act as several components at the same time. This is how the IC came to fruition. One year later, Fairchild Semiconductor developed the modern silicon diffusion process, known today as the planar process. IC manufacturing evolved over the following decade, including a shift to computer-aided design in 1967.

Robert Noyce, who helped develop Fairchild’s silicon process, founded another company in 1968 along with Gordon Moore and Andrew Grove: Intel.


Behind the Production of the Integrated CPU

Busicom – the trading name of the Japanese company ETI – planned a wide array of programmable calculators. Busicom designed 12 chips and asked Intel to produce them because of Intel’s skill and experience with high-transistor-count chips. The project was assigned to Marcian Hoff Jr., but once he studied the designs, he decided they were far too complex for something as simple as a calculator.

Instead, Hoff compared the design with DEC’s Programmed Data Processor 8 (PDP-8). While its architecture was simple, it could perform very high-level operations. Hoff concluded that a general-purpose processor would be simpler in design yet capable of handling more tasks. This led to the development of the MCS-4 chipset and the 4004 integrated CPU.


Busicom ended up choosing Intel’s “microcomputer on a chip” over its own 12-chip design in 1969, with Busicom retaining exclusive rights to purchase the new chipset under the contract. In 1971, the two companies agreed that, in exchange for lower chip prices, Intel would receive full marketing rights, allowing it to sell the chips as it saw fit.


8086 Design

In 1969, after the 4004 instruction set had been defined, Computer Terminal Corporation (CTC) asked Intel to design an LSI register chip for a new intelligent terminal it was working on. Thanks to their expertise with the 4004, Stan Mazor (who had helped develop that chip) and Hoff agreed to put the complete processor on one chip. The 8008 was defined and work started right away. CTC rejected the chip because it needed too many support chips and was too slow, but design continued alongside the MCS-4, and in 1972 the 45-instruction CPU went into production.

The chip was a huge success. Intel looked at CTC’s rejection of the 8008 and concluded that it had to make a general-purpose processor that needed only a few support chips. The Intel 8080 arrived in 1974, well after it was announced; this was intentional, to give customers enough time to design the chip into their products.

The 8080 had 4,500 transistors, roughly twice as many as the 4004, and could address 64K bytes of memory. Its speed came from using faster electron-doped (NMOS) rather than hole-doped (PMOS) MOS transistors. The chip was a big success and quickly became an industry standard.

In 1978, Intel produced its first 16-bit processor, the 8086, which was assembly-source compatible with the 8080 and 8085. Maintaining that compatibility meant it had to use overlapping segment registers in order to access 1 megabyte of memory: each 16-bit segment register selects a 64K window within the 1 MB address space.
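The 8086’s overlapping-segment arithmetic is easy to sketch. A minimal model of the real-mode address calculation (the function name here is illustrative, not from any particular toolchain):

```python
def physical_address(segment: int, offset: int) -> int:
    """8086 real-mode addressing: shift the 16-bit segment left
    4 bits and add the 16-bit offset, yielding a 20-bit address
    that can reach 1 megabyte of memory."""
    return ((segment & 0xFFFF) << 4) + (offset & 0xFFFF)

# Because consecutive segments start only 16 bytes apart, different
# segment:offset pairs can refer to the same physical byte.
a = physical_address(0x1234, 0x0010)
b = physical_address(0x1235, 0x0000)
print(hex(a), hex(b))  # both 0x12350
```

This overlap is why the scheme preserved 8080-style 16-bit addressing while still reaching the full megabyte.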


Early Years of the Microprocessor

Even though Intel invented the microprocessor, it wasn’t the only company making them. By 1974, 19 microprocessors were available or announced, and by 1976 the number had grown to 54. The second processor, Rockwell’s 4-bit PPS-4, was developed in 1972. The Texas Instruments TMS 1000, also a 4-bit processor, came onto the market in 1974, even though it had been designed in 1971. The TMS 1000 was actually the first microcontroller, containing its own RAM and ROM on-chip.

The cost of the TMS 1000 dropped considerably by the late 1970s, making it the processor of choice for consumer electronics; it was produced in more than 40 variants.


Microprocessor Clones

After the success of the Intel 8008, competing chips were produced by Zilog and Motorola. Motorola launched the 6800 in 1974, a processor in the same class as the 8080 that came with a wide array of system-oriented support hardware.

MOS Technology came up with its own 6502 processor in 1975, heavily influenced by the 6800. It was successful thanks to its simple design, affordability, and single power supply, and it soon became the favorite of small computer companies, including Apple, Commodore, and Acorn. It was able to keep up with the newer and more powerful Z80.

The Z80 was significant because it was compatible with the 8080 while adding a further 80 instructions. This compatibility wasn’t unexpected, as Zilog was founded by former Intel engineers. The Z80 was a powerful processor that let system designers build computers with little extra circuitry at an affordable cost.

One year after Intel introduced its 16-bit processor, Motorola came up with another powerful chip, the 68000, which could address 16 megabytes of memory and acted like a 32-bit processor internally. The chip found a home in Macintosh, Amiga, and Atari computers.



RISC

While RISC can be seen as a modern concept, it can be traced back to 1965 and CDC’s 6600. RISC design is based on simplicity of the processor’s instruction set, which enables advanced architectural techniques to boost the speed at which instructions execute. The CDC 6600 had plenty of RISC traits, such as a small instruction set of 64 opcodes, register-to-register operations, and a load/store architecture.

IBM formalized the principles of RISC in the IBM 801, which featured a small instruction set, pipelining, load/store memory operations, and 24 registers.

As RISC increased in popularity in the late 1980s, IBM attempted to market the design as the Research OPD Micro Processor (ROMP) CPU. This was not successful, and the chip ended up becoming the nucleus of the I/O processor for the IBM 3090. The term RISC itself came from university research projects in the US: the Berkeley RISC I was the foundation of the commercial Scalable Processor Architecture (SPARC) processor, while Stanford University’s Microprocessor without Interlocked Pipeline Stages (MIPS) design is now owned by Silicon Graphics Inc.

The RISC I, RISC II, and SPARC processors are unusual in featuring register windows. The CPU has only a few registers visible to the programmer at any one time, but this set can be swapped for another as needed. This makes context switches very fast and low-overhead. Later on, it was discovered that a compiler could produce code for non-windowed machines that was almost as efficient as on a windowed processor. By the mid-1980s, ‘RISC’ was very popular; Intel even used the term for its 80486 processor.
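The register-window idea can be illustrated with a toy model (the class name, window sizes, and overlap below are invented for illustration and do not reflect the real SPARC encoding): the CPU owns a large physical register file but exposes only a small window of it, and a procedure call simply slides the window pointer instead of spilling registers to memory.

```python
class WindowedRegisterFile:
    """Toy sketch of SPARC-style register windows."""

    def __init__(self, n_windows=8, window_size=16, overlap=4):
        self.step = window_size - overlap   # adjacent windows overlap
        self.regs = [0] * (n_windows * self.step + window_size)
        self.cwp = 0                        # current window pointer

    def read(self, r):
        # Register r as seen by the current procedure.
        return self.regs[self.cwp + r]

    def write(self, r, value):
        self.regs[self.cwp + r] = value

    def call(self):
        # SAVE: slide the window forward; no memory traffic needed.
        self.cwp += self.step

    def ret(self):
        # RESTORE: slide the window back.
        self.cwp -= self.step

rf = WindowedRegisterFile()
rf.write(12, 99)    # caller places an argument in an 'out' register
rf.call()
print(rf.read(0))   # callee sees it as an 'in' register: 99
```

Because the windows overlap, the caller’s outgoing registers become the callee’s incoming ones, so arguments pass without touching memory, which is what makes calls and context switches so cheap.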

At approximately the same time that the Stanford and Berkeley projects came out, a UK-based home computer company, Acorn, was searching for a processor to replace the 6502. It found that commercial microprocessors, including the 8086 and 68000, weren’t advanced enough, so in 1983 it started its own project to design a RISC microprocessor, which resulted in the ARM.


The Transputer

Inmos was set up by the UK government in 1979 to produce semiconductor products that could compete with international offerings. Throughout the summer of 1980, Inmos worked on its first microprocessor, but the process was not a smooth one: two of the engineers, David May and Robert Milne, disagreed over the architecture of the chip.

Milne believed that the Transputer should be the first chip in the world specifically tailored to run Ada, while May favored a simpler approach to the design. Iann Barron, the driving force behind Inmos, stepped in between the two; his views encompassed those of both May and Milne. The Transputer hit the market in 1985 as the T-212, a 16-bit version with a RISC-type instruction set. In 1994, the T-9000 was launched, with a design optimized for use in parallel computers in systolic configurations.

The Super RISC

In 1988, DEC formed a team to come up with a new architecture for the firm; a decade earlier, it had moved from the PDP-11 to the VAX architecture. The project ended up involving over 30 engineering groups in 10 nations, all working on the Alpha AXP architecture.

The team looked at the previous decade of processor development and found that a 1,000-fold increase in computing power had occurred over that span. They concluded that the same would happen over the next 25 years, and determined that their design should be able to run at 10 times the clock speed and 10 times the instruction-issue rate. They placed interrupts, memory management, and exceptions into code called PALcode, enabling the processor to support a number of different operating systems.

The Alpha architecture was a RISC design, but it emphasized pipelining and multiple instruction issue rather than other conventional RISC traits. The chip launched in 1992 and was known as the fastest single-chip microprocessor in the world. The PowerPC, meanwhile, came from the need of IBM and Motorola for a high-performance workstation CPU, and from Apple’s need for a CPU to replace the Motorola 680x0 in its Macintosh computers. The PowerPC is the end result of IBM’s POWER (Performance Optimization With Enhanced RISC) multichip processors.



While the microprocessor was initially designed for a calculator, it eventually found its way into countless other designs. Used in everything from PCs to cars to telephones and TVs, the microprocessor shows no sign of slowing its pace of innovation any time soon.
