
Integrated Chips: Miniaturization of Computers Begins

PCQ Bureau

Integrated Circuits

Integrated circuits are usually called ICs or chips. They are complex circuits that have been etched onto tiny chips of semiconductor material (silicon). Integrated circuits are used in virtually all electronic equipment today and have revolutionized the world of electronics. If we are able to use computers, mobile phones, and other digital appliances at affordable prices, it is because of the low-cost production of integrated circuits.

The development of the integrated circuit allowed thousands of transistors to be packed into a coin-sized object. Transistors were miniaturized and placed on silicon chips, which drastically increased the speed and efficiency of computers. Instead of punched cards and printouts, users interacted with third-generation computers through keyboards and monitors, and worked with an operating system that allowed the machine to run many different applications at one time under a central program that managed memory. For the first time, computers became accessible to a mass audience because they were smaller and cheaper.

Despite their great significance, individual transistors were still bulky components that had to be wired together one by one. It was the integrated circuit that allowed millions of transistors to be packed onto a single wafer of silicon. Placing such large numbers of transistors on a single chip vastly increased the power of a computer and lowered its cost considerably. Since the invention of the integrated circuit, the number of transistors that can be placed on a single chip has doubled roughly every two years, shrinking the size and cost of computers even further while increasing their power. Most electronic devices today use some form of integrated circuit mounted on a printed circuit board; in a computer, the main board is called the motherboard.
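To put the two-year doubling claim in perspective, here is a minimal Python sketch that projects transistor counts under that assumption. The starting figure of roughly 2,300 transistors (about what the Intel 4004 discussed below contained) and the fixed doubling period are illustrative assumptions, not figures from this article.

# Illustrative sketch of the two-year doubling trend described above.
# The starting count (~2,300 transistors, roughly the Intel 4004 of 1971)
# is an assumption chosen for illustration only.

START_YEAR = 1971
START_TRANSISTORS = 2_300      # approximate transistor count of the Intel 4004
DOUBLING_PERIOD_YEARS = 2      # "doubles roughly every two years"

def projected_transistors(year: int) -> int:
    """Project a transistor count for a given year under the doubling model."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    return int(START_TRANSISTORS * 2 ** doublings)

if __name__ == "__main__":
    for year in (1971, 1981, 1991, 2001, 2011, 2021):
        print(f"{year}: ~{projected_transistors(year):,} transistors")

Run over a few decades, the model shows how a modest-looking doubling rule turns a few thousand transistors into billions, which is why chips kept getting smaller, cheaper, and more powerful at the same time.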

Microprocessors

Microprocessors set off the rise of computers on an unprecedented scale. Ted Hoff invented a chip the size of a pencil eraser that could do all the computing and logic work of a computer. The microprocessor was originally designed for use in calculators, not computers, yet it led to the invention of personal computers, or microcomputers.

It wasn't until the 1970s that people began buying computers for personal use. The microprocessor ushered in the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What filled an entire room in the first generation could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer, from the central processing unit and memory to input/output controls, on a single chip.

As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of graphical user interfaces (GUIs), the mouse, and handheld devices. In 1981, IBM introduced its first computer for the home user, and in 1984, Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of everyday life as more and more common products began to use them.


