The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate, and in addition to using a great deal of electricity, they generated a lot of heat, which was often the cause of malfunctions.
[Image: UNIVAC system]
First-generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could solve only one problem at a time; setting up a new problem could take days or weeks. Input was based on punched cards and paper tape, and output was displayed on printouts.
Second Generation (1956-1963) Transistors
Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors.
Though the transistor still generated a great deal of heat, which could damage the computer, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.
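To make that shift concrete, here is a purely illustrative sketch for a hypothetical machine (the instruction encoding is invented for illustration, not taken from any specific second-generation computer). The same "add 7" instruction is written first as the raw binary a first-generation programmer would have keyed in, and then as the assembly mnemonic an assembler would translate into those bits:

    10110111 00000111    (machine language: opcode and operand entered as raw bits)
    ADD 7                (assembly language: the same instruction written as a word)

A high-level language such as FORTRAN went a step further still, letting the programmer write an expression like TOTAL = TOTAL + 7 without referring to the machine's instruction set at all.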
The first computers of this generation were developed for the atomic energy industry.
Third Generation (1964-1971) Integrated Circuits
The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, made of semiconductor material, which drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third-generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time, with a central program that monitored the memory. For the first time, computers became accessible to a mass audience because they were smaller and cheaper than their predecessors.
Fourth Generation (1971-Present) Microprocessors
The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer, from the central processing unit and memory to input/output controls, on a single chip.
In 1981, IBM introduced its first computer for the home user, and in 1984, Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use them.
As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of GUIs, the mouse, and handheld devices.
Fifth Generation (Present and Beyond) Artificial Intelligence