Sunday, 27 April 2014

Computer Generation

We studied history as a subject in our school days. At that time history was common for all students, but at the college level it is divided by period, such as Ancient, Medieval and Modern history. The topic of computer generations is likewise divided by time. We study what type of hardware was used in computers from one period to the next, so a generation is a period of computer history based on hardware. There are five generations.

First Computer

There is no easy answer to the question of which computer came first, because of all the different classifications of computers. The first mechanical computer, created by Charles Babbage, doesn't really resemble what most would consider a computer today. Therefore, this document lists each of the computer firsts, starting with the Difference Engine and leading up to the types of computers we use today. Keep in mind that early inventions that helped lead up to the computer, such as the abacus, calculator, and tabulating machines, are not accounted for in this document.

Analytical Engine

The Analytical Engine was a proposed mechanical general-purpose computer designed by English mathematician Charles Babbage.
It was first described in 1837 as the successor to Babbage's Difference Engine, a design for a mechanical computer. The Analytical Engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete.

Babbage began his work on the Analytical Engine in 1834. He envisaged a computer constructed with brass fittings and powered by steam. It was never built, since the government of the day was unwilling to fund its construction, having already sunk 17,000 English pounds into Babbage's fruitless project to build an earlier invention, the Difference Engine.

Babbage was assisted in his endeavors by Augusta Ada King, Countess of Lovelace (daughter of the poet Byron), who is regarded as the world's first computer programmer for her work with Babbage. She developed a punched-card program to calculate the Bernoulli numbers.
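As a modern illustration only (not a reconstruction of Lovelace's actual punched-card program), the Bernoulli numbers she targeted can be computed in a few lines of Python from their standard recurrence:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0..B_n as exact fractions.

    Uses the standard recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0,
    solved for B_m, with the convention B_1 = -1/2.
    """
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(Fraction(-s, m + 1))
    return B

print(bernoulli(6))  # B_2 = 1/6, B_4 = -1/30, B_6 = 1/42
```

Lovelace had to express an equivalent iteration as a sequence of mill operations on punched cards; the recurrence itself is the same.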

While Babbage's earlier Difference Engine was finally constructed in 1991, his Analytical Engine remains unrealized. As the originator of several important concepts in computing, however, Babbage's place in history is secure.

First Generation (1940-1956) Vacuum Tubes

The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.
First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.
The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercially produced computer; its first unit was delivered to the U.S. Census Bureau in 1951.

Vacuum tube

Alternatively referred to as an electron tube or valve, the vacuum tube was first developed by John Ambrose Fleming in 1904. It is a glass tube from which all gas has been removed, creating a vacuum. Vacuum tubes contain electrodes for controlling electron flow, and early computers used them as switches or amplifiers. The picture shows a collection of different vacuum tubes to give you a better understanding of what they look like. Today, vacuum tubes are no longer used in computers, having been replaced by the transistor.


Advantages:

  1. Vacuum tubes were the latest electronic technology of the time.
  2. They were smaller than the electromechanical machines that preceded them.
  3. The invention of vacuum tubes made digital computers possible.
  4. They were the fastest calculating devices of their age and could process data in milliseconds.


Disadvantages:

  1. Too bulky in size.
  2. Unreliable.
  3. The thousands of vacuum tubes used emitted large amounts of heat and burnt out frequently.
  4. Air conditioning was required.
  5. Prone to frequent hardware failures.
  6. Constant maintenance was required.
  7. Not portable.
  8. Manual assembly of individual components into a functioning unit was required.
  9. Commercial production was difficult and costly.
  10. Limited commercial use.


ENIAC ( Electronic Numerical Integrator And Computer) was the first electronic general-purpose computer. It was Turing-complete, digital, and capable of being reprogrammed to solve "a large class of numerical problems".

ENIAC was initially designed to calculate artillery firing tables for the United States Army's Ballistic Research Laboratory. When ENIAC was announced in 1946, it was heralded in the press as a "Giant Brain". It was about one thousand times faster than the electro-mechanical machines of the day. This computational power, coupled with general-purpose programmability, excited scientists and industrialists.

Technology behind ENIAC 

The ENIAC contained 17,468 vacuum tubes, along with 70,000 resistors, 10,000 capacitors, 1,500 relays, 6,000 manual switches and 5 million soldered joints. It covered 1,800 square feet (167 square meters) of floor space, weighed 30 tons, and consumed 160 kilowatts of electrical power. There was even a rumor that when it was turned on, the ENIAC caused the city of Philadelphia to experience brownouts; however, this was first reported incorrectly by the Philadelphia Bulletin in 1946 and has since become an urban myth.
In one second, the ENIAC (one thousand times faster than any other calculating machine to date) could perform 5,000 additions, 357 multiplications or 38 divisions. The use of vacuum tubes instead of switches and relays created the increase in speed, but it was not a quick machine to re-program. Programming changes would take the technicians weeks, and the machine always required long hours of maintenance. As a side note, research on the ENIAC led to many improvements in the vacuum tube.
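Taking the quoted rates at face value, a quick back-of-envelope calculation (a sketch based only on the figures above) gives the time each operation took:

```python
# Per-operation times implied by the ENIAC rates quoted above.
adds_per_sec = 5000
muls_per_sec = 357
divs_per_sec = 38

add_us = 1_000_000 / adds_per_sec  # 200 microseconds per addition
mul_us = 1_000_000 / muls_per_sec  # about 2,800 microseconds per multiplication
div_us = 1_000_000 / divs_per_sec  # about 26,000 microseconds per division

print(f"add: {add_us:.0f} us, mul: {mul_us:.0f} us, div: {div_us:.0f} us")
```

In other words, a single division cost roughly as much time as 130 additions.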

Input of the ENIAC computer

Media                        Speed
Photoelectric Tape Reader    942 sexadecimal char/sec (78 words/sec)
Card Reader (IBM)            15 rows/sec (100 cards/min)

Output of the ENIAC computer

Media                        Speed
Paper Tape Perforator        6 sexadecimal char/sec (30 words/min)
Teletypewriter               6 sexadecimal char/sec (30 words/min)
Card Punch (IBM)             100 cards/min (800 words/min)


Short for Universal Automatic Computer, the UNIVAC I, a trademark of the Unisys Corporation, was developed by J. Presper Eckert and John Mauchly and first delivered in 1951. The UNIVAC was an electrical computer containing thousands of vacuum tubes that used punch cards and switches for inputting data and punch cards for outputting and storing data. The UNIVAC I was followed by the UNIVAC II and III, and by various later models such as the 418, 490, 491, 1100, 1101, 1102, 1103, 1104, 1105, 1106, 1107, and 1108. Many of these models were owned by only a few companies or government agencies. The public domain picture above shows an example of what the UNIVAC computer looked like. As can be seen in the picture, it was a room-sized computer that often required multiple people to operate.

Input/Output devices : First Generation computer

The external devices of the ENIAC were the IBM card reader and the constant transmitter. Just as external devices are today, the constant transmitter was a mix of electronic and mechanical components. The IBM card reader read values used to set switches from punch cards and supplied this information to the constant transmitter. A punch card could store 80 digits and was read at a speed of 120 to 160 cards per minute. This is slower than the speed at which the ENIAC performed calculations, but since the ENIAC was constructed to solve problems involving many iterations, it was not a major problem. The constant transmitter had switches associated with transceivers, whose input and output ports communicated with the rest of the units of the ENIAC. The card reader could be activated either by a pulse from the Initiation Unit or mechanically by pressing a button.

Neon tubes on the ENIAC displayed the numbers stored in each accumulator, which made it possible for an observer to follow an operation taking place within the machine as he or she saw the digits "rushing by". This was a great help in identifying which unit of the ENIAC was not working correctly when the machine was set to a debugging mode.

Second Generation (1956-1963) Transistors

Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.

The first computers of this generation were developed for the atomic energy industry.

The IBM 7000, NCR 304, IBM 650, IBM 1401, ATLAS and Mark III are examples of second-generation computers.


Advantages:

  1. Smaller in size as compared to first-generation computers.
  2. More reliable.
  3. Less heat generated.
  4. These computers were able to reduce computational times from milliseconds to microseconds.
  5. Less prone to hardware failures.
  6. Better portability.
  7. Wider commercial use.


Disadvantages:

  1. Air-conditioning required.
  2. Frequent maintenance required.
  3. Manual assembly of individual components into a functioning unit was required.
  4. Commercial production was difficult and costly.


Transistors replaced vacuum tubes. Transistors use a semiconducting material to control the flow of electricity through the circuit. Circuits using transistors are smaller, faster, more powerful and more energy efficient than vacuum tube based circuits.


IBM 700/7000 series

The IBM 700/7000 series was a series of large-scale (mainframe) computer systems made by IBM through the 1950s and early 1960s. The series included several different, incompatible processor architectures. The 700s used vacuum-tube logic and were made obsolete by the introduction of the transistorized 7000s. The 7000s, in turn, were eventually replaced by System/360, which was announced in 1964. However, the 360/65, the first 360 powerful enough to replace 7000s, did not become available until November 1965. Early problems with OS/360 and the high cost of converting software kept many 7000s in service for years afterward.

Third Generation (1964-1971) Integrated Circuits

The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on semiconductor silicon chips, which drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.


Advantages:

  1. Smaller in size as compared to previous-generation computers.
  2. Even more reliable than second-generation computers.
  3. Even less heat generated than second-generation computers.
  4. These computers were able to reduce computational times from microseconds to nanoseconds.
  5. Maintenance cost is low because hardware failures are rare.
  6. Easily portable.
  7. Totally general purpose; widely used for various commercial applications all over the world.
  8. Lower power requirement than previous-generation computers.
  9. Manual assembly of individual components into a functioning unit was not required, so the human labour and cost involved at the assembly stage were reduced drastically.
  10. Commercial production was easier and cheaper.


Disadvantages:

  1. Air-conditioning required in many cases.
  2. Highly sophisticated technology required for the manufacture of IC chips.

Integrated circuit (IC)

An integrated circuit (IC), sometimes called a chip or microchip, is a semiconductor wafer on which thousands or millions of tiny resistors, capacitors, and transistors are fabricated. An IC can function as an amplifier, oscillator, timer, counter, computer memory, or microprocessor. A particular IC is categorized as either linear (analog) or digital, depending on its intended application.

Integrated circuits are used for a variety of devices, including microprocessors, audio and video equipment, and automobiles. Integrated circuits are often classified by the number of transistors and other electronic components they contain:

  1. SSI (small-scale integration): up to 100 electronic components per chip
  2. MSI (medium-scale integration): from 100 to 3,000 electronic components per chip
  3. LSI (large-scale integration): from 3,000 to 100,000 electronic components per chip
  4. VLSI (very large-scale integration): from 100,000 to 1,000,000 electronic components per chip
  5. ULSI (ultra large-scale integration): more than 1 million electronic components per chip
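These ranges are easy to encode. The following Python sketch classifies a chip by its component count; the boundary handling is an assumption, since the ranges above overlap at their endpoints:

```python
def integration_scale(components: int) -> str:
    """Classify an integrated circuit by electronic component count,
    following the SSI/MSI/LSI/VLSI/ULSI ranges listed above.
    Boundary values are assigned to the larger class by assumption."""
    if components < 100:
        return "SSI"
    elif components < 3_000:
        return "MSI"
    elif components < 100_000:
        return "LSI"
    elif components < 1_000_000:
        return "VLSI"
    return "ULSI"

print(integration_scale(2_300))  # the Intel 4004's ~2,300 transistors fall in MSI
```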

Fourth Generation (1971-Present) Microprocessors

The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer—from the central processing unit and memory to input/output controls—on a single chip.
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.


Advantages:

  1. Smallest in size because of high component density.
  2. Very reliable.
  3. Heat generated is negligible.
  4. No air conditioning required in most cases.
  5. Much faster in computation than previous generations.
  6. Hardware failure is negligible and hence minimal maintenance is required.
  7. Easily portable because of their small size.
  8. Totally general purpose.
  9. Minimal labour and cost involved at the assembly stage.
  10. Cheapest among all generations.


Disadvantages:

  1. Highly sophisticated technology required for the manufacture of LSI chips.

Fifth Generation (Present and Beyond) Artificial Intelligence

Fifth-generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.