BCBusiness brings you a short timeline of the evolution of computers. From ancient Greece to Burnaby, British Columbia, computers have been solving problems and making regular people wonder: what's next? Image source: iStock/Lagui
Humankind has invented many ways to process and calculate data more easily and quickly. Computer history can be traced back some 3,000 years, to when the abacus was widely used. The abacus isn't a computer, because it can't be programmed, but it was the first mechanical device to perform basic arithmetic calculations.
Later, the Greeks built the Antikythera mechanism, a device that predicted the movements of the planets and stars, and events like the Olympic Games, too. The device was recently reconstructed by a group of scientists and amateurs in Britain. Click on the video to see a demonstration.
It wasn't until the 19th century that an automatic machine was created. Joseph Jacquard, building on Falcon and Vaucanson's punchcard system, perfected a treadle-operated textile loom that used the cards to control the weaving of fabric patterns.
The first automatic computing engine, however, was designed by Charles Babbage in 1837. The machine was meant to help with the calculation of astronomical tables, and it used punchcards and steam power.
The machine was never built in Babbage's lifetime, but in 1989 the Science Museum in London started a 17-year project to build it from his original designs. The machine will be displayed and demonstrated at the Computer History Museum in Mountain View, California, until May 2009.

Also building on the Jacquard loom, Herman Hollerith (whose company later became part of IBM) invented the standard punchcard, which was used from the 1890 US Census until the late 1980s. From the late 19th century through the first half of the 20th, electromechanical punchcard machines were the primary technology for performing automatic calculations.
FIRST GENERATION OF MODERN COMPUTERS
It wasn't until 1938 that the first electromechanical digital computer was built, by the German engineer Konrad Zuse. His machine, the Z1, used the binary number system, was programmable and had a memory of 64 words. Zuse built it in his parents' living room, in isolation from other scientists. Image source: Epemag.com/zuse
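For readers curious about the binary number system the Z1 relied on, here is a minimal sketch in Python. It is an illustration only, with no connection to Zuse's actual hardware: it simply converts a decimal number into base 2 and back.

```python
# Illustrative only: what a decimal number looks like in binary (base 2).

def to_binary(n: int) -> str:
    """Convert a non-negative integer to its binary representation."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder is the next binary digit
        n //= 2
    return "".join(reversed(bits))

print(to_binary(22))    # "10110"
print(int("10110", 2))  # 22, converting back with Python's built-in parser
```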
During World War II, computer science developed at great speed. While Zuse was finishing his binary computer, the German military was using the Enigma and Lorenz cipher machines; meanwhile, the British built Colossus, considered the first computer to abandon mechanical parts entirely and operate electronically, using valves (vacuum tubes). Image source: iStock/icefront
American scientists were also busy perfecting computers that could help them win the war. Examples include the Atanasoff-Berry Computer, the Mark I and its various iterations, and the ENIAC, completed after the war, which could perform up to five thousand additions per second and was used in the 1950 US Census. It had almost 18,000 vacuum tubes and filled an entire room in the basement of the University of Pennsylvania.
SECOND GENERATION: TRANSISTORS
The second half of the 20th century also welcomed the second generation of modern computers. In 1947, the transistor was developed by William Shockley, John Bardeen and Walter Houser Brattain. The transistor is a simple device that can amplify an electronic signal or switch it between 1 and 0 faster than a vacuum tube could. Transistors soon supplanted vacuum tubes in electronic devices, allowing smaller, cheaper and faster machines to be built.
"IBM's 7000 series computers were the first machines to use transistors. Back in 1959, the 7090 was the most powerful computer in IBM's lineup. The fully transistorized system had computing speeds six times faster than its vacuum tube predecessor, the IBM 709, and it was 7.5 times faster than the IBM 704. Although the 7090 was a general purpose data processing system, it was designed with special attention to the needs of engineers and scientists." Source: IBM.com/history/exhibits

THIRD GENERATION: CHIPS
The third generation of computers started when Jack St. Clair Kilby created the integrated circuit, or "chip": a miniature circuit of transistors built onto a semiconducting surface. Robert Noyce, one of the co-founders of Intel, created his own chip months later, improving on Kilby's design and making the integrated circuit more suitable for mass production.
“On April 7, 1964, IBM introduced the System/360, the first large “family” of computers to use interchangeable software and peripheral equipment. It was a bold departure from the monolithic, one-size-fits-all mainframe. Fortune magazine dubbed it “IBM’s $5 billion gamble.” Source: IBM.com/history
FOURTH GENERATION: MICROCHIPS
The fourth generation of computers started in 1971, when the microprocessor, or microchip, was built by Intel scientists Federico Faggin and Ted Hoff. The first microprocessor had more than 2,000 transistors and could perform 90,000 calculations per second. This is when microcomputers started to become available on the market. The microprocessor made possible the surge of software and hardware companies like IBM, Compaq, Apple and Hewlett-Packard. In 1974, Intel greatly improved on its first microprocessor with the Intel 8080, which became the brain of the first personal computer, the Altair, sold as an assembly kit. Image source: www.apple2history.org
Another example worth mentioning is the first Apple computer, the Apple 1, which used a cheaper alternative to the Intel 8080 chip and placed everything on a single circuit board; it was also the first to use a keyboard instead of switches and light bulbs.

FIFTH GENERATION: QUANTUM COMPUTERS
In a classical computer, the "bit" is the basic unit of information, taking a value of 1 or 0; the data is processed by electrons passing through transistors inside microchips. In a quantum computer, the basic unit is the "quantum bit," or "qubit," and the data is processed by circuits on an atomic scale. Qubits theoretically have the capacity to take the values 1 and 0 at the same time (superposition), which means a quantum processor could perform many calculations at once, becoming more powerful than any computer ever built. In the meantime, you can watch the demonstration that D-Wave Inc. gave in November 2007 of its 16-qubit computer:
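To make "superposition" a little more concrete, here is a minimal toy sketch in Python. It illustrates only the statistics of measuring a single qubit, under textbook assumptions, and is not a model of D-Wave's hardware: a qubit prepared in an equal superposition of 0 and 1 collapses to one definite value each time it is measured, with the odds set by its amplitudes.

```python
import random

# Toy illustration only: a single qubit described by two amplitudes,
# one for the |0> state and one for the |1> state (|a0|^2 + |a1|^2 = 1).
# Measurement collapses the superposition to 0 with probability |a0|^2.

def measure(a0: float, a1: float) -> int:
    """Simulate one measurement of a qubit with amplitudes (a0, a1)."""
    return 0 if random.random() < abs(a0) ** 2 else 1

# Equal superposition: both outcomes equally likely.
a0 = a1 = 2 ** -0.5  # 1/sqrt(2)

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(a0, a1)] += 1

print(counts)  # roughly 5,000 zeros and 5,000 ones
```

Until it is measured, the qubit genuinely carries both possibilities at once; real quantum processors exploit interference between many such superposed states, which is where the hoped-for speed-up over classical machines comes from.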