COMPUTER
The modern electronic digital computer is the result of a long series of developments,
which started some 5000 years ago with the abacus. The first mechanical adding device was
developed in 1642 by the French scientist-philosopher Pascal. His 'arithmetic machine' was
followed by the 'stepped reckoner' invented by Leibniz in 1671, which was also capable of
multiplication, division, and the evaluation of square roots by a series of stepped
additions, not unlike the methods used in modern digital computers.
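As a purely illustrative sketch (not a model of Leibniz's mechanism), multiplication by repeated addition can be written in a few lines of Python:

```python
def multiply_by_repeated_addition(a, b):
    """Multiply two non-negative integers using only addition,
    in the spirit of the stepped reckoner's stepped additions."""
    total = 0
    for _ in range(b):
        total += a
    return total

print(multiply_by_repeated_addition(7, 6))  # prints 42
```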
In 1835, Charles Babbage formulated his concept of an 'analytical machine' which combined
arithmetic processes with decisions based on the results of the computations. This was
really the forerunner of the modern digital computer, in that it combined the principles
of sequential control, branching, looping, and storage units.
In the mid-19th-c, George Boole developed the symbolic binary logic which led to Boolean
algebra and the binary switching methodology used in modern computers. Herman Hollerith, a
US statistician, developed punched card techniques, mainly to aid with the US census at
the turn of the century; this advanced the concept of automatic processing, but major
developments awaited the availability of suitable electronic devices. J Presper Eckert and
John W Mauchly produced the first all-electronic digital computer, ENIAC (Electronic
Numerical Integrator and Computer), at the University of Pennsylvania in 1946, which was
1000 times faster than the mechanical computers.
Their development of ENIAC led to one of the first commercial computers, UNIVAC I, in the
early 1950s, which was able to handle both numerical and alphabetical information. Very
significant contributions were made around this time by John von Neumann, who developed
the ENIAC principles into the EDVAC computer (Electronic Discrete Variable Automatic
Computer), which could modify its own programs in much the same way as suggested by
Babbage.
The first stored-program digital computer to run an actual program was built at Manchester
University, UK, and first performed successfully in 1948. This computer was later developed
into the widely sold Ferranti Mark I. The first digital computer to be offered as a service
to users, EDSAC, was developed at Cambridge University, UK, and
ran in the spring of 1949. The EDSAC design was used as the basis of the first business
computer system, the Lyons Electronic Office.
Advances followed rapidly from the 1950s, and were further accelerated from the mid-1960s
by the successful development of miniaturization techniques in the electronics industry.
The first microprocessor, which might be regarded as a computer on a chip, appeared in
1971, and nowadays the power of even the most modest personal computer can equal or
outstrip that of the early electronic computers of the 1940s.
Additional information provided by Cambridge Dictionary of Scientists
The Development of the Computer
Computers today are used to perform a dazzlingly wide range of functions and have become
indispensable to modern life. Although most of their development into their current
electronic form has taken place over the past 20 years, they have their origins in the
mechanical calculating machines of the 17th-c.
Calculating machines are a very primitive form of computer in that they can only perform
one arithmetic operation at a time, whereas computers can be programmed to perform a whole
sequence of operations, using the answers from the first calculation as the input to the
second and so on. This makes them vastly more powerful than the humble calculator.
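A minimal Python sketch of this distinction (the numbers and the function name are invented for illustration): a calculator applies one operation at a time, while a programmed sequence feeds each answer into the next step automatically.

```python
# Calculator-style use: each operation is performed separately and the
# intermediate answer is re-entered by hand.
subtotal = 125 + 75       # first operation
total = subtotal * 1.2    # second operation, keyed in again

# Computer-style use: a stored sequence of operations in which the answer
# from the first calculation automatically becomes the input to the second.
def add_then_scale(values, factor):
    subtotal = sum(values)     # first calculation
    return subtotal * factor   # its answer feeds the second

print(total)                           # 240.0
print(add_then_scale([125, 75], 1.2))  # 240.0
```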
Among the first calculating machines were the 1624 'calculating clock' of Wilhelm
Schickard (1592-1635), which could perform addition and subtraction, PASCAL'S calculator
of 1642 and that of LEIBNIZ in the 1670s. Although Leibniz's invention used a stepped gear
principle which became common in future designs, all of these were essentially curiosities
rather than practical machines.
In 1820 Thomas de Colmar (1785-1870) made a practical calculator which partially
mechanized all four basic arithmetic operations, and in 1875 another major advance was
made with the invention by the American Frank Baldwin (1838-1925) of the pinwheel, a
gearwheel with a variable number of teeth.
These developments led in turn to perhaps the zenith of mechanical calculator technology,
the 'comptometer' of Dorr Felt (1862-1930) in 1885, which was a reliable desktop
calculator with the convenience of entering numbers by striking keys as on a typewriter.
The comptometer became a standard office calculating machine until it was superseded by
electronic devices in the 1970s.
While these were the forerunners of today's calculators, they still lacked the essential
ability of the computer to perform a sequence of operations automatically. The first
attempt at this was made by BABBAGE, who in 1834 conceived, but never built, an
'analytical engine' capable of executing any series of arithmetic operations input via
punched cards and of printing the answer.
Sadly, and despite substantial financial backing and ingenious design, Babbage never saw
any of his machines completed, and many of his ideas were subsequently reinvented by the
pioneers of electronic computers in the 1940s. However, Babbage's machine was to store its
instructions on punched cards, and this concept was turned into reality in the 1890s by
HOLLERITH, who developed the idea into a practical means of storing data that could be
read by mechanical calculating machines (for the American census, in his case). Hollerith
went on to found a company to market his inventions, which subsequently grew to become
IBM.
Even with data storage, mechanical calculating machines were far too slow to be of much
practical value, and DE FOREST'S invention of the thermionic triode in 1907 sowed the
seeds for a potentially much faster type of electronic calculator. A number of
transitional machines marked the passage from mechanical devices to purely electronic
machines, such as those of Konrad Zuse (1910-95), who between 1938 and 1945 used
mechanical parts and electromechanical relays to make several automatic programmable
calculators. In 1943 Howard Aiken (1900-73) devised a giant, electrically driven
mechanical calculator, the Harvard Mark I, which helped demonstrate that large-scale
automatic calculation was possible.
It took the stimulus provided by the Second World War, however, together with the
development at that time of the thermionic valve as a reliable and mass-produced device
(for radio and radar), to open up a new range of possibilities for electronic machines.
Many scientists and engineers were making parallel advances in computing around this
time.
Colossus, a British computer designed in 1943 specifically for code-breaking work, first
established the practical large-scale use of thermionic valves in computers, and the
American ENIAC (Electronic Numerical Integrator And Computer) built in 1945 by John
Mauchly (1907-80) and John Presper Eckert (1919-95) was designed to compute ballistics
tables for the US army. Also involved in the ENIAC project was the mathematician VON
NEUMANN, who went on to formalize the two essential components of the modern
stored-program computer: a central processing unit (CPU) and the ability to hold the
results of calculations in memory and use them in subsequent operations.
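A minimal sketch of the stored-program idea in Python, with an invented three-instruction machine (the instruction set and memory layout here are illustrative only, not any historical design): the program and the data share one memory, and a result written back by one instruction is available to later ones.

```python
# Toy stored-program machine: instructions and data live in the same memory.
memory = [
    ("LOAD", 7),     # 0: put the value 7 in the accumulator
    ("ADD", 35),     # 1: add 35 to the accumulator
    ("STORE", 5),    # 2: write the result into memory cell 5
    ("HALT", 0),     # 3: stop
    0,               # 4: spare data cell
    0,               # 5: the result will land here
]

accumulator = 0
pc = 0  # the program counter fetches one instruction after another
while True:
    op, arg = memory[pc]
    if op == "LOAD":
        accumulator = arg
    elif op == "ADD":
        accumulator += arg
    elif op == "STORE":
        memory[arg] = accumulator
    elif op == "HALT":
        break
    pc += 1

print(memory[5])  # prints 42, the stored result of the calculation
```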
After the war many of these experimental machines began to be developed into commercial
computers. In Manchester the first electronic stored-program machine was run in 1948, and
a collaboration with the Ferranti Company resulted in a number of computers such as
Pegasus (1956), Mercury (1957) and Atlas (1962). In Cambridge, WILKES built the EDSAC
computer in 1949, which was developed in 1951 via a collaboration with the J Lyons Company
into the first machine designed exclusively for business use, LEO (Lyons Electronic
Office).
In 1946 at the National Physical Laboratory, London, TURING, a mathematician who had been
involved in the wartime code-breaking work at Bletchley Park, designed ACE (Automatic
Computing Engine). First run in 1950, ACE was commercialized as DEUCE by the English
Electric Company in 1955. In the USA, Eckert and Mauchly founded the first electronic
computer business and in 1951 produced their first UNIVAC computer. This was used to
correctly predict the results of the US presidential election the following year, a widely
televised feat which did much to popularize the computer.
The next step forward came in the early 1960s, when the transistor, invented by SHOCKLEY,
BARDEEN and BRATTAIN in 1947, began to be used to make a new generation of compact and
relatively power-efficient machines. Even so, computer circuit boards were so
large that their size and complexity limited overall speed and performance. In 1958 Jack
Kilby (1923- ) of Texas Instruments established that a number of transistors could be
manufactured on the same block of semiconductor material, and the following year Robert
Noyce (1927-90) of rival Fairchild Semiconductor devised a way of interconnecting and
integrating such components to form an integrated circuit, or 'microchip'.
The next stage was to put most of the essential components for a complete computer on a
single chip, and the resulting 'microprocessor' was announced by Intel Corporation in
1971. This led to the pocket-sized calculators of the early 1970s and to the development
of the desktop personal computer in 1977.
Subsequent development in computer hardware has largely been one of continued refinement
and miniaturization of the microprocessor components, with doubling of speed and
decreasing price becoming routine. Recent developments in computing have increasingly
focused on the software that runs on the computer, rather than the hardware itself.
Developments such as the graphical user interface (GUI), pioneered by Apple Computer,
Inc., have made sophisticated computer systems accessible and useful to many people. In
areas such as engineering, advanced visualization techniques that use 3D colour
graphics to interactively display and analyse problems have become commonplace.
The development of high-capacity data-storage devices such as CD-ROM has opened up another
role for the computer in publishing and education, and the current development of fast
public information networks and multimedia promises yet more uses, which will combine the
traditional roles of computer, television and telephone. Today the 'computer' effectively
embraces a host of devices and applications based on microprocessor technology, and few
are used just for computing.
From: Webster's World Encyclopedia 2000. Published by Webster Publishing, 1999.
Copyright Webster Publishing, and/or contributors.