The history of computing and computing devices started much earlier. The abacus, a simple counting device invented
in Babylonia in the fourth century BC, is considered by many to be the first computing device.
In 1617 John Napier, the Scottish laird who invented logarithms, described a calculating device consisting of
a series of rods (often called “bones”) that reduced the complex processes of multiplication and division to the
relatively simple tasks of addition and subtraction. Some regard these rods as the first attempt at mechanical
computation.
Blaise Pascal is often credited with the invention of the first mechanical calculator, the Pascaline, in 1642
(there is evidence that Leonardo da Vinci may have beaten Pascal by 150 years). According to Pascal’s memoirs,
he developed the machine to help his father with his work as a tax collector. Pascal’s device could only add
and subtract, but it was possible to perform multiplication and division using a series of additions or
subtractions. What is noteworthy about Pascal’s machine is that it was capable of calculating with eight figures,
and that subtraction was performed using complement techniques. Subtraction using complementary addition
is the same technique used to implement subtraction in most modern computers.
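To see the idea in modern terms, here is a minimal Python sketch of subtraction carried out as complementary addition, the way today’s two’s-complement hardware does it (this illustrates the general technique, not the Pascaline’s decimal mechanism):

    # To compute a - b in n-bit binary, add a to the two's complement of b
    # (bitwise NOT plus one) and discard any carry out of the top bit.
    def subtract_via_complement(a, b, bits=8):
        mask = (1 << bits) - 1                # e.g. 0xFF for 8-bit words
        complement_b = (~b + 1) & mask        # two's complement of b
        return (a + complement_b) & mask      # masking discards the carry

    print(subtract_via_complement(42, 17))    # 25
    print(subtract_via_complement(5, 9))      # 252, which is -4 in 8-bit two's complement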

Many inventors flourished during the industrial revolution. One of these inventors, Joseph Marie Jacquard, invented a loom in 1801 that revolutionized
the weaving industry. Although it was not the first mechanical loom, Jacquard’s loom was novel in that
it could weave complex and intricate patterns automatically.
The key idea behind the loom was that the pattern to be woven into the cloth was encoded by holes punched
in a card. A group of these cards, literally strung together, provided the information required to control the actions of the loom. The Jacquard loom required fewer people and little skill to operate, and versions
of the loom are still in use today. The Jacquard loom also had a profound impact on computing in that it
was one of the first devices that could be programmed. The loom gave birth to the concept of punched cards,
which played a fundamental role in the early days of computing.
Charles Babbage, a mathematician and inventor, grew tired of calculating astronomical tables by hand,
and conceived of a way to build a mechanical device to perform the calculations automatically. In 1822
Babbage started work on a computing device, the difference engine, to automatically calculate mathematical
tables. During the course of his work on the difference engine, he conceived of a more sophisticated
machine he called the analytical engine. The analytical engine was meant to be programmed using punched
cards, and would employ features such as sequential control, branching, and looping. Although Babbage
never built a complete working model of either machine, his work became the basis on which many modern
computers are built. (Babbage’s Difference Engine No. 2 was eventually constructed from his drawings
by a team at London’s Science Museum in the 1990s. The machine weighs 3 tons and is 10 feet wide
by 6 1/2 feet tall.)
In his work on the analytical engine, Babbage made an important intellectual leap regarding the punched
cards. In the Jacquard loom, the presence or absence of each hole in the card physically allows a colored thread
to pass or stops that thread. Babbage realized that the pattern of holes could be used to represent an abstract
idea such as a problem statement or the raw data required for that problem’s solution.
Because of the connection to the Jacquard loom, Babbage called the two main parts of his analytical engine
the “Store” and the “Mill”, as both terms are used in the weaving industry. The Store was where numbers were
held, and the Mill was where they were “woven” into new results. In a modern computer these same parts are
called the memory unit and the central processing unit (CPU).
Perhaps the key concept that separated the analytical engine from its predecessors was that it supported
conditional program execution. This allows the machine to determine what to do next, based upon a condition
or situation that is detected at the very moment the program is running.
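In modern terms this is simply an if statement. A trivial Python illustration (the reading value here is invented) of choosing the next action from a condition known only while the program runs:

    import random

    reading = random.randint(-10, 10)   # stands in for a value produced at run time
    if reading < 0:
        print("negative reading: take the corrective branch")
    else:
        print("non-negative reading: continue on the main path")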
Augusta Ada Byron, the Countess of Lovelace, was a mathematician who worked with Charles Babbage on
his analytical engine. Unlike Babbage, who was interested in building a computing device, Lovelace sought to
understand and reason about methods for computing. She studied these methods, their implementations, and the
properties of their implementations. Lovelace even developed a program that would have been able to compute
the Bernoulli numbers. (Bernoulli numbers comprise a sequence of rational numbers that have many roles in
mathematics and number theory.)
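As an aside, the Bernoulli numbers satisfy a standard recurrence: B_0 = 1 and, for m >= 1, the sum of C(m+1, j)·B_j over j = 0..m equals zero. The short Python sketch below is a modern reconstruction based on that recurrence, not Lovelace’s program, which was expressed as a sequence of operations for the analytical engine:

    from fractions import Fraction
    from math import comb

    # B_0 = 1; for m >= 1, B_m = -(sum of C(m+1, j) * B_j for j < m) / (m + 1)
    def bernoulli(n):
        B = [Fraction(1)]
        for m in range(1, n + 1):
            s = sum(comb(m + 1, j) * B[j] for j in range(m))
            B.append(-s / (m + 1))
        return B

    print([str(b) for b in bernoulli(6)])   # ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42']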
In her published analysis of the analytical engine, Lovelace outlined the fundamentals of computer
programming, including looping and memory addressing. The influence of the Jacquard loom on her work was
evident in her writing, “We may say most aptly that the analytical engine weaves algebraic patterns just as the
Jacquard loom weaves flowers and leaves.”
It is because of this work that many consider Lovelace to be the world’s first programmer.
The US Department of Defense named the computer language Ada in honor of Lovelace’s work as a
programmer.
Figure 1-4 Ada Lovelace (http://www-groups.dcs.st-and.ac.uk/~history/PictDisplay/Lovelace.html).
The 1890 census of the United States marked another milestone in the history of computing when punched
cards were used with automatic sorting and tabulating equipment invented by Herman Hollerith to speed the
compilation of the data. His machines reduced the time required for a full compilation of census results from
10 years to 3 months, and saved the Census Bureau $5,000,000 in costs.
Building on the success of his equipment with the US Census Bureau, Hollerith founded the Tabulating
Machine Company in 1896. After merging with two other companies and changing its name, the company
became known as the International Business Machines (IBM) Corp. The punch card remained a staple of data
storage well into the 20th century.
The 1940s were a decade of dramatic events for the world. World War II changed the face of the world and
many lives forever. Although terrible atrocities were taking place during this period, it was also a time of innovation
and invention in computing. During the 1940s the first electronic computers were built, primarily to support the war effort.
Unfortunately, the clouds of war make it difficult to determine exactly who invented the computer first.
Legally, at least in the United States, John Atanasoff is credited as being the inventor of the computer.
Atanasoff was a professor of mathematics and physics at Iowa State. Atanasoff was frustrated at the difficulty
his graduate students were having finding solutions to large systems of simultaneous algebraic equations for
solving differential equations. Like Babbage, almost 100 years earlier, Atanasoff believed that he could build a
machine to solve these equations.
Working with graduate student Clifford Berry, Atanasoff completed a prototype of his machine
near the end of 1939. Atanasoff and Berry sought simplicity in their computer. The Atanasoff–Berry Computer (ABC) used only 300 vacuum tubes and was capable of performing arithmetic electronically.
Perhaps what is most important about this particular machine is that it operated on base-2 (binary) numbers.
The ABC did not implement the stored program idea, however, so it was not a general-purpose
computer.
During the same time period, Howard Aiken was working on the Mark I computer at Harvard University.
As completed in 1944, the Mark I contained more than 750,000 parts, including switches, relays, rotating shafts,
and clutches. The machine was huge, at 51 feet long, 8 feet high, 2 feet thick, and weighing 5 tons. It had 500
miles of wiring, and three million wire connections. The machine sounded like a “roomful of ladies knitting”
when it was running. Aiken showed that it was possible to build a large-scale automatic computer capable of
reliably executing a program.
Figure 1-7 The Aiken/IBM Mark I Computer installed at Harvard, photograph IBM Corporate Archives.
One of the people who worked with Aiken on the Mark I was Grace Murray Hopper, a freshly commissioned
lieutenant in the US Naval Reserve. Hopper was involved with programming the Mark I from the very start.
One of her most significant contributions to the field of computing was the concept of a compiler. Hopper was
troubled by the mistake-plagued nature of code writing, and developed a piece of software that would translate
an entire set of programmer’s instructions, written in a high-level symbolic language, into the machine’s language.
The first compiler developed by Hopper was named A-0, and was written in 1952.
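The toy Python sketch below conveys the essential move, translating symbolic instructions into numeric machine codes; the mnemonics and opcodes are invented for illustration and bear no relation to the real A-0:

    # A toy "compiler": one symbolic instruction per line in, numeric codes out.
    OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}   # invented opcodes

    def compile_program(source):
        machine_code = []
        for line in source.strip().splitlines():
            mnemonic, *operands = line.split()
            machine_code.append(OPCODES[mnemonic])            # translate the mnemonic
            machine_code.extend(int(op) for op in operands)   # numeric operands pass through
        return machine_code

    print(compile_program("LOAD 7\nADD 5\nSTORE 12\nHALT"))   # [1, 7, 2, 5, 3, 12, 255]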
Grace Murray Hopper is also credited as the individual who coined the term “bug.” During the summer of
1947 the Mark II computer, a successor to the Mark I, was acting strangely. At times it would produce the correct
answer, and at other times the same program would produce erroneous results. Hopper traced the problem
down to a faulty relay within the computer. When she physically examined the relay to correct the problem, she
discovered that a moth had been trapped in the relay, causing it to malfunction. Once she removed the moth
from the relay, the machine functioned normally. The “bug” was taped onto a page of the laboratory’s notebook
with the inscription “First actual case of bug being found.”

After World War II ended, the Allies discovered that Konrad Zuse, a German engineer, had been developing
computers for use by the Germans. Zuse’s first computer, the Z1, was built between 1936 and 1938. The
machine contained all of the parts of a modern computer; however, it was not reliable. Its mechanical construction
was very complex and error-prone. Zuse’s Z3 was the first fully functional program-controlled computer in
the world.
The Z3 was finished in 1941 and predated Aiken’s Mark I. Zuse’s accomplishments are all the more incredible
given the material and worker shortages in Germany during World War II. Zuse couldn’t even obtain paper tape,
so he had to make his own by punching holes in discarded movie film. Zuse also invented what might be the
first high-level computer language, “Plankalkül”, though it, too, was unknown outside Germany.
The work done by the code breakers at Bletchley Park (between London and Birmingham, UK)
during World War II provided the Allies with information that helped turn the tide of the war. Computers
played a vital role in the work of the code breakers and made it possible for them to break the Enigma
and Lorenz ciphers. Colossus, a computer developed at Bletchley Park to break ciphers, became operational
in 1943. Colossus was one of the first major computers to employ vacuum tubes, and was capable of reading
information stored on paper tape at a rate of 5000 characters per second. Colossus also featured limited
programmability.
When the Allies invaded North Africa in 1942, they discovered that the firing tables they used to aim their
artillery were off. This resulted in requests for new ballistics tables that exceeded the capacity to compute them
by hand. John Mauchly and J. Presper Eckert used this opportunity to propose the development of an electronic high-speed
vacuum tube computer. Even though many experts predicted that, given the number of vacuum tubes in the
machine, it would run for only five minutes without failing, they were able to obtain the funding to build
the machine.
Under a cloak of secrecy, they started work on the machine in the spring of 1943. They completed
their work on the machine in 1946. The result was the Electronic Numerical Integrator and
Computer (ENIAC), a machine that weighed 30 tons and was built using 17,468 vacuum tubes and 6000
switches. The machine was more than 1000 times faster than any machine built up to that time. Unlike modern
computers, reprogramming ENIAC required rewiring the basic circuits in the machine. ENIAC heralded
the dawning of the computer age.
Soon after ENIAC became functional, Mauchly and Eckert formed the Electronic Control Corporation
(ECC) and received contracts from the government to design and build a computer for the Bureau of the Census.
ECC developed financial difficulties and as a result sold its patents to, and its founders became employees of, the
Remington Rand Corporation. In 1951 Remington Rand delivered the Universal Automatic Computer
(UNIVAC) to the census bureau.
UNIVAC was the fastest computer of the time and was the only commercially available general-purpose
computer. It contained only 5000 vacuum tubes and was more compact than its predecessors. UNIVAC
computers were sold to government agencies, the A.C. Nielsen Company (market researchers), and Prudential
Insurance. By 1957 Remington Rand had sold over 40 machines.
Probably what made UNIVAC most famous was its use by CBS to predict the results of the 1952 presidential
election. Opinion polls predicted that Adlai Stevenson would beat Dwight D. Eisenhower by a landslide.
UNIVAC’s analysis of early returns, however, showed a clear victory for Eisenhower. Newscasters Walter
Cronkite and Charles Collingwood questioned the validity of the computer’s forecast, so they postponed
announcing UNIVAC’s prediction until very late.
For many years, Mauchly and Eckert were considered the inventors of the electronic computer. In fact they
applied for a patent on their work in 1947 (the patent was eventually granted in 1964). After purchasing ECC, Remington Rand owned the
rights to their patent and was collecting royalties from firms building computers. In a legal battle, initiated by
Honeywell’s refusal to pay royalties, a judge ruled the original patent invalid. Part of his decision to invalidate
the patent was based on the fact that Mauchly had visited John Atanasoff’s laboratory in 1941, and used the
knowledge he gained during the visit to build ENIAC. The results of this lawsuit legally established John
Atanasoff as the inventor of the modern computer.
After the war, commercial development of computers continued, resulting in the development of many new
machines that provided improved performance in terms of computing capability and speed. Computers at this
time were large, cumbersome devices that were capable of performing simple operations. These machines were
very expensive to build and maintain. The only organizations that could afford to purchase and run the
equipment were the government and large corporations.
Not surprisingly, many individuals working in the computing field felt that the use of computers would be
limited. In a 1950 article, Business Week noted, “Salesmen will find the market limited. The UNIVAC is not the
kind of machine that every office could use.” And though the story is probably apocryphal, the lore of computing
attributes the following prediction to Thomas Watson, the longtime head of IBM, in 1943: “I think there is a world
market for maybe five computers.”
In the late 1940s, a group of scientists working at Bell Laboratories in New Jersey was studying
the behavior of crystals as semiconductors in an attempt to replace vacuum tubes. Their work resulted in the
invention of the transistor in 1947, which changed the way computers and many other electronic devices were built.
Transistors switch and modulate electric current in much the same way as a vacuum tube. Using transistors
instead of vacuum tubes in computers resulted in machines that were much smaller and cheaper, and that
required considerably less electricity to operate. The transistor is one of the most important inventions of the
20th century.
While computer companies such as IBM and Honeywell focused on the development of mainframe
computers, Digital Equipment Corporation (DEC) focused on the development of smaller computers. DEC’s
PDP series of computers were small and designed to serve the computing needs of laboratories. The PDP-8 was
one of the first computers purchased by end users. Because of their low cost and portability, these machines
could be purchased to fill a specific need. The PDP-8 is generally regarded as the first minicomputer.
The invention of the integrated circuit caused the trend toward smaller, cheaper, and faster computers to
accelerate. Popular Electronics featured an article on a kit that home hobbyists could purchase to build a
computer at home. This machine, first offered in 1974, was the Altair 8800, manufactured
by a company named MITS. It ushered in the personal computer era. These initial machines were designed
to be built at home, which was fine for the home hobbyist but limited the availability of the machine. The first
programming language for the Altair was Altair BASIC, the first product of a little company called Microsoft.
In 1981 IBM introduced its personal computer, or PC, which changed the face of computing forever.
It was now possible for individuals to simply purchase these machines and use them at home.