Since the beginning of the Industrial Revolution, people have wanted and needed an easier way of calculating and measuring. Through the dreams of Charles Babbage, the computer was born. His machines could perform ordinary arithmetic far faster than any human. Sadly, his ideas were not appreciated until almost one hundred years later.
In the 1940’s, the idea of computers was brought up again, and people finally started crediting Babbage’s work. The technology now available made it possible to construct a digital computer, and building one became a necessity when World War II came about. Important names such as ENIAC and IBM appeared, and computers became a subject of wide interest around the world.
What influenced personal computers of today?
The thought of a machine being more capable than a mathematician was laughed at and considered an impossibility. That all changed when Charles Babbage was brought into the world.
Charles Babbage was a mathematician, an engineer, and an early computer designer. He is known as the “Grandfather” (Slater 3) of the modern-day computer, and he was, and still is, thought of as ahead of his time.
Charles Babbage entered Trinity College, Cambridge in 1810. There he studied mathematics and chemistry. Between 1815 and 1820, he was involved mostly in mathematics, studying algebra.
In 1822, he built a working model of his first mechanical computer, the Difference Engine, which could add, subtract, divide, and multiply. In 1834 he began designing a much more advanced machine, the Analytical Engine. Steam powered and fully automatic, it would have been his greatest achievement. Unfortunately, the technology available to him was not advanced enough for Babbage to build what would have been the first digital computer. Another reason he never built his Analytical Engine was that he almost never completely finished a project; he was obsessed with perfection.
Charles Babbage died in 1871. Sadly, he was forgotten for seventy years, until the computer revolution. In the 1940’s, he finally got recognition for his ideas: some of the first digital computers were in fact very similar to his plans for the Analytical Engine. Technology had advanced enormously by the 1940’s, a perfect environment for the digital computer to be born.
The vacuum tube was already used widely in all types of electronic devices, including televisions and radios. Engineers found that large numbers of tubes, switched on and off, could represent the zeros and ones of computer code. The government was the only place to go for a grant to fund such a project.
The government at first wanted nothing to do with computers. They were thought of as too radical an idea, with no real need for them, a view held mostly by the military. However, this view would soon change.
When World War II started, the computer had new hope. New weapons were designed that required trajectory tables for guidance, and tracking enemy and friendly planes, as well as guiding missiles and bombs to their targets, was extremely difficult work for people. Then, in 1949, the Soviet Union detonated an atomic bomb, and enemy targets had to be neutralized much more quickly. In other words, people were too slow.
A computer would do all of these jobs much more quickly and precisely than humans could. In 1949, the U.S. military gave in.
Construction of this new computer was started immediately. When it became operational in 1946, it was named the ENIAC, for Electronic Numerical Integrator And Computer. It used 17,468 vacuum tubes, covering three gigantic walls. It was very much quicker than any human mathematician and could track dozens of airplanes at once while performing other tasks. There was only one major flaw: every time a new problem was set up, the machine had to be rewired, with cables unplugged and plugged in again by hand.
New technology was already being developed that made the ENIAC look sluggish, so a newer computer was built: the EDVAC, short for Electronic Discrete Variable Automatic Computer, first proposed in 1945. The EDVAC made use of about 4,000 vacuum tubes and 10,000 crystal diodes. It was faster than the ENIAC, but the main evolution was that the EDVAC could store data for long periods of time. It was first used on April 20, 1951, and stayed in service into the early 1960’s.
Project Whirlwind was the really drastic milestone, though not at first. In the beginning it simply used improved vacuum tubes. A basic vacuum tube had a life of about 500 hours; Whirlwind’s tubes used a silicon-free cathode, which all but eliminated the contamination that wore cathodes out and increased tube life to 5,000 hours. That meant less time and money spent replacing tubes. Even so, the machine was still unreliable and broke down, and although its vacuum tubes were far more efficient, they still had to be replaced often. Computers needed a new form of memory storage.
Finally, the big breakthrough occurred: memory made from magnetic rings. A special metal called ferrite was used for these rings, or cores, and it was much less expensive than vacuum tubes. The cores could also store information as long as needed without power, while vacuum tubes needed a constant supply of electricity to do the same. A man named Jay Forrester proved the idea very useful. He strung many of these donut-shaped cores on a wire grid; each core had its own location on the grid and could be accessed and used much more quickly than a vacuum tube. When a vacuum tube was turned on, it represented a one, and when off, a zero. The ferrite cores worked in much the same way: magnetized one way, a one, and the other way, a zero. This design was named Random-Access Coincident-Current Memory, and it more than doubled operating speed. The prototype was improved and received a shorter name, RAM, for Random Access Memory. This type of memory is used and then rewritten for its next instructions.
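The on/off and magnetized states described above are simply binary digits. As a purely illustrative sketch in modern Python (not how the hardware itself worked), reading a row of such one-bit states as a number looks like this:

```python
# Each storage element (vacuum tube or ferrite core) holds one bit:
# 1 for "on" or magnetized one way, 0 for "off" or the other way.
word = [1, 0, 1, 1]  # one 4-bit memory word, most significant bit first

# Reading the word means combining its bits into a single number.
value = 0
for bit in word:
    value = value * 2 + bit

print(value)  # the bit pattern 1011 is the number 11
```

Strings of thousands of such cores, each individually addressable on the wire grid, are what made random access possible.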
Other important advances came about around this time. One of these was time sharing. Many people wanted to use these new computers, but the machines were too expensive and large to fit in a house. In time sharing, one computer worked on many problems at the same time, switching rapidly between them under the control of a supervisor, or “executive,” program.
Another method of making computers quicker and more available to the public was batch processing. In batch processing, problems are prepared and held on magnetic drums, disk packs, or tapes. After they are solved, the results are displayed or printed, and the problem is then deleted, or “dumped,” to make room for the next one. With all of these new advances, the computer was becoming easier to use. Also, with the new ferrite cores and other smaller parts, the computer was shrinking rapidly. Soon computers would be affordable and small enough for every household to have one.
This all became possible with two simple inventions: the transistor and the microprocessor. The transistor worked like a vacuum tube. A vacuum tube is an electrical valve used to control the flow of electricity and to amplify electric signals; tubes are about the size of a light bulb and cost about one dollar. The transistor, a small plastic casing with three fine wire leads, does everything a vacuum tube does, yet it costs about five cents and is about as big as a fingertip. It also holds more information: the transistor increased RAM amounts in computers from 8,000 to 64,000 words. Eventually, transistors were made incredibly small, allowing hundreds of them to be placed on one “chip.”
They were faster, too. Transistors cut computer access times from three milliseconds to two. By 1980, transistors were so small that hundreds of thousands could be placed on one chip about the size of a fingernail.
Transistors made computers small enough to fit on a desktop. However, there was still no cheap way of controlling them. Older computers used a central processing unit built from many separate parts, which made it very large and extremely expensive. Thus, the birth of the microprocessor: an entire processing unit on a single chip. Intel created the first microprocessor, named the 4004. This microprocessor (as with all microprocessors today) relied on ROM, or Read Only Memory, which stores constantly used, unchanging instructions. The microprocessor controls all the functions of the computer. There were many improvements over the 4004, and eventually, by the late 1980’s, some personal computers run by microprocessors could execute 4,000,000 instructions per second.
However, not many people had computers. Sure, they were very powerful, small enough, and affordable enough, but what could they really do? In order for these new machines to come into the household, they needed something interesting to do. “ ‘… People don’t realize that frivolity is the gateway to the future, in that most future products don’t start as necessities, but toys.’ ” (qtd. in Slater 300) These are the words of Nolan Bushnell, the founder of Atari, a big company that manufactured arcade games. Atari means “Watch out or I’m going to get you on the next turn.” (Slater 301) Atari eventually got into the home computer business when it created home video games. This was the key to getting computers into homes: fun. By 1985, two out of five houses that had a television also had a computer. Eventually, this allowed personal computers to advance into what they are today.
Today’s computers are highly advanced. They run on new operating systems, such as Windows 95, Windows NT, and Windows 98. These operating systems are so simple that virtually anyone can use a computer. Modern computers have CD-quality sound and many accessories, programs, and peripherals, such as the mouse and joystick. They make use of modems and phone lines to communicate with other computers, and can literally give the user access to anywhere in the world via the Internet. Some computers use Pentium technology, making clock and access speeds even quicker, and many of these also use new MMX technology, improving graphics and video speed and quality.
Processors have gotten so fast that they are now clocked in megahertz (one million cycles per second), the rate at which they read ROM and RAM. Computers now can use timers and voice recognition, which means you can control your whole house by linking it with your computer. Your entire house can be completely automated.
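To make the megahertz figure concrete, here is a rough back-of-the-envelope calculation in Python. It assumes, for simplicity, one instruction per clock cycle; real processors vary, so treat the numbers as illustrative only:

```python
clock_hz = 1_000_000      # a 1 MHz clock: one million cycles per second
instructions = 4_000_000  # the late-1980's figure mentioned earlier

# Assuming one instruction per cycle, the time to run them all:
seconds = instructions / clock_hz
print(seconds)  # 4.0 seconds at 1 MHz

# A 200 MHz processor would need 1/200 of that time.
print(seconds / 200)  # 0.02 seconds
```

The same workload shrinks in direct proportion to the clock rate, which is why clock speed became the headline number for personal computers.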
Computers today also use highly advanced forms of input. One of the newest forms is the CD-ROM, and even newer, the DVD-ROM. CD-ROM stands for Compact Disk Read Only Memory. These disks can hold up to 600 megabytes of information and are much quicker than regular floppy disks. A CD-ROM consists of an ultra-thin metal sheet enclosed in hard plastic, stamped with microscopic pits, where a pit stands for a one and a flat area for a zero. DVD-ROMs are advanced CD-ROMs that hold far more information (4.7 gigabytes on a standard single-layer disc), and they can also hold full-length movies in high quality.
Nearly anything can be stored on a computer: pictures, sound, movies, and more. These files are stored by either internal or external means. When information is stored internally, it is kept on a hard disk, which is like a regular floppy disk but with much larger storage and quicker access speed (it is also physically slightly larger). These drives sometimes store so much information that their capacity must be registered in gigabytes. To put it in perspective, one gigabyte is equal to 1,000 megabytes, one megabyte is equal to 1,000 kilobytes, and one kilobyte is equal to 1,000 bytes, each of which can represent a letter or number.
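The unit ladder above is just repeated multiplication by 1,000 (the decimal convention used in this text; operating systems often use 1,024 instead). A small Python sketch of the arithmetic:

```python
KILOBYTE = 1_000             # bytes
MEGABYTE = 1_000 * KILOBYTE  # 1,000,000 bytes
GIGABYTE = 1_000 * MEGABYTE  # 1,000,000,000 bytes

# A 600-megabyte CD-ROM, expressed in bytes:
cd_bytes = 600 * MEGABYTE
print(cd_bytes)             # 600000000

# The same capacity expressed in gigabytes:
print(cd_bytes / GIGABYTE)  # 0.6
```

Each step up the ladder trades a factor of 1,000 in count for a shorter, more readable number.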
The computer started in the thoughts of Charles Babbage. He created the mechanical computer, the Difference Engine. He was working on a new, more advanced computer, but he died before it could be completed.
In the 1940’s, the interest in computers was re-sparked. With newer technology and the vacuum tube, the ENIAC was created: an incredibly large machine that filled an entire office floor. New advancements then made computers smaller and smaller.
Eventually they were small enough to fit in the home, with the invention of the transistor and microprocessor. They were brought into homes with the Atari game system, and computers were a large success.
Nowadays, computers 100 times faster than the ENIAC can fit on a desktop. They can be linked together through phone wires and modems, and can store millions of times more information than any computers of the 1940’s and 1950’s. Computers are a very big part of everyday life, and it all started with one little dream of a regular man, Charles Babbage.
Slater, Robert. Portraits in Silicon. London: The MIT Press, 1987.
Computer History Association of California.
HCS Virtual Computer History Museum.
Perspectives of the Smithsonian: Smithsonian Computer History