The Evolution of Personal Computers
There is some argument about which machine was actually the very first personal computer. Some say it was the MITS Altair 8800; others claim it was the Apple I. The answer may depend on how you define just what a personal computer is or was.
The Altair 8800 was a kit computer that fast became the favorite of hobbyists looking to get a computer all their own. Ed Roberts and his company, MITS (Micro Instrumentation and Telemetry Systems), developed this early personal computer kit, which was named the Altair by his daughter after a destination on the TV show Star Trek. The software for the MITS machine was a BASIC programming language interpreter written by a fledgling company called Microsoft. However, like nearly all early personal computers, the Altair 8800 did not have off-the-shelf application software, and users had to write their software themselves using the BASIC interpreter. While this was a challenge, to those kindred spirits looking to get in on the computing craze, it wasn't a problem.
In 1976, after seeing a demonstration of the Altair 8800, two young computer enthusiasts, Steve Jobs and Steve Wozniak, set out to build a computer of their own, which they named the Apple I. Like its predecessors, the Apple I gained a following that encouraged its young developers to continue. The Apple II soon followed, bolstered by what may have been the first killer application, an early spreadsheet program called VisiCalc, and became a commercial success.
It wasn't long before nearly every mainframe and minicomputer manufacturer leaped into the personal computer market. IBM, Digital Equipment, and others soon had their own PCs in the marketplace. The IBM PC and its extended technology (XT) and advanced technology (AT) versions soon became the standard for computers using Intel microprocessors, while Apple Computer continued to carve out its own niche. The IBM PC AT and the Apple Macintosh were commercially successful PCs that largely defined the personal computer in terms of its size, shape, and functions, a standard that has continued until today. This is the point at which we will begin looking at the technology of the PC and its hardware.
Here is a list of some of the key events that have led to the personal computer as we know it today. Each of these events was instrumental in either the development of the hardware of the PC or its software.
Abacus
The abacus was an early aid for mathematical computations. Its only value is that it aids the memory of the human performing the calculation. A skilled abacus operator can work addition and subtraction problems at the speed of a person equipped with a hand calculator (multiplication and division are slower). The abacus is often wrongly attributed to China; in fact, the oldest surviving abacus was used around 300 B.C. by the Babylonians. The abacus is still in use today, principally in the Far East. A modern abacus consists of rings that slide over rods, but the older one pictured below dates from the time when pebbles were used for counting.
Napier's Bones
In 1617 an eccentric (some say mad) Scotsman named John Napier invented logarithms, a technique that allows multiplication to be performed via addition. The magic ingredient is the logarithm of each operand, which was originally obtained from a printed table. But Napier also invented an alternative to tables, where the logarithm values were carved on ivory sticks, which are now called Napier's Bones.
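The trick behind logarithms can be sketched in a few lines: because log(a × b) = log(a) + log(b), a multiplication reduces to an addition followed by a single table lookup of the antilogarithm. Here is a minimal illustration (the function name is our own, chosen for this example):

```python
import math

# Multiply two positive numbers using only addition of their logarithms:
# log(a * b) = log(a) + log(b), so a * b = exp(log(a) + log(b)).
# Napier's tables played the role that math.log and math.exp play here.
def multiply_via_logs(a, b):
    return math.exp(math.log(a) + math.log(b))

product = multiply_via_logs(37.0, 54.0)
print(product)  # effectively 37 * 54 = 1998 (up to floating-point rounding)
```

In Napier's day the lookup and the rounding error came from the printed table rather than from floating-point arithmetic, but the principle is the same one that later made the slide rule work.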
The Slide Rule
Napier's invention led directly to the slide rule, first built in England in 1632 and still in use in the 1960s by the NASA engineers of the Mercury, Gemini, and Apollo programs, which landed men on the moon.
Calculating Clock
The first gear-driven calculating machine to actually be built was probably the calculating clock, so named by its inventor, the German professor Wilhelm Schickard, in 1623. The device got little publicity because Schickard died of the bubonic plague soon afterward.
Pascaline
In 1642 Blaise Pascal, at age 19, invented the Pascaline as an aid for his father, who was a tax collector. Pascal built 50 of this gear-driven one-function calculator (it could only add) but couldn't sell many because of their exorbitant cost and because they really weren't that accurate (at that time it was not possible to fabricate gears with the required precision). Until car dashboards went digital, the odometer portion of a car's speedometer used the very same mechanism as the Pascaline to increment the next wheel after each full revolution of the prior wheel. Pascal was a child prodigy: at the age of 12, he was discovered doing his version of Euclid's thirty-second proposition on the kitchen floor. Pascal went on to invent probability theory, the hydraulic press, and the syringe. Shown below is an 8-digit version of the Pascaline, and two views of a 6-digit version.
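The odometer-style carry the Pascaline relied on is easy to sketch in code. This is only an illustration of the ripple-carry idea, not a model of Pascal's actual gearwork (his machine used a gravity-assisted carry lever):

```python
# Each "wheel" holds a digit 0-9; a full revolution of one wheel
# (rolling over from 9 back to 0) advances the next wheel by one.
def increment(wheels):
    """Add 1 to the counter; wheels[0] is the least significant digit."""
    for i in range(len(wheels)):
        wheels[i] += 1
        if wheels[i] < 10:
            break          # no rollover, carry stops here
        wheels[i] = 0      # wheel passed 9: reset and carry to next wheel

counter = [9, 9, 0]        # reads as 099
increment(counter)
print(counter)             # [0, 0, 1], i.e. the counter now reads 100
```

The same chain of carries is what you once saw on a mechanical odometer when 099999 rolled over to 100000.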
Leibniz's Calculating Machine
Just a few years after Pascal, the German Gottfried Wilhelm Leibniz (co-inventor with Newton of calculus) managed to build a four-function (addition, subtraction, multiplication, and division) calculator that he called the stepped reckoner because, instead of gears, it employed fluted drums having ten flutes arranged around their circumference in a stair-step fashion. Although the stepped reckoner employed the decimal number system, Leibniz was the first to advocate use of the binary number system, which is fundamental to the operation of modern computers. Leibniz is considered one of the greatest of the philosophers, but he died poor and alone.
Charles Babbage's Difference Engine (Analytical Engine)
By 1822 the English mathematician Charles Babbage was proposing a steam-driven calculating machine the size of a room, which he called the Difference Engine. This machine would be able to compute tables of numbers, such as logarithm tables. He obtained government funding for this project due to the importance of numeric tables in ocean navigation; by promoting their commercial and military navies, the British government had managed to become the earth's greatest empire. But in that time frame the British government was publishing a seven-volume set of navigation tables that came with a companion volume of corrections, which showed that the set had over 1,000 numerical errors. It was hoped that Babbage's machine could eliminate errors in these types of tables. But construction of Babbage's Difference Engine proved exceedingly difficult, and the project soon became the most expensive government-funded project up to that point in English history. Ten years later the device was still nowhere near complete, acrimony abounded between all involved, and funding dried up. The device was never finished.
Babbage was not deterred, and by then was on to his next brainstorm, which he called the Analytical Engine. This device, large as a house and powered by six steam engines, would be more general purpose in nature because it would be programmable, thanks to the punched-card technology of Jacquard. But it was Babbage who made an important intellectual leap regarding the punched cards. In the Jacquard loom, the presence or absence of each hole in the card physically allows a colored thread to pass or stops that thread (you can see this clearly in the earlier photo). Babbage saw that the pattern of holes could be used to represent an abstract idea, such as a problem statement or the raw data required for that problem's solution. Babbage saw that there was no requirement that the problem matter itself physically pass through the holes.
Furthermore, Babbage realized that punched paper could be employed as a storage mechanism, holding computed numbers for future reference. Because of the connection to the Jacquard loom, Babbage called the two main parts of his Analytical Engine the "Store" and the "Mill," as both terms are used in the weaving industry. The Store was where numbers were held, and the Mill was where they were "woven" into new results. In a modern computer these same parts are called the memory unit and the central processing unit (CPU).
Dr. Herman Hollerith
Hollerith built a company, the Tabulating Machine Company, which, after a few buyouts, eventually became International Business Machines, known today as IBM. IBM grew rapidly, and punched cards became ubiquitous. Your gas bill would arrive each month with a punch card you had to return with your payment. This punch card recorded the particulars of your account: your name, address, gas usage, etc. As another example, when you entered a tollway (a highway that collects a fee from each driver) you were given a punch card that recorded where you started, and when you exited from the tollway your fee was computed based upon the miles you drove. When you voted in an election, the ballot you were handed was a punch card. The little pieces of paper punched out of the card are called "chad" and were thrown as confetti at weddings. Until recently, all Social Security and other checks issued by the Federal government were actually punch cards.
Howard Aiken
One early success was the Harvard Mark I computer, built as a partnership between Harvard and IBM in 1944. This was the first programmable digital computer made in the U.S., but it was not a purely electronic computer. Instead, the Mark I was constructed out of switches, relays, rotating shafts, and clutches. The machine weighed 5 tons, incorporated 500 miles of wire, was 8 feet tall and 51 feet long, and had a 50-foot rotating shaft running its length, turned by a 5-horsepower electric motor. The Mark I ran non-stop for 15 years, sounding like a roomful of ladies knitting. To appreciate the scale of this machine, note the four typewriters in the foreground of the following photo.
Electronic Numerical Integrator and Calculator (ENIAC):
The title of forefather of today's all-electronic digital computers is usually awarded to ENIAC, which stood for Electronic Numerical Integrator and Calculator. ENIAC was built at the University of Pennsylvania between 1943 and 1945 by two professors, John Mauchly and J. Presper Eckert, who got funding from the War Department after promising they could build a machine that would replace all the "computers," meaning the women who were employed calculating the firing tables for the army's artillery guns. The day that Mauchly and Eckert saw the first small piece of ENIAC work, the people they ran to bring to their lab to show off their progress were some of these female computers (one of whom remarked, "I was astounded that it took all this equipment to multiply 5 by 1,000").
ENIAC filled a 20-by-40-foot room, weighed 30 tons, and used more than 18,000 vacuum tubes. Like the Mark I, ENIAC employed paper card readers obtained from IBM (these were a regular product for IBM, as they were a long-established part of business accounting machines, IBM's forte). When operating, ENIAC was silent, but you knew it was on because the 18,000 vacuum tubes each generated waste heat like a light bulb, and all this heat (174,000 watts) meant that the computer could only be operated in a specially designed room with its own heavy-duty air conditioning system. Only the left half of ENIAC is visible in the first picture; the right half was basically a mirror image of what's visible.
UNIVAC I (Universal Automatic Computer):
By the end of the 1950s, computers were no longer one-of-a-kind, hand-built devices owned only by universities and government research labs. Eckert and Mauchly left the University of Pennsylvania over a dispute about who owned the patents for their invention and decided to set up their own company. Their first product was the famous UNIVAC computer, the first commercial (that is, mass-produced) computer. In the 1950s, UNIVAC (a contraction of "Universal Automatic Computer") was the household word for "computer," just as "Kleenex" is for "tissue." The first UNIVAC was sold, appropriately enough, to the Census Bureau. UNIVAC was also the first computer to employ magnetic tape. Many people still confuse a picture of a reel-to-reel tape recorder with a picture of a mainframe computer.
The Microchip and the Microprocessor
The microchip (or integrated circuit) is one of the most important advances in computing technology. Many overlaps in history existed between microchip-based computers and transistor-based computers throughout the 1960s, and even into the early 1970s.
Microchips allowed the manufacturing of smaller computers. The microchip spurred the production of minicomputers and microcomputers, which were small and inexpensive enough for small businesses and even individuals to own. The microchip also led to the microprocessor, another breakthrough technology that was important in the development of the personal computer.
Personal computers
The first personal computers were built in the early 1970s. Most of these were limited-production runs, and worked based on small-scale integrated circuits and multi-chip CPUs.
The Altair 8800 was the first popular computer to use a single-chip microprocessor. It was sold in kit form to electronics hobbyists, meaning purchasers had to assemble their own computers.
Clones of this machine quickly cropped up, and soon there was an entire market based on the design and architecture of the 8800. It also spawned a club based around hobbyist computer builders, the Homebrew Computer Club.
1977 saw the rise of the "Trinity" (based on a reference in Byte magazine): the Commodore PET, the Apple II, and the Tandy Corporation's TRS-80. These three computer models eventually went on to sell millions. These early PCs had between 4KB and 48KB of RAM. The Apple II was the only one with a full-color, graphics-capable display, and it eventually became the best-seller of the three, with more than 4 million units sold.
The Early Notebooks and Laptops
One particularly notable development in the 1980s was the advent of the commercially available portable computer. The first of these was the Osborne 1, released in 1981. It had a tiny 5-inch monitor and was large and heavy compared to modern laptops, weighing in at 23.5 pounds. These early portable computers were portable only in the most technical sense of the word: generally, they were anywhere from the size of a large electric typewriter to the size of a suitcase. Portable computers continued to develop, though, and eventually became as streamlined and easily portable as the notebooks we have today.
The first laptop with a flip form factor was produced in 1982, but the first portable computer actually marketed as a "laptop" was the Gavilan SC in 1983. Early models had monochrome displays, though color displays were available starting in 1984 (the Commodore SX-64). Laptops grew in popularity as they became smaller and lighter. By 1988, displays had reached VGA resolution, and by 1993 they had 256-color screens. From there, resolutions and colors progressed quickly. Other hardware features added during the 1990s and early 2000s included high-capacity hard drives and optical drives.
The Rise of Mobile Computing
Mobile computing is one of the most recent major milestones in the history of computers. Many smartphones today have higher processor speeds and more memory than desktop PCs had even ten years ago. With phones like the iPhone and the Motorola Droid, it's becoming possible to perform most of the functions once reserved for desktop PCs from anywhere.
Mobile computing really got its start in the 1980s, with the pocket PCs of that era. These were something like a cross between a calculator, a small home computer, and a PDA. They largely fell out of favor by the 1990s, when PDAs (Personal Digital Assistants) became popular. A number of manufacturers had models, including Apple and Palm. The main feature PDAs had that not all pocket PCs had was a touchscreen interface. PDAs are still manufactured and used today, though they've largely been replaced by smartphones.
Smartphones have truly revolutionized mobile computing. Most basic computing functions can now be done on a smartphone, such as email, browsing the internet, and uploading photos and videos.
Netbooks
Another recent progression in computing history is the development of netbook computers. Netbooks are smaller and more portable than standard laptops, while still being capable of performing most functions average computer users need (using the Internet, managing email, and using basic office programs). Some netbooks go as far as to have not only built-in WiFi capabilities, but also built-in mobile broadband connectivity options.
The first mass-produced netbook was the Asus Eee PC 700, released in 2007. It was originally released in Asia but arrived in the US not long afterward. Other manufacturers quickly followed suit, releasing additional models throughout 2008 and 2009.
The history of computing spans nearly two centuries at this point, much longer than most people realize. From the mechanical computers of the 1800s to the room-sized mainframes of the mid-20th century, all the way up to the notebooks and smartphones of today, computers have evolved radically throughout their history.
The past 100 years have brought technological leaps and bounds to computing, and there's no telling what the next 100 years might bring.