The first programmable computers were invented in the 1940’s. Before then, people were stuck with the abacus, adding machine, and slide rule.
During the 1950’s, 1960’s, and 1970’s, most computers used punched cards — whose history is weird. The cards were first used for weaving tapestries. Where the cards had holes, rods could move through the cards; those moving rods in turn made other rods move, which caused the threads to weave pictures. That machine was called the Jacquard loom.
Charles Babbage was a wild-eyed English mathematician who, in the 1800’s, believed he could build a fancy computing machine. He convinced the British government to give him lots of money, then bilked the government for more. Many years later — and many British pounds later — he still hadn’t finished his machine. So he dropped the idea and — can you believe this? — tried to build an even fancier machine. He didn’t finish that one either. You might say his life was a failure that was expensive for the British government.
But Charlie (as I’ll call him) is admired by all us computerniks (in spite of his face, which was even sterner than Beethoven’s), because he was the first person to realize that a computing machine must be composed of an input device (he used a card reader), a memory (which he called "The Store"), a central processing unit (which he called "The Mill"), and an output device (he used a printer).
Feminists will kill me if I don’t mention Charlie’s side-kick, Lady Lovelace. (No, she’s not related to Linda.) She was one of Charlie’s great admirers, but he never noticed her until she translated his stuff. And boy, it was impossible for him not to notice her translations. Her "footnotes" to the translation were three times as long as what she was translating!
She got very intense. She wrote to Charlie, "I am working very hard for you — like the Devil in fact (which perhaps I am)."
The two became love-birds, although he was old enough to be her father. (By the way, her father was Lord Byron, the poet. She was Lord Byron’s only "official" daughter. His other daughters were illegitimate.) Some people argue that she was actually brighter than Charlie, despite Charlie’s fame. She was better at explaining Charlie’s machines and their implications than Charlie was. Some people have dubbed her, "the world’s first programmer".
Stunning
She stunned all the men she met. She was so bright and… a woman! Here’s how the editor of The Examiner described her (note the pre-Women’s-Lib language!):
"She was thoroughly original. Her genius, for genius she possessed, was not poetic, but metaphysical and mathematical. With an understanding thoroughly masculine in solidity, grasp, and firmness, Lady Lovelace had all the delicacies of the most refined female character. Her manners, tastes, and accomplishments were feminine in the nicest sense of the word; and the superficial observer would never have divined the strength and knowledge that lay hidden under the womanly graces. Proportionate to her distaste for the frivolous and commonplace was her enjoyment of true intellectual society. Eagerly she sought the acquaintance of all who were distinguished in science, art, and literature."
Mad
Eventually, she went mad. Mattresses lined her room to prevent her from banging her head. Nevertheless, she died gruesomely, at the ripe young age of 36, the same age that her father croaked. (I guess premature death was popular in her Devilish family.)
Who’s the heroine?
I wish feminists would pick a different heroine than Lady Lovelace. She was not the most important woman in the history of computing.
Far more important were Grace Hopper and Jean Sammet. In the 1950’s Grace Hopper invented the first programming languages, and she inspired many of us programmers until her recent death. Jean Sammet headed the main committee that invented COBOL; she’s the world’s top expert on the history of programming languages, and she’s been president of the computer industry’s main professional society, the ACM.
Lady Lovelace was second-string to Babbage. Grace Hopper and Jean Sammet were second-string to nobody.
But since Hopper and Sammet led less racy lives, journalists ignore them; and since Hopper was an Admiral in the Navy (bet you didn’t know the Navy had lady Admirals!), she irked some of us doves. Nevertheless, whenever she stepped in front of an audience she got a standing ovation because all of us realize how crucial she was to the computer industry.
But I’m straying from my story.…
The U.S. Bureau of the Census takes its census every ten years. To tabulate the results of the 1880 census, the Bureau took 7 years: they didn’t finish until 1887. When they contemplated the upcoming 1890 census, they got scared; at the rate America was growing, they figured that tallying the 1890 census would take 12 years. In other words, the results of the 1890 census wouldn’t be ready until 1902. So they held a contest to see whether anyone could invent a faster way to tabulate the data.
The winner was Herman Hollerith. He was the first person to successfully use punched cards to process data.
Hermie (as I’ll call him) was modest. When people asked him how he got the idea of using punched cards, he had two answers. One was, "Trains": he had watched a train’s conductor punch the tickets. His other, more interesting answer was, "Chicken salad". After saying "Chicken salad", he’d pause for you to ask the obvious question, "Why chicken salad?" Then he’d tell his tale:
One day, a girl saw him gulping down chicken salad. She said, "Oh, you like chicken salad? Come to my house. My mother makes excellent chicken salad." So he did. And her father was a head of the Census. (And he married the girl.)
By the way, Herman Hollerith hated one thing: spelling. In elementary school, he jumped out a second-story window, to avoid a spelling test.
In some versions of FORTRAN, every string must be preceded by the letter H. The H is to honor Herman Hollerith.
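For the curious: in FORTRAN, those H-prefixed strings were called "Hollerith constants", written as a character count, then the letter H, then exactly that many characters (so HELLO became 5HHELLO). Here's a sketch in Python of decoding that notation; the decode_hollerith helper and the sample strings are just illustrations, not real FORTRAN tooling:

```python
def decode_hollerith(field):
    """Decode a FORTRAN-style Hollerith constant such as '5HHELLO':
    the digits before the first H give the character count, and the
    <count> characters after the H are the string itself."""
    h = field.index("H")                  # position of the honorary H
    count = int(field[:h])                # leading digits = length
    return field[h + 1 : h + 1 + count]   # the string proper

print(decode_hollerith("5HHELLO"))          # prints HELLO
print(decode_hollerith("12HHELLO, WORLD"))  # prints HELLO, WORLD
```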
The Census used Hollerith’s punched-card system in 1890 and again in 1900. But in 1910 the Census switched to a fancier system created by a Census Bureau employee, James Powers. Later, Powers quit his job and started his own company, which merged into Remington-Rand-Sperry-Univac. Meanwhile, Herman Hollerith’s own company merged into IBM. That’s how the first two computer companies began doing data processing.
World War II
The first programmable computers were invented in the 1940’s because of World War II. They could have been invented sooner — most of the know-how was available several decades earlier — but you can’t invent a computer unless you have big bucks for research. And the only organization that had big enough bucks was the Defense Department (which in those days was more honestly called the "War Department"). And the only event that was big enough to make the War Department spend that kind of money was World War II.
Of course, the Germans did the same thing. A German fellow, Konrad Zuse, built computers which in some ways surpassed the American ones. But since the Germans lost the war, you don’t hear much about old Konrad anymore. Fortunately, throughout World War II the German military ignored what he was doing.
During the 1940’s, most computers were invented at universities, usually funded by the War-Defense Department. Some of the most famous computers were the Mark I (at Harvard with help from IBM), the ENIAC and the EDVAC (both at the University of Pennsylvania), the Whirlwind (at the Massachusetts Institute of Technology, M.I.T.), and the Ferranti Mark I (at the University of Manchester, in England). Which of those computers deserves to be called "the first programmable computer"? The answer’s up for grabs. Each of those machines had its own peculiar hang-ups and required years of debugging before working well.
Each of those computers was, as they say in the art world, a "signed original". No two of those computers were alike.
1st generation (1951-1958)
The first computer to be mass-produced was the UNIVAC I, in 1951. It was made by the same two guys (Eckert & Mauchly) who’d built the ENIAC and EDVAC at the University of Pennsylvania. (Mauchly was an instructor there, and Eckert was the graduate student who did the dirty work.) While others at the school were helping build the EDVAC, Eckert and Mauchly left and formed their own company, which invented and started building the UNIVAC. While building the UNIVAC, the Eckert-Mauchly company merged into Remington Rand (which later merged into Sperry-Rand, which later merged into Unisys).
The UNIVAC I was so important that historians call it the beginning of the "first generation". As for computers before UNIVAC — historians disparagingly call them the "zeroth generation".
So the first generation began in 1951. It lasted through 1958. Altogether, from 1951 to 1958, 46 of those UNIVACs were sold.
46 might not sound like many. But remember: in those days, computers were very expensive, and could do very little. Another reason why only 46 were sold is that newer models came out, such as the UNIVAC 1103, the UNIVAC 80, and the UNIVAC 90. But the biggest reason why only 46 of the UNIVAC I were sold is IBM.
The rise of IBM
Although IBM didn’t begin mass-marketing computers until 1953 — two years after UNIVAC — the IBM guys were much better salesmen, and soon practically everybody was buying from IBM. During the first generation, the hottest seller was the IBM 650. IBM sold hundreds and hundreds of them.
There were many smaller manufacturers too. People summarized the whole computer industry in one phrase: IBM and the Seven Dwarfs.
Who were the dwarfs? They kept changing. Companies rapidly entered the field — and rapidly left when they realized IBM had the upper hand. By the end of the first generation, IBM was getting 70% of the sales.
Primitive input and output
During the first generation, there were no terminals. To program the UNIVAC I, you had to put the program onto magnetic tape (by using a non-computerized machine), feed that tape to the computer, and wait for the computer to vomit another magnetic tape, which you had to run through another machine to find out what the tape said.
One reason why the IBM 650 became more popular was that it could read cards instead of tapes. It really liked cards. In fact, the answers came out on cards. To transfer the answers from cards to paper, you had to run the cards through a separate non-computerized machine.
Memory
At the beginning of the first generation, there was no RAM, no ROM, and no core. Instead, the UNIVAC’s main memory was banks of liquid mercury, in which the bits were stored as ultrasonic sound waves. It worked slowly and serially, so the access time ranged from 40 to 400 microseconds per bit.
UNIVAC’s manufacturer and IBM started playing around with a different kind of memory, called the Williams tube, which was faster (10 to 50 microseconds); but since it was less reliable, it didn’t sell well.
In 1953, several manufacturers started selling computers that were much cheaper, because they used super-slow memory: it was a drum that rotated at 3600 rpm, giving an average access time of 17000 microseconds (17 milliseconds). (During the 1970’s, some computers still used drums, but for auxiliary memory, not for main memory.) The most popular first generation computer, the IBM 650, was one of those cheap drum computers.
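The drum arithmetic is easy to check: at 3600 rpm, one full revolution takes 60/3600 of a second, roughly 17,000 microseconds, which is how long a program had to wait if the data it wanted had just spun past the read head. A quick sanity check (a tiny illustrative calculation, not anything from the original machines):

```python
rpm = 3600
seconds_per_revolution = 60 / rpm                # 60 seconds in a minute
microseconds_per_revolution = seconds_per_revolution * 1_000_000

# One full revolution at 3600 rpm:
print(round(microseconds_per_revolution))        # 16667, i.e. about 17 ms
```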
Core memory
Core memory consists of tiny iron donuts strung on a grid of wires, whose electrical current magnetizes the donuts. That scheme was first conceived in 1950. The first working models were built in 1953 at M.I.T. and RCA, which argued with each other about who owned the patent. The courts decided in favor of M.I.T., so both RCA and IBM came out with core-memory computers. Core memory proved so popular that most computers used it through the 1970’s, though in the 1980’s RAM finally overshadowed it.
Languages
During the first generation, computer programming improved a lot. During the early 1950’s, all programs had to be written in machine language. In the middle 1950’s, assembly language became available. By 1958, the end of the first generation, three major high-level languages had become available: FORTRAN, ALGOL, and APT.
Fancy programs
Programmers tried to make computers play a decent game of chess. All the attempts failed. But at IBM, Arthur Samuel had some luck with checkers. He got his first checkers program working in 1952 and then continually improved it, to make it more and more sophisticated. In 1955, he rewrote it so that it learned from its own mistakes. In 1956, he demonstrated it on national TV. He kept working on it. Though it hadn’t reached championship level yet, it was starting to look impressive.
Computer music scored its first big success in 1956, on the University of Illinois’ ILLIAC computer. Hiller & Isaacson made the ILLIAC compose its own music in a style that sounded pre-Bach. In 1957, they made the program more flexible, so that it produced many styles of more modern music. The resulting mishmash composition was dubbed "The ILLIAC Suite" and put on a phonograph record.
In 1954, IBM wrote a program that translated simple sentences from Russian to English. Work on tackling harder sentences continued — with too much optimism.
2nd generation (1959-1963)
Throughout the first generation, each CPU was composed of vacuum tubes. Back in 1948, Bell Telephone had invented the transistor, and everybody realized that transistors would be better than vacuum tubes; but putting transistors into computers posed many practical problems that weren’t solved for many years.
Finally, in 1959, computer companies started delivering transistorized computers. That year marked the beginning of the second generation. Sales of vacuum-tube computers immediately stopped.
All second-generation computers used core memory.
IBM
The first company to make transistors for computers was Philco, but the most popular second-generation computer turned out to be the IBM 1401, because it was business-oriented and cheap. IBM announced it in 1959 and began shipping it to customers in 1960.
Its core memory required 11½ microseconds per character. Each character consisted of 6 bits. The number of characters in the memory could range from 1.4K up to 16K. Most people rented the 1401 for about $8,000 per month, but you could spend anywhere from $4,000 to $12,000 per month, depending on how much memory you wanted, etc.
Altogether, IBM installed 14,000 of those machines.
IBM also installed 1,000 of a faster version, called the 1410. It required only 4½ microseconds per character, had 10K to 80K, and rented for $8,000 to $18,000 per month, typically $11,000.
Altogether, IBM produced six kinds of computers.…
small business computers: the 1401, 1410, 1440, and 1460
small scientific computers: the 1620
medium-sized business computers: the 7010
medium-sized scientific computers: the 7040 and 7044
large business computers: the 7070, 7074, and 7080
large scientific computers: the 7090 and 7094
CDC
Several employees left Remington-Rand-Sperry-Univac and formed their own company, called the Control Data Corporation (CDC). During the second generation, CDC produced popular scientific computers: the 1604, the 3600, and the 3800.
Software
During the second generation, software improved tremendously.
The three major programming languages that had been invented during the first generation (FORTRAN, ALGOL, and APT) were significantly improved. Six new programming languages were invented: COBOL, RPG, LISP, SNOBOL, DYNAMO, and GPSS.
Programmers wrote advanced programs that answered questions about baseball, wrote poetry, tutored medical students, imitated three-person social interaction, controlled a mechanical hand, proved theorems in geometry, and solved indefinite integrals. The three most popular sorting methods were invented: the Shuffle Sort, the Shell Sort, and Quicksort.
Dawn of 3rd generation (1964-1967)
The third generation began with a big bang, in 1964. Here’s what happened in 1964, 1965, 1966, and 1967.…
Families
The first modern computer families were shipped. They were the CDC 6600, the IBM 360, and DEC’s families (the PDP-6, PDP-8, and PDP-10).
Of those families, the CDC 6600 ran the fastest. The IBM 360 was the most flexible and was the only one that used integrated circuits. The PDP-6 and PDP-10 were the best for timesharing. The PDP-8 was the cheapest.
Here are the dates. CDC began shipping the CDC 6600 in 1964. IBM announced the IBM 360 in 1964 but didn’t ship it until 1966. DEC began shipping the PDP-6 maxicomputer in 1964, the PDP-8 minicomputer in 1965, and the PDP-10 maxicomputer (a souped-up PDP-6) in 1967.
New languages
IBM announced it would create PL/I, a new computer language combining FORTRAN, COBOL, ALGOL, and all other popular languages. It was designed especially for IBM’s new computer, the 360. In 1966, IBM began delivering PL/I to customers.
Programmers invented the first successful languages for beginners using terminals. Those languages were BASIC, JOSS, and APL.
Dartmouth College invented the first version of BASIC in 1964, and significantly improved it in 1966 and 1967.
The RAND Corporation invented JOSS in 1964 for the JOHNNIAC computer, and put an improved version (JOSS II) on the PDP-6 in 1965. During the 1970’s, three popular variants of JOSS arose: a souped-up version (called AID), a stripped-down version (FOCAL), and a business-oriented version (MUMPS).
IBM completed the first version of APL in 1965 and put it on an IBM 7090. IBM wrote a better version of APL in 1966 and put it on an IBM 360. IBM began shipping APL to customers in 1967.
Stanford University invented the most popular language for statistics: SPSS.
Artificial intelligence
Researchers calling themselves "experts in artificial intelligence" taught the computer to chat in ordinary English. For example, Bertram Raphael made the computer learn from conversations, Daniel Bobrow made it use algebra to solve "story problems", the Systems Development Corporation made it know everything in an encyclopedia, General Electric made it answer military questions, Ross Quillian made it find underlying concepts, and Joe Weizenbaum made it act as a psychotherapist.
Also, Richard Greenblatt wrote the first decent chess program. It was good enough to play in championship tournaments against humans.
Era of boredom (1968-1974)
As you can see, the first, second, and third generations — up through 1967 — were exciting, full of action. But then, from 1968 to 1974, nothing newsworthy happened. That was the era of boredom.
During that era, progress was made, but it was gradual and predictable. Nothing dramatic happened.
Of course, nobody actually came out and said, "Life is boring." People phrased it more genteelly. For example, in September 1971 Robert Fenichel and Joe Weizenbaum wrote this introduction to Scientific American’s computer anthology:
"Partly because of the recent recession in the American economy, but more for reasons internal to the field, computer science has recently relaxed its pace. Work has not stopped, but that the current mood is one of consolidation can scarcely be doubted. Just a few years ago, computer science was moving so swiftly that even the professional journals were more archival than informative. This book could not then have been produced without great risk of misfocus. Today it’s much easier to put the articles that constitute this book — even the most recent ones — into context."
Since the first generation had lasted eight years (1951-1958), and the second generation had lasted four years (1959-1963), people were expecting the third generation to last at most four years (1964-1967) and some kind of "fourth generation" to begin about 1968. But it never happened.
The only "major" announcement around then came in 1970, when IBM announced it would produce a new line of computers, called the IBM 370, which would make the IBM 360 obsolete. But to IBM’s dismay, many computer centers decided to hang onto the old 360 instead of switching to the 370. The 370’s advantage over the 360 was small, until years later when IBM started developing 370 software that wouldn’t run on the 360.
Since the difference between the 370 and 360 was disappointingly small, not even IBM claimed that the 370 marked a fourth generation. Computer historians, desperate for something positive to say about the 370, called it the beginning of the "late third generation", as opposed to the 360, which belonged to the "early third generation".
The cruel fact is, in the entire history of computers, there was just one year all computer manufacturers acted together to produce something new. That year was 1959, when all manufacturers switched from vacuum tubes to transistors. Since 1959, we haven’t had any consistency. For example, although the third generation began with a "big bang" in 1964, each manufacturer was banging on a different drum. IBM was proclaiming how great the IBM 360 would be because it would contain integrated circuits; but other manufacturers decided to ignore integrated circuits for several years, and concentrated on improving other aspects of the computer instead. For many years after the beginning of the third generation, CDC and DEC continued to use discrete transistors (a sign of the second generation) instead of integrated circuits.
Why?
The era of boredom happened for three reasons:
1. The preceding years, 1964-1967, had been so successful that they were hard to improve on.
2. When the Vietnam War ended, the American economy had a recession, especially the computer industry, because it had depended on contracts from the Defense Department. In 1969, the recession hit bottom, and computer companies had to lay off many workers. In that year, General Electric gave up and sold its computer division to Honeywell. In 1971, RCA gave up too and sold its computer division to Remington-Rand-Sperry-Univac.
3. The world wasn’t ready yet for "the era of personal computing", which began in 1975.
Quiet changes
During the era of boredom, these changes occurred — quietly.…
In 1970, DEC began shipping the PDP-11. The PDP-8 and PDP-11 became the most popular minicomputers — far more popular than IBM’s minicomputers. So in the field of minicomputers, IBM no longer had the upper hand.
BASIC became the most popular language for the PDP-8 and PDP-11 and most other minicomputers (except IBM’s, which emphasized RPG). In high schools and business schools, most of the introductory courses used BASIC, instead of FORTRAN or COBOL.
Many businesses and high schools bought their own minicomputers, instead of renting time on neighbors’ maxicomputers. The typical high-school computer class used a PDP-8. The richest high schools bought PDP-11’s.
In universities, the social sciences started using computers — and heavily — to analyze statistics.
All new computer families used 8-bit bytes, so the length of each word was a multiple of 8 (such as 8, 16, 32, or 64). Most older computer families, invented before the era of boredom, had used 6-bit bytes, so the length of each word had been a multiple of 6: for example, the PDP-8 had a word of 12 bits; the PDP-10, UNIVAC 1100, and General Electric-Honeywell computers had a word of 36 bits; and the CDC 6600 had a word of 60 bits. The IBM 360 was the first computer to use 8-bit bytes instead of 6-bit; during the era of boredom, all manufacturers copied that feature from IBM.
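Those word lengths line up exactly as claimed: each 6-bit-byte machine's word divides evenly by 6, and the IBM 360's word divides evenly by 8. A tiny check (machine names and bit counts are the ones quoted above):

```python
# Word lengths in bits, grouped by the byte size their era used
six_bit_era = {"PDP-8": 12, "PDP-10": 36, "UNIVAC 1100": 36, "CDC 6600": 60}
eight_bit_era = {"IBM 360": 32}

# Every word should be a whole number of bytes
assert all(bits % 6 == 0 for bits in six_bit_era.values())
assert all(bits % 8 == 0 for bits in eight_bit_era.values())
print("every word length is a clean multiple of its byte size")
```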
CRT terminals (TV-like screens attached to keyboards) got cheaper and cheaper, until they were finally as cheap as hard-copy terminals (which use paper). Most computer centers replaced hard-copy terminals by CRT terminals because CRT terminals were quicker, quieter, and could do fancy editing.
Use of keypunch machines decreased because many computer centers replaced cards by CRT terminals.
Interest in new computer languages died. Most computer managers decided to stick with the old classics (FORTRAN and COBOL), because switching to a progressive language (such as PL/I) would require too much time to retrain the programmers and rewrite all the old programs.
Programmers made two last-ditch attempts to improve ALGOL. The first attempt, called ALGOL 68, was too complicated to win popular appeal. The second attempt, called PASCAL, eventually gained more support.
Maxicomputers were given virtual core — disks that pretend to be core, in case you’re trying to run a program that’s too large to fit into core.
Memory chips got cheaper, until they were finally cheaper than core. Most manufacturers replaced core by memory chips.
In 1971, Intel began shipping the first microprocessor (complete CPU on a chip). It was called the 4004 and had a word of just 4 bits. In 1972, Intel began shipping an improved version, the 8008, whose word had 8 bits. In 1973, Intel began shipping an even better version, the 8080.
In 1975, the first popular microcomputer was shipped. It was called the Altair and was built by a company called MITS. It cost just $395.
It was just a box that contained a CPU and very little RAM: just ¼ of a K!
It included no printer, no disk, no tape, no ROM, no screen, and not even a keyboard! The only way to communicate with the computer was to throw 25 switches and watch 36 blinking lights.
It didn’t understand BASIC or any other high-level computer language. To learn how to throw the switches and watch the blinking lights, you had to take a course in "machine language".
You also had to take a course in electronics — because the $395 got you just a kit that you had to assemble yourself by using a soldering iron and reading electronics diagrams. Moreover, when you finished building the kit, you noticed some of the parts were missing or defective, so that you had to contact MITS for new parts.
That computer contained several empty slots to hold PC cards. Eventually, many companies invented PC cards to put into those slots. Those PC cards, which were expensive, let you insert extra RAM and attach a printer, tape recorder, disk drives, TV, and terminal (keyboard with either a screen or paper).
Bill Gates invented a way to make the Altair handle BASIC. He called his method Microsoft BASIC. He patterned it after DEC’s BASIC; but he included extra features that exploited the Altair’s ability to be "personal", and he eliminated features that would require too much RAM.
Gary Kildall invented a disk operating system that the Altair could use. He called that operating system CP/M.
Many companies built computers that imitated the Altair. Those imitations became more popular than the Altair itself. Eventually, the Altair’s manufacturer (MITS) went out of business.
The computers that imitated the Altair were called S-100 bus computers, because they each used a Standard cable containing 100 wires.
In those days, the microcomputer industry was standardized. Each popular microcomputer used Microsoft BASIC, CP/M, and the S-100 bus. The microcomputer was just a box containing PC cards; it had no keyboard, no screen, and no disk drive. A cable went from the microcomputer to a terminal, which was priced separately. Another cable went from the microcomputer to a disk drive, which was also priced separately.
In 1977, four companies began selling microcomputers that had built-in keyboards, so you didn’t have to buy a terminal. Their computers became popular immediately. The four companies were Processor Technology, Apple, Commodore, and Radio Shack.
Processor Technology’s computer was called the Sol 20, to honor Les Solomon, an editor of Popular Electronics.
Apple’s computer was called the Apple 2, because it improved on the Apple 1, which had lacked a built-in keyboard.
Commodore’s computer was called the Pet (inspired by Pet Rocks).
Radio Shack’s computer was called the TRS-80, because it was manufactured by Tandy’s Radio Shack and contained a Z-80 CPU.
For a fully assembled computer, Processor Technology charged $1850, Apple charged $970, Commodore charged $595 (but quickly raised the price to $795), and Radio Shack charged $599 (but soon lowered the price to $499).
Notice that Commodore and Radio Shack had the lowest prices. Also, the low prices from Commodore and Radio Shack included a monitor, whereas the prices from Processor Technology and Apple didn’t. So Commodore and Radio Shack were the real "bargains".
In those days, "the lower the price, the more popular the computer". The cheapest and most popular computer was Radio Shack’s. The second cheapest and second most popular was Commodore’s Pet. The third cheapest and third most popular was the Apple 2. Processor Technology, after a brief fling of popularity, went bankrupt. The most expensive kind of microcomputer was the CP/M S-100 bus system, which was the oldest kind and therefore had accumulated the greatest quantity of business software.
In 1978 and 1979, the three main companies (Apple, Commodore, and Radio Shack) improved their computers.
The improved Apple 2 was called theApple 2-plus. The improved Commodore Pet was called the Commodore Business Machine (CBM). The improved Radio Shack TRS-80 was called the TRS-80 model 2.
After announcing the Apple 2-plus, Apple Computer Company stopped selling the plain Apple 2.
Commodore continued selling its old computer (the Pet) to customers who couldn’t afford the new version (the CBM), which cost more. Likewise, Radio Shack continued selling its model 1 to customers who couldn’t afford the model 2.
Texas Instruments & Atari
In 1979, Texas Instruments (TI) and Atari entered the microcomputer marketplace and began selling low-priced computers.
TI’s microcomputer was called the TI 99/4. Atari offered two microcomputers: the Atari 400 and the Atari 800.
TI charged $1150. Atari charged $1000 for the regular model (the Atari 800) and $550 for the stripped-down model (the Atari 400).
TI’s price included a color monitor. Atari’s prices did not include a screen; you were to attach Atari’s computers to your home’s TV.
TI’s computer was terrible, especially its keyboard. The Atari 800 computer was wonderful; reviewers were amazed at its easy-to-use keyboard, easy-to-use built-in editor, gorgeous color output on your TV, child-proofing (safe for little kids), and dazzling games, all at a wonderfully low price! It was cheaper than an Apple (whose price had by then risen to $1195) and yet was much better than an Apple.
From that description, you’d expect Atari 800 to become the world’s best-selling computer, and the TI 99/4 to become an immediate flop. Indeed, that’s what most computer experts hoped. And so did the TI 99/4’s product manager: when he saw what a mess the TI 99/4 had become, he quit TI and went to work for Atari, where he became the product manager for the Atari 400 & 800!
But even though computer experts realized that TI’s computer was junk, TI decided to market it aggressively:
TI coaxed Milton Bradley and Scott Foresman to write lots of programs for the 99/4. TI paid researchers at MIT to make the 99/4 understand LOGO (a computer language used by young children and very popular in elementary schools). TI improved the keyboard just enough so that people would stop laughing at it; the version with the new keyboard was named the 99/4A. TI paid Bill Cosby to praise the 99/4A and ran hundreds of TV ads showing Bill Cosby saying "wow". TI dramatically slashed the $1150 price to $650, then $150, and then finally to just $99.50! (To bring the price that low, TI had to exclude the color monitor from the price; instead, TI included a hookup to your home’s color TV.)
By contrast, Atari did hardly anything to market or further improve the Atari 400 & 800. Instead, Atari concentrated on its other products: the big Atari game machines (which you find in video arcades) and the Atari VCS machine (which plays video games on your home TV).
The TI 99/4A therefore became more popular than the Atari 400 & 800 — even though the TI 99/4A was inherently worse.
Sinclair, Osborne, backlash
In 1980 and 1981, two important companies entered the microcomputer marketplace: Timex Sinclair (1980) and Osborne (1981).
The first complete computer selling for less than $200 was invented by a British chap named Clive Sinclair and manufactured by Timex. The original version was called the ZX-80 (because it was invented in 1980, contained a Z-80 CPU, and was claimed to be "Xellent"); it sold for $199.95. In 1981, Clive Sinclair invented an improved version, called the ZX-81. Later, he and Timex invented further improvements, called the ZX Spectrum and the Timex Sinclair 1000. When TI dropped the price of the TI 99/4A to $99.50, Timex retaliated by dropping the list price of the Timex Sinclair 1000 to $49.95, so that the Timex Sinclair 1000 remained the cheapest complete computer.
In April 1981, Adam Osborne founded the Osborne Computer Corp. and began selling the Osborne 1 computer, designed by Lee Felsenstein (the inventor of Processor Technology’s Sol 20 computer). The Osborne 1 computer included practically everything a business executive needed: its $1795 price included a keyboard, a monitor, a Z-80A CPU, a 64K RAM, two disk drives, CP/M, Microsoft BASIC, a second version of BASIC, the Wordstar word processor, and the Supercalc spreadsheet program. Moreover, it was the world’s first portable business computer: the entire computer system (including even the monitor and disk drives) collapsed into an easy-to-carry attaché case. (Many years later, Compaq copied Osborne’s idea.)
While Timex Sinclair and Osborne were entering the marketplace, Radio Shack, Apple, and Commodore were introducing new computers of their own:
In 1980, Radio Shack began selling three new computers. The TRS-80 model 3 replaced Radio Shack’s cheapest computer (the model 1) and was almost as good as Radio Shack’s fanciest computer (the model 2). The TRS-80 Color Computer drew pictures in color and cost less than the model 3. The TRS-80 Pocket Computer fit into your pocket, looked like a pocket calculator, and was built for Radio Shack by Sharp Electronics in Japan.
In 1980, Apple began selling the Apple 3. It was overpriced; and to make matters worse, the first Apple 3’s that rolled off the assembly line were defective. Apple eventually lowered the price and fixed the defects; but since the Apple 3 had gotten off to such a bad start, computer consultants didn’t trust it and told everybody to avoid it.
In 1981, Commodore began selling the Vic-20, which drew pictures in color and cost less than Radio Shack’s Color Computer. In fact, the Vic-20 was the first computer that drew pictures in color for less than $300.
The Vic-20 originally sold for $299.95. When TI lowered the price of the TI 99/4A to $99.50, Commodore lowered the price of the Vic-20. At discount department stores (such as K Mart, Toys R Us, and Child World), you could buy the Vic-20 for just $85: it was still the cheapest computer that could handle color. (The Timex Sinclair 1000 was cheaper but handled only black-and-white.)
Moreover, the Vic-20 had standard Microsoft BASIC, whereas the Timex Sinclair 1000 and TI 99/4A did not; so the Vic-20 was the cheapest computer that had standard Microsoft BASIC. It was the cheapest computer that was pleasant to program.
Also, the Vic-20 had a nice keyboard, whereas the keyboards on the Timex Sinclair 1000 and TI 99/4A were pathetic.
The Vic-20 became immediately popular.
On August 12, 1981, IBM announced a new microcomputer, called the IBM Personal Computer (or IBM PC).
Although IBM had previously invented other microcomputers (the IBM 5100 and the IBM System 23 Datamaster), they’d been overpriced and nobody took them seriously — not even IBM. The IBM Personal Computer was IBM’s first serious attempt to sell a microcomputer.
The IBM Personal Computer was a smashing success, because of its amazingly high quality and amazingly low price. It became the standard against which the rest of the microcomputer industry was judged.
Every 8 years, the country’s mood about computers has changed. After 8 years of dramatic revolution, we switched to 8 years of subtle evolution, then back again.
The pivotal years were 1943 (beginning the first revolution), 1951 (beginning the first period of evolution), 1959 (revolution), 1967 (evolution), 1975 (revolution), 1983 (evolution), and 1991 (revolution). Here are the details.…
Revolution: From 1943 to 1950, researchers at universities were building the first true computers, which were big monsters. Each was custom-built; no two were alike.
Evolution: In 1951, Sperry began selling the first mass-produced computer: the UNIVAC I. Sperry built 46 of them. During the 8-year era from 1951 to 1958, computers gradually became smaller and cheaper and acquired more software. That evolutionary era was called the first generation.
Revolution: The next computer revolution began in 1959, when IBM began selling the IBM 1401, the first IBM computer to use transistors instead of vacuum tubes. During that eight-year revolution from 1959 to 1966, computerists polished FORTRAN and ALGOL (whose development had begun earlier), invented 9 other major computer languages (COBOL, BASIC, PL/I, LISP, SNOBOL, APL, DYNAMO, GPSS, and RPG), and began developing FORTH and SPSS. They created many amazing programs for artificial intelligence, such as Weizenbaum’s Eliza program, which made the computer imitate a therapist. During that same eight-year period, IBM invented the IBM 360: it was the first popular computer that used integrated circuits, and all of IBM’s modern mainframes are based on it.
Evolution: The years from 1967 to 1974 showed a gradual evolution. Computer prices continued to drop and quality continued to improve. DEC began selling PDP-10 and PDP-11 computers, which became the favorite computers among researchers in universities.
Revolution: In 1975, MITS shipped the first popular microcomputer, the Altair, which launched the personal computer revolution. Soon Apple, Commodore, Tandy, and IBM began selling microcomputers also. Programmers developed lots of useful, fun software for them. The revolution climaxed at the end of 1982, when many Americans bought microcomputers as Christmas presents.
Evolution: In January 1983, the cover of Time magazine declared that the 1982 "man of the year" was the personal computer. But consumers quickly tired of the personal-computer fad, chucked their Commodore Vic and Timex Sinclair computers into the closet, and shifted attention to less intellectual pursuits. Many computer companies went bankrupt. In 1983, Lotus announced 1-2-3, but that was the computer industry’s last major successful new product. After that, prices continued to fall and quality gradually increased, but no dramatic breakthroughs occurred. The computer industry became boring. During that time, if you asked "What fantastically great thing happened in the computer industry during the past year?" the answer was: "Not much".
Revolution: In 1991, the computer industry became exciting again. Here’s why:
Part of that excitement came from revolutionary influences of the previous two years: in 1989 & 1990 the Berlin Wall fell, the Cold War ended, a new decade began, Microsoft finally invented a version of Windows that worked well (version 3.0), and Apple invented a color Mac that was affordable (the LC). In 1991, Microsoft put the finishing touches on Windows (version 3.1) and DOS (version 5).
In 1991 and 1992, a series of price wars made the cost of computers drop 45% per year instead of the customary 30%. Those lower prices made people spend more money on computers, because the ridiculously low prices for fancy stuff encouraged people to buy fancier computers: 486 instead of 286, Super VGA instead of plain VGA, 8M RAM instead of 1M, 200M hard drives instead of 40M.
The sudden popularity of Windows whetted the public’s hunger for those muscle machines, since Windows requires lots of muscle to run well. Those ever-more-muscular machines, in turn, made Windows run well enough to become desirable. All big software companies hastily converted their DOS and Mac software to Windows.
The challenge of doing that conversion forced them to rethink the twin questions of software wisdom: "What makes software easy to use?" and "What kinds of software power do users want?" Many creative answers to those questions were invented.
During the 1992 Christmas season, fast CD-ROM drives finally became cheap enough to create a mass market: many Americans bought them, and CD-ROMs became the new standard way to distribute encyclopedias, directories, other major reference works, and software libraries (full of fonts and shareware). The attention given to CD-ROMs made customers think about the importance of sound, and many customers bought sound cards such as the Sound Blaster.
I’d tell you more about this computer revolution, but I’m stuck in the middle of it and must get back to my battle station.
When the revolution ends, historians will try to summarize it. They’ll sit back in their easy chairs, smoke their pipe dreams, wax eloquent about their war stories, and gigglishly play Monday-morning quarterback, which is much funnier than calling the shots while the game’s in progress.
The 8-year computer cycle coincides with the American cycle of switching political parties. After years of Roosevelt & Truman, the presidential election of 1952 ushered in eight years of a Republican (Eisenhower); 1960 brought eight years of Democrats (Kennedy & Johnson); 1968, eight years of Republicans (Nixon & Ford).
1976 began another 16-year experience of "Democrat followed by Republicans"; but alas, the Democrat (Carter) got just 4 of those years, and the Republicans (Reagan and Bush) got the remaining 12. (Carter got just 4 of those years instead of 8 because he lost face in the middle of the Iran hostage crisis, oil crisis, and recession.)
1992 began another experience of "Democrat followed by Republicans". The Democrat was Clinton.
I wonder who’ll come after Clinton. If you’re reading this book after Clinton has gone, please enter your time machine, go through a time warp, come back to my time, and tell me who Clinton’s successor will be. I’m dying to know.
When Americans love liberals and revolution, they vote for Democrats; when Americans prefer conservative evolution, they vote for Republicans. As historian Krigsman remarked, "An excitable mood in the country causes a computer revolution, and the next year the Democrats grab power."