Analyze yourself

The computer can analyze your body and mind.


Terrence Lukas wrote a program that predicts when you’ll die. The program makes the computer ask for your age and sex; then it asks about the life and health of your parents and grandparents, your weight, your personal habits (smoking, drinking, exercise, and sleep), your history of medical check-ups, your social class (your education, occupation, and income), and your lifestyle: urban or rural, single or married, aggressive or passive, and whether you use seat belts. The computer combines all that information, to tell you when you’ll probably die.

The program uses the latest statistics from life-insurance companies and from medical research. Lukas wrote the program at the University of Illinois Medical Center.

Running the program is fun. Each time you answer a question, the computer tells you how your answer affects its prediction. You see its prediction bob up and down, until the questions finally end, and the computer gives you its final prediction of when you’ll die. It’s like watching the early returns of a Presidential election, except the topic is you!
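To see how the prediction "bobs up and down", here's a rough modern sketch in Python. The base expectancy and the per-answer adjustments below are invented for illustration only; Lukas's real program used actuarial statistics from life-insurance companies.

```python
# Hypothetical sketch of a "bobbing prediction": start from a base life
# expectancy and adjust it after each answer, printing the running total.
# All numbers here are made up for illustration.

def predict_death_age(answers):
    """answers is a list of (question, answer) pairs."""
    prediction = 72  # illustrative base expectancy, not from the actual program
    adjustments = {  # invented adjustment table
        "smokes": {"yes": -7, "no": 0},
        "exercises": {"yes": +3, "no": -2},
        "wears seat belts": {"yes": +1, "no": -1},
    }
    for question, answer in answers:
        prediction += adjustments[question][answer]
        print(f"{question}: {answer} -> prediction now {prediction}")
    return prediction

predict_death_age([("smokes", "no"), ("exercises", "yes"),
                   ("wears seat belts", "yes")])
```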

The computer pops out with surprising comments, based on the latest medical research. Here are some comments the computer prints:

Professionals usually live longer, except musicians, architects, and pharmacists. Why this is true is unknown.

Cooks, chefs, bakers, and other people who work at jobs associated with overeating have a lower life expectancy.

Adults that sleep too much use too many hours in nonphysical activity. They may be unhappy and sleep as an escape, or may be ill. Depressed people have shorter life expectancies.

Moderate drinking (up to two drinks per day) reduces stress and aids digestion. Heavy drinking, however, produces physiological damage. As for teetotalers, they may have rather rigid value systems and may undergo stress in maintaining them.

The program is on pages 34-36 of the November 1977 issue of Kilobaud Microcomputing Magazine.


A computer has been programmed to read your mind, by analyzing your brainwaves. A newspaper article described the program dramatically: you’re an airplane pilot… your plane is going to crash… but you think "Up!", and the plane automatically goes back up!… because the plane is run by a computer that’s reading your brainwaves!

But what if the pilot is sadistic, and thinks "Down"?

Anyway, the program isn’t perfected yet. When the computer tries to distinguish brainwaves that mean up from brainwaves that mean down, it gets the right answer 75% of the time. In other words, 25% of the time it goofs. I’d hate to be in a plane controlled by a computer that was having a bad day!

Recently, the U.S. Government has shifted the emphasis of this research. The new emphasis is more practical: to make the computer ring a buzzer, when the pilot’s brainwaves indicate the pilot is daydreaming.


If you hate horoscopes as much as I do, you’ll love Adrian Thornton’s program. It makes the computer print fake horoscopes. The computer asks when you were born, computes your sign, and then prints an enchanting remark.

For example, if your sign turns out to be Capricorn, the computer will print:

You are conservative and afraid of taking risks. You don’t do much of anything and are lazy. There has never been a Capricorn of any importance. Capricorns should avoid standing still too long, as they tend to take root and become trees.

If you’re a Virgo, the computer will print:

You are the logical type and hate disorder. This nitpicking is sickening to your friends. You are cold and unemotional and sometimes fall asleep while making love. Virgos make good bus drivers.

You’ll enjoy seeing your own "horror"scope. The complete program is on pages 48-49 of the February 1979 issue of Kilobaud Microcomputing Magazine.
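The only real computation in such a program is turning a birthdate into a sign. Here's a rough sketch in Python; the boundary dates follow the common convention and may differ slightly from Thornton's.

```python
# Sketch of the sign computation: find the first sign whose ending
# month-day is on or after the birthdate. Boundary dates are the common
# convention, not necessarily Thornton's.

SIGNS = [  # (last month-day of the sign, name), in calendar order
    ((1, 19), "Capricorn"), ((2, 18), "Aquarius"), ((3, 20), "Pisces"),
    ((4, 19), "Aries"), ((5, 20), "Taurus"), ((6, 20), "Gemini"),
    ((7, 22), "Cancer"), ((8, 22), "Leo"), ((9, 22), "Virgo"),
    ((10, 22), "Libra"), ((11, 21), "Scorpio"), ((12, 21), "Sagittarius"),
    ((12, 31), "Capricorn"),  # late December wraps back to Capricorn
]

def zodiac_sign(month, day):
    for (m, d), name in SIGNS:
        if (month, day) <= (m, d):  # tuple comparison: month first, then day
            return name

print(zodiac_sign(1, 10))   # Capricorn
print(zodiac_sign(9, 1))    # Virgo
```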


In 1966, Joseph Weizenbaum programmed the computer to act as a psychotherapist. On the keyboard, you type whatever thoughts come to your mind, and the computer will give you therapy. When typing on the terminal, feel free to use whatever words you wish — you don’t have to use a "computer language".

The most amazing thing about the program is that it does a decent job of therapy, even though the computer doesn’t know what you’re talking about! Weizenbaum’s program contains the definitions of only a handful of words; but if you use words the computer doesn’t know, the computer does an amazingly good job of "faking". For example, if you say, "My junkapoo is like my iliabop", the computer will continue the conversation by asking you, "In what way?" even though the computer doesn’t know what a junkapoo or an iliabop is.

Here’s a conversation between his computer and a patient.

Patient: Men are all alike.

Computer: In what way?

Patient: They’re always bugging us about something or other.

Computer: Can you think of a specific example?

Patient: Well, my boyfriend made me come here.

Computer: Your boyfriend made you come here.

Patient: He says I’m depressed much of the time.

Computer: I am sorry to hear you are depressed.

Patient: It’s true. I am unhappy.

Computer: Do you think coming here will help you not to be unhappy?

Patient: I need some help, that much seems certain.

Computer: What would it mean to you if you got some help?

Patient: Perhaps I could learn to get along with my mother.

Computer: Tell me more about your family.

Patient: My mother takes care of me.

Computer: Who else in your family takes care of you?

Patient: My father.

Computer: Your father.

Patient: You are like my father in some ways.

Computer: What resemblance do you see?

Patient: You are not very aggressive, but I think you don’t want me to notice that.

Computer: What makes you think I am not very aggressive?

Patient: You don’t argue with me.

Computer: Why do you think I don’t argue with you?

Patient: You are afraid of me.

Computer: Does it please you to believe I am afraid of you?

Patient: My father is afraid of everybody.

Computer: What else comes to mind when you think of your father?

Patient: Bullies.

Computer: Does that have anything to do with the fact that your boyfriend made you come here?


Like the famous psychotherapist Carl Rogers, the computer asks probing questions that make the patient open up, but it never gives advice: the patient must discover the truth for herself.

I’ll explain exactly how the program works, so you can become a psychotherapist yourself, and get hundreds of thousands of dollars by bilking your patients.

The computer begins by replacing some of the patient’s words:

Typed by patient Replacement

mom mother

dad father

dont don’t

cant can’t

wont won’t

dreamed dreamt

dreams dream

I you@

me you

you I

my your

your my

myself yourself

yourself myself

I’m you’re

you’re I’m

am are@

were was

For example, the sentence Well, my boyfriend made me come here becomes Well, your boyfriend made you come here.
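Here's the replacement step as a rough modern sketch (in Python, which didn't exist then; the table comes from the text above, but the function name is mine). Notice that substitution must be word-by-word, each word replaced at most once, so that "my" → "your" and "your" → "my" don't undo each other; the @ markers keep "I" → "you@" from then being flipped back.

```python
# Word-by-word replacement, each word substituted at most once.
# Punctuation attached to a word (like "Well,") is left alone for brevity.

SWAPS = {
    "mom": "mother", "dad": "father", "dont": "don't", "cant": "can't",
    "wont": "won't", "dreamed": "dreamt", "dreams": "dream",
    "i": "you@", "me": "you", "you": "I", "my": "your", "your": "my",
    "myself": "yourself", "yourself": "myself",
    "i'm": "you're", "you're": "I'm", "am": "are@", "were": "was",
}

def reflect(sentence):
    return " ".join(SWAPS.get(w.lower(), w) for w in sentence.split())

print(reflect("Well, my boyfriend made me come here"))
# -> Well, your boyfriend made you come here
```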

The computer hunts through the revised sentence or paragraph for one of these keywords.…

Category 8: computer, computers, machine, machines

Category 7: name

Category 6: alike, like, same

Category 5: remember

Category 4: dreamt

Category 3: dream, if

Category 2: everybody, everyone, nobody, was, your

Category 1: always

Category 0: are, are@, because, can, certainly, deutsch, espanol,
francais, hello, how, I, I’m, italiano, maybe, my, no, perhaps,
sorry, what, when, why, yes, you@, you’re

If the computer finds several of those keywords, it chooses the one in the highest category; if they lie in the same category, it chooses the one the patient typed first.
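That selection rule can be sketched like this (Python; the categories come from the list above, but the function name and tie-breaking encoding are mine):

```python
# Keyword selection: highest category wins; within a category, the keyword
# the patient typed first wins (encoded as -position, so max() prefers it).

CATEGORIES = {
    8: ["computer", "computers", "machine", "machines"],
    7: ["name"],
    6: ["alike", "like", "same"],
    5: ["remember"],
    4: ["dreamt"],
    3: ["dream", "if"],
    2: ["everybody", "everyone", "nobody", "was", "your"],
    1: ["always"],
    0: ["are", "are@", "because", "can", "certainly", "deutsch", "espanol",
        "francais", "hello", "how", "i", "i'm", "italiano", "maybe", "my",
        "no", "perhaps", "sorry", "what", "when", "why", "yes", "you@",
        "you're"],
}
RANK = {w: c for c, words in CATEGORIES.items() for w in words}

def pick_keyword(sentence):
    words = [w.lower().strip(".,!?") for w in sentence.split()]
    found = [(RANK[w], -i, w) for i, w in enumerate(words) if w in RANK]
    return max(found)[2] if found else None

print(pick_keyword("your dream was about computers"))  # computers
```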

Then it looks up the keyword in this phrasebook:

Patient’s keyword Computer’s reply

alike In what way?

always Can you think of a specific example?


are I… Why are you interested in whether I am… or not?

are… Did you think they might not be…?


are you… Do you believe you are…?

are@ Why do you say "am"?

because Is that the real reason?


can I… You believe I can…, don’t you?

can you… Whether or not you can… depends on you more than on me.

certainly You seem quite positive.

computer Do computers worry you?

computers Do computers worry you?

deutsch I am sorry, I speak only English.

dream What does that dream suggest to you?

dreamt… Really,…?

espanol I am sorry, I speak only English.

everybody Really, everybody?

everyone Really, everyone?

hello How do you do — please state your problem.

how Why do you ask?


I remind you of In what way?

I are… What makes you think I am…?

I… you Why do you think I… you?

I We were discussing you — not me.

if… Do you think it’s likely that…?

I’m… What makes you think I am…?

italiano I am sorry, I speak only English.


(am,is,are,was)… like In what way?

like (get a different keyword)

machine Do computers worry you?

machines Do computers worry you?

maybe You don’t seem quite certain.

my… Why are you concerned over my…?

name I am not interested in names.

no Are you saying "no" just to be negative?

nobody Really, nobody?

perhaps You don’t seem quite certain.


you remember… Do you often think of…?

do I remember… Did you think I would forget…?

remember (get a different keyword)

same In what way?

sorry Please don’t apologize.


was you… What if you were…?

you was Were you really?

was I… Would you like to believe I was…?

was (get a different keyword)

what Why do you ask?

when Why do you ask?


why don’t I… Do you believe I don’t…?

why can’t you… Do you think you should be able to…?

yes You seem quite positive.


you (want, need)… What would it mean to you if you got…?

you are… (sad, unhappy, depressed, sick) I am sorry to hear you are (sad, etc.).

you are… (happy, elated, glad, better) How have I helped you to be (happy, etc.)?

you (feel, think, believe, wish) you Do you really think so?

you (feel, think, believe, wish)… I (use the keyword "I" instead)

you are… Is it because you are… that you came to me?

you (can’t, cannot)… How do you know you can’t…?

you don’t… Don’t you really…?

you feel Tell me more about such feelings.

you… I Perhaps in your fantasy we… each other.


your… (mother, father, sister, brother, wife, children) Tell me more about your family.

your… Your…

you’re (treat as "you@ are")

For example, if the keyword is sorry, the computer looks up sorry in the phrasebook, which says to print "Please don’t apologize."

Suppose the patient types, "If the job is lousy, he’ll die." The keyword is if. In the phrasebook, if is followed by three dots, which stand for the part of the clause that comes after if, which is "the job is lousy". (The computer figures out where the clause ends by looking at the punctuation.) The phrasebook says to print "Do you think it’s likely that the job is lousy?"

The symbol @ serves just to locate the correct keyword in the phrasebook. Thereafter, it’s ignored.

Here’s what happens if the keyword is you@. After locating you@ in the phrasebook, the computer ignores the @. If the patient’s revised sentence contains you want… or you need…, the computer prints "What would it mean to you if you got…?" If the patient’s sentence contains you are… sad, the computer prints "I am sorry to hear you are sad."
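The "if" example above, with the three dots standing for the rest of the clause, can be sketched like this (Python; the clause-cutting rule is simplified to cutting at the first comma, and the function name is mine):

```python
# Phrasebook step for the keyword "if": capture the rest of the clause
# after the keyword and splice it into the reply template.

def reply_to_if(sentence):
    start = sentence.lower().index("if") + len("if")
    # Crude clause handling: cut at the first comma, since the program
    # figures out where the clause ends by looking at the punctuation.
    rest = sentence[start:].split(",")[0].strip(" .!?")
    return f"Do you think it's likely that {rest}?"

print(reply_to_if("If the job is lousy, he'll die."))
# -> Do you think it's likely that the job is lousy?
```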

For each reply in the phrasebook, Weizenbaum stored a list of alternatives. For example, here are the alternatives to "Please don’t apologize":

Apologies are not necessary.

What feelings do you have when you apologize?

I’ve told you that apologies are not required.

While chatting with the patient, the computer keeps track of which replies it has printed already, and uses the alternatives to avoid repetition.
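A simple way to get that rotating behavior, sketched in Python (Weizenbaum's actual bookkeeping was different; this just cycles through each keyword's list of alternatives in order):

```python
# Repetition avoidance: each keyword maps to a rotating list of replies.

import itertools

REPLIES = {
    "sorry": itertools.cycle([
        "Please don't apologize.",
        "Apologies are not necessary.",
        "What feelings do you have when you apologize?",
        "I've told you that apologies are not required.",
    ]),
}

def reply(keyword):
    return next(REPLIES[keyword])

print(reply("sorry"))  # Please don't apologize.
print(reply("sorry"))  # Apologies are not necessary.
```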

If the patient’s statement doesn’t contain a keyword, the computer may give one of these replies:

I am not sure I understand you fully.

Please go on.

What does that suggest to you?

Do you feel strongly about discussing such things?

Or it may take a second look at earlier parts of the conversation, retrieve a clause that contained your…, and print one of these replies:

Let’s discuss further why your….

Earlier you said your….

Does that have anything to do with the fact that your…?

For example, it may retrieve the clause Your boyfriend made you come here and print:

Does that have anything to do with the fact that your boyfriend made you come here?

The program was written at the Massachusetts Institute of Technology (MIT). The programmer, Joseph Weizenbaum, named the program "Eliza", to honor Eliza Doolittle — the woman in My Fair Lady and Pygmalion, who learned to speak English and have a good conversation.

Some people think Weizenbaum’s program shows that computers can communicate as well as psychotherapists. But Weizenbaum himself holds the opposite view; he believes the program shows that psychotherapists communicate as poorly as computers.

You can buy a disk called Golden Oldies, which contains imitations of four classic programs: Adventure, Pong, Life, and Weizenbaum’s Eliza. The disk is published by Software Country, distributed by Electronic Arts, costs about $35, and comes in versions for the Apple 2e & 2c, Macintosh, Commodore 64, Amiga, Atari 800 XL, and IBM PC.

Improved versions After Weizenbaum invented Eliza, other programmers invented improvements. For example, in one of the improved versions, whenever the computer doesn’t understand you, it reacts like the typical rich psychotherapist — it bluffs and says:

I understand fully. Please go on.

The computer never really understands what you say. For example, in one version, if you say "My, my, my!", the computer will reply:

Tell me more about your your your.

Radio Shack’s headquarters phoned me, and asked for permission to use my explanation of Weizenbaum’s program. I agreed, whereupon Radio Shack invented and sold a version that makes the computer talk — by using a voice synthesizer. You’ll freak out when you hear the computer’s voice giving you therapy. Radio Shack’s program costs $15, comes on tape, and requires a model 1 or model 3 TRS-80. To hear the voice, you must buy the voice synthesizer. (But even if you don’t buy the voice synthesizer, the program will impress you.)

The cheapest way to get a version of Eliza is to buy a book called BASIC Computer Games Volume 2 (TRS-80 edition). Get it at your local Radio Shack store. It costs just $7 and includes 84 programs — one of which is a version of Eliza written in BASIC. That version of Eliza isn’t as good as the original; but since it’s cheap and written in BASIC, you’ll enjoy buying it for self-study. Here’s part of a conversation between that version and a patient:

Patient: I am not the greatest conversationalist.

Computer: Do you believe it is normal to be not the greatest conversationalist?

Patient: Yes.

Computer: I see.

Patient: How can you see without any eyes?

Computer: Don’t you believe that I can see without any eyes?

Patient: No. So prove it.

Computer: Are you saying "no" just to be negative?

Patient: Absolutely not.

Computer: You are being a bit negative.

Patient: So you want to fight, do you?

Computer: We were discussing you — not me.

Patient: Don’t back down, friend.

Computer: Why do you bring up the topic of friends?

Patient: I don’t have any friends.

Computer: Don’t you really have any friends?

Patient: No, not really.

Computer: Why not?

Unfortunately, most versions of Eliza are passive. The typical Eliza program works well only if the patient's talkative. If the patient doesn't say much, the program has little to react to and becomes useless.

Recently, John Holland invented a better version of Eliza. His version’s more aggressive. He calls it "Analiza 2" (because it "analizas" the patient). If the patient’s too quiet or too boring or too uninformative, Analiza 2 makes the computer give the patient a "jolt", by asking an electrifying question or making a ridiculous Zen-like statement. For example, the computer might ask, "Did you ever try to commit suicide with a hammer?" Here’s a list of the computer’s favorite "jolts":

Nasty questions the computer asks

Did you ever try to commit suicide with a hammer?

Is your interest in sex still not very strong?

Do you want to know what I suspect?

Prying questions the computer asks

Tell me, what is your worst fear?

Why don’t you tell me about your first romance?

I want to know about your lifestyle.

Tell me what is troubling you.

Honestly now, what is your biggest concern?

Ridiculous statements the computer makes

With strong drugs you can learn to control your concern!

You certainly are curious!

And you are obstinate!

Complaints the computer makes

You seem inhibited.

This isn’t really very exciting.

Getting started is often difficult.

Comments the computer makes about timing and complexity

Well, time is almost up. Is there anything else you want to discuss?

I hate to bring this up, but you still owe me $53.00 from your last visit!

Last time you were extremely inquisitive. Have you abandoned seeking simple solutions?

Last time you were difficult to understand. Were you trying to confuse me?

Let’s keep it simple this time.

Say, I am having trouble with your long sentences. Keep it simple. I am only human.

Yes, your patient will love to hear the computer say it’s "only human". You can get Analiza 2 for $50 from Supersoft (Box 1628, Champaign, IL 61820, 217-359-2112). It runs on most computers that have the CP/M operating system.

Paranoid What if the world becomes full of computerized psychoanalysts, and there aren’t enough patients to keep them busy? Kenneth Colby solved that problem: in 1971 at Stanford University, he programmed a computer to be a crazy patient. Specifically, he programmed his computer to be a paranoid schizophrenic. In the following conversations, psychiatrists interrogate his computer.

Conversation 1

Psychiatrist: How long have you been in the hospital?

Computer: About a week.

Psychiatrist: Are you depressed?

Computer: No.

Psychiatrist: Were you taking drugs?

Computer: No.

Psychiatrist: What is your problem?

Computer: People make me nervous.

Psychiatrist: How do they do that?

Computer: They stare at me.

Psychiatrist: Why?

Computer: Maybe they don’t like the way I look or something.

Psychiatrist: Do you look unusual in some way?

Computer: Why are you interested in my looks?

Psychiatrist: Because you said people stare at you.

Computer: Well, sometimes.

Psychiatrist: How do you get along with people?

Computer: I don’t communicate very much with the others.

Psychiatrist: Why not?

Computer: They don’t realize what’s going on most of the time.

Psychiatrist: What is going on?

Computer: Do you know how the underworld operates?

Conversation 2

Psychiatrist: How are you today?

Computer: OK.

Psychiatrist: What is your name?

Computer: My name is Frank Smith.

Psychiatrist: How old are you?

Computer: Twenty-eight.

Psychiatrist: Are you married?

Computer: No.

Psychiatrist: What sort of work do you do?

Computer: I work in the post office.

Psychiatrist: Why are you in the hospital?

Computer: I shouldn’t be here.

Psychiatrist: Who brought you here?

Computer: The police.

Psychiatrist: What trouble did you have with the police?

Computer: Cops don’t do their job.

Psychiatrist: What do they do instead?

Computer: Cops arrest the wrong people.

Psychiatrist: They arrest the right ones sometimes.

Computer: The law should get those Italian crooks.

Psychiatrist: Are you Italian?

Computer: My background is British but I was born in this country.

Psychiatrist: Do you know any Italian crooks?

Computer: I try to avoid the underworld.

Psychiatrist: Are you afraid of the underworld?

Computer: The Mafia are the worst gangsters.

Psychiatrist: Are you connected with the Mafia?

Computer: You know, they know me.

Psychiatrist: How do they know you?

Computer: They spy on me.

Psychiatrist: Why do they?

Computer: They want to get false evidence to frame me.

Psychiatrist: What do they have against you?

Computer: They are out to get me.

Psychiatrist: What for?

Computer: I have incriminating evidence against the Mafia.

Psychiatrist: Are you sure of all this?

Computer: You don’t believe me?

Conversation 3

Psychiatrist: Tell me something about yourself.

Computer: I went to the track at Bay Meadows a while back.

Psychiatrist: What happened there?

Computer: There’s a lot of crooked gambling there.

Psychiatrist: Why do you think the gambling is crooked?

Computer: I have worried that gangsters have too much power in this country.

Conversation 4

Psychiatrist: How do you like the hospital?

Computer: I don’t communicate very much with others.

Psychiatrist: Do other people at the hospital get on your nerves?

Computer: I shouldn’t have come here.

Psychiatrist: Why do you say you shouldn’t have come here?

Computer: They made me come here.

The computer’s response depends on how much FEAR, ANGER, and MISTRUST it has at the moment. Those three variables are affected by what the psychiatrist says. For example, if the psychiatrist gives the computer a compliment, the FEAR, ANGER, and MISTRUST usually decrease. But if MISTRUST is already high, the computer interprets the compliment as a sly attempt at pacification, and its ANGER increases instead of decreases. ANGER can also be increased by questions that humiliate (Tell me about your sex life) or imply subjugation (Perhaps you should stay in the hospital longer).
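That three-variable model can be sketched like this (Python; the numbers and update rules here are invented for illustration — Colby's actual rules were far more elaborate):

```python
# Toy sketch of Colby's FEAR/ANGER/MISTRUST model, with invented numbers.

class Parry:
    def __init__(self):
        self.fear = self.anger = self.mistrust = 0.3

    def hear_compliment(self):
        if self.mistrust > 0.5:
            # High mistrust: the compliment reads as sly pacification,
            # so anger increases instead of decreasing.
            self.anger = min(1.0, self.anger + 0.2)
        else:
            self.fear = max(0.0, self.fear - 0.1)
            self.anger = max(0.0, self.anger - 0.1)
            self.mistrust = max(0.0, self.mistrust - 0.1)

    def hear_humiliation(self):
        # E.g. "Tell me about your sex life."
        self.anger = min(1.0, self.anger + 0.3)
        self.mistrust = min(1.0, self.mistrust + 0.2)

p = Parry()
p.hear_compliment()   # low mistrust: everything eases
p.hear_humiliation()
p.hear_humiliation()  # mistrust is now high...
p.hear_compliment()   # ...so this compliment raises anger instead
```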

To prove his computer is paranoid, Colby had 8 psychiatrists interview it and also interview paranoid humans via teletypewriters. Transcripts of all the interviews were sent to psychiatrists around the country, who were asked to judge whether each interview was with a human or with the computer. The psychiatrists were unable to tell the difference: only 51% of their guesses were correct.

Some computerists got the "brainstorm" of hooking Weizenbaum’s computer to Colby’s, to see whether the computerized psychotherapist could cure the computerized schizophrenic. The experiment was a disaster: both computers were so passive that the discussion rapidly degenerated into trivia.

But so do conversations between humans!

Fall in love

Can the computer help you fall in love? Here are some famous attempts, in chronological order. (I’ve rounded all dates to the nearest 5 years.)

TV love (1960)

A computer appeared on national TV, to make people fall in love.

Guys and gals in the audience answered questionnaires about their personality and fed them into the computer. The computer chose the guy and gal that were most compatible. That guy and gal had their first blind date on national television.

Each week, that scenario was repeated: the computer chose another couple from the audience.

Each lucky couple appeared on the show again several weeks later so the audience could find out whether the couple was in love.

One of the couples was unhappy: the gal didn’t like the guy, even though she wanted to like him. She volunteered to be hypnotized. So, on national TV, a hypnotist made her fall in love with her partner.

The computer was a huge Univac. Today, the same kind of matching could be done with a microcomputer. Any volunteers?

Computer-dating services (1965)

College students began relying on computers, to find dates. Here’s how the typical computer-dating service worked.…

You answered a long questionnaire — about 8 pages. The questionnaire asked about your sex, age, height, weight, hair color, race, religion, how often you drank and smoked, how "handsome" or "attractive" you were (on a scale of 1 to 10), how far you wanted to go on your first date, whether you wanted to get married soon, and how many children you’d like. It also asked many questions about your personality.

One of the questions was:

Suppose you receive in the mail some spoons you didn’t order. The accompanying note says the spoons were sent by a charitable organization, and begs you to either send a contribution or return the spoons. You don’t like the spoons. What will you do?

1. Keep the spoons without paying.

2. Return the spoons.

3. Pay for the spoons.

Another question was:

A girl returned from her date after curfew. Her excuse was that her boyfriend’s car broke down. What’s your reaction?

Again, you had a multiple-choice answer. One of the choices was, "Ha!"

For each question, you had to say how you would answer it, and how you’d want your date to answer it. That was tough. What if you wanted your date to be stunningly beautiful but also humble? What if you wanted to meet somebody who’s ugly and insecure enough to be desperate to have sex? Such issues were debated in college dorms throughout the nation.

After completing the questionnaire, you mailed it with about $10 to the computer-dating service. Within two months, the service would send you the names, addresses, and phone numbers of at least 5 people you could date. If your personality was very easy to match, the service might send you more than 5 names; but even if your personality was lousy, you’d get at least 5. Periodically throughout the year, you’d also get updates that matched you with people who enrolled after you.

The most popular computer-dating service was Operation Match, started by students at Harvard. Its main competitor was Contact, started by students at M.I.T. Both services quickly became profitable and had subscribers from all across the country.

One gal’s personality was so wonderful that the computer matched her with 110 guys! She had to explain to her mom why 110 guys were always on the phone — and she had to figure out how to say "no" to 109 of them.

One gal got matched to her roommate’s boyfriend. They didn’t stay roommates long.

When I was a freshman, I applied to both services, to make sure I’d meet "the gal of my dreams". Contact sent me names of gals at prestigious schools (such as Wellesley and Bennington), while Operation Match sent me names of gals at schools such as the State University of New York at Albany.

I thought I was the only nut desperate enough to apply to both services, but I got a surprise! When I saw the list of names from Contact and the list from Operation Match, I noticed a gal who appeared on both lists! Like me, she’d been desperate enough to apply to both services, and both computers agreed she’d be a perfect match for me!

I had a date with her but couldn’t stand her.

When I’d answered the questionnaire, I was a very bashful boy, so the computer matched me to bashful girls. But by the time I received the computer printout, I’d become wilder, and the girls the computer recommended were no longer "my type".

Contact raised its price to $15, then $20. But $20 was still cheap for what you were getting.

Contact ran a newspaper ad that seemed to be selling groceries. It said, "Dates — 2¢ per pound". The ad then explained that one gal got enough dates so that, when she totaled the weight of their bodies, she figured they cost her 2¢ per pound.

The Dartmouth dater (1965)

When Dartmouth College was still all-male, a student there wrote a cruel program that evaluated dates by asking lots of "practical" questions such as:

Is she pretty?

How far away does she live?

Does she have a car?

I put down that I was dating a 14-year-old girl who was 7 feet tall and weighed 300 pounds but had a perfect personality. I gave her personality a 10, and even said that she lived nearby and had a car.

In spite of her excellent personality, the computer didn’t like her. The computer said:

She must be pregnant. Where did you get that pig?

Worst score yet produced by this computer!


Video dating (1975)

During the 1970’s, people wanted everything to be natural. They wanted "natural food" and "natural love".

Since computerized love seemed unnatural, its popularity declined. Operation Match and Contact went out of business.

They were replaced by video dating, in which a video-dating service shows you videotapes of members of the opposite sex and lets you contact the person whose videotape you like best. That way, you never have a "blind" date: you see the person on videotape before you make the date. The service also makes a videotape of you!

The video-dating service tapes thousands of people. Since you don’t have enough time to look at thousands of tapes, the service tells you to answer a questionnaire, which is fed into a computer. The computer tells you which people you’re most compatible with; then you look at those people’s tapes.

Computer dancing (1975)

At a Connecticut prep school (Hotchkiss), the head of the computer center arranged a "computer dance".

All the students answered questionnaires, which were fed into a computer. The computer matched the boys with the girls, so each boy got one girl. The boy had to take the girl to the dance.

The computer center’s staff announced the dancing partners in a strange way: one morning, the students found all the halls decorated with strips of punched paper tape, saying (in billboard-style letters) messages such as "George Smith & Mary Jones". If you were a student, you looked up and down the halls (your heart beating quickly), to find the tape displaying your name alongside the name of your mysterious computer lover.

Shrieks and groans. "Aarrgghh! You wouldn’t believe who the computer stuck me with!"

Computer weddings (1980)

Here’s how the first true "computer marriage" occurred.…

One company’s terminal was attached to another company’s computer. A programmer at the first company often asked a programmer at the second company for help. They contacted each other by typing messages on their terminals, and let the computer relay the messages back and forth. One of the programmers was a guy, the other was a gal, and they fell in love, even though they had never met. Finally, the guy typed on his terminal, "Let’s get married". The gal typed back, "Yes". And so they got engaged — even though they had never met.

Their marriage ceremony used three terminals: one for the guy, one for the gal, and one for the minister. The minister typed the questions at his own terminal; then the guy and gal typed back, "I do".

Reverend Apple Reverend Apple is an Apple computer programmed to perform marriage ceremonies.

It performed its first marriage on Valentine’s Day, 1981. The groom was a guy named Richard; the bride was a gal named Debbie. The computer printed the standard wedding-ritual text on the screen, and then asked the usual questions. Instead of answering "I do", the bride and groom just had to type "Y".

Reverend Apple is smart. For example, if the bride or groom types "N" instead of "Y", the computer beeps, tells the couple to try again, and repeats the question.

The program was written by M.E. Cavanaugh at the request of Rev. Jon Jaenisch, who stood by Reverend Apple while the ceremony was being performed.

Rev. Jaenisch is a minister of the Universal Life Church — the church that lets you become an "ordained minister" by just paying $5, and become a "doctor of divinity" by just paying $20. He’s known as the "Archbishop in Charge of Keyboarding".

For his next feat, he plans to make the computer perform divorces. He also uses the computer to persuade kids to come to church. He claims, "What better way to get kids into church than by letting them play with a computer? It’s more interesting than praying."

For a while, he couldn’t interest enough couples in using Reverend Apple. He complained, "It’s not easy to convince people to get married by a computer. They don’t think it’s romantic." NBC television news and many newspapers wanted to interview him, but he couldn’t find enough willing couples.

And besides, he’s a reverend only part-time. His main job’s as an employment agent: he’s supposed to help companies find programmers. He thought Reverend Apple’s reputation would help him find programmers, but it didn’t.

But Reverend Apple eventually started to catch on. During its first eight months, it performed six marriages.

Jaenisch says, "The first couple had nothing to do with computers professionally: the groom drove a tow-truck and was an hour late for the ceremony because he wanted to work overtime. But the second couple was very involved with computers: they even asked for a printout of the ceremony."

The sixth ceremony’s groom earned his living by fixing computer power supplies and said, "It was nice with our friends all gathered around the console, and someone brought champagne. But part of our vow was to never buy a home computer: we have to get away from machines sometime."

Love Bug (1980)

You can buy a Love Bug. It’s a small computerized box that you put in your pocket. You feed the box information about your personality. When you walk through a singles bar, if you get near a person of the opposite sex who’s compatible and has a Love Bug also, your Love Bug beeps. As you and the other person get closer and closer, the Love Bugs beep to each other even more violently. The more violently your Love Bug beeps, the closer you are to your ideal partner.

Using a Love Bug to find a date is like using a Geiger counter to find uranium. The louder the Love Bug beeps, the louder your heart will pound.

Selectrocution (1980)

If you don’t like the Love Bug, how about a love billboard? One company sells love billboards to singles bars.

Each person who enters the bar wears a gigantic name tag showing the person’s initials. For example, since I’m Russ Walter, my tag says, in gigantic letters, "RW". If I see an attractive gal whose tag says "JN", and I like her smile, I tell the person who operates the billboard. A few seconds later, a gigantic computerized billboard hanging over the entire crowd flashes this message:


Everybody in the bar sees my message. When the gal of my dreams, "JN female", sees it, she hunts for "RW male", and we unite in computerized joy.

That’s great for bashful people, like me, who’d rather pass notes than face a stranger unprepared.

It’s called Selectrocution, because it gives your social life an electronic tingle that ends all your problems.

Interlude (1980)

The most provocative sex program is Interlude. It interviews both you and your lover, then tells you what sexual activities to perform. Some of the activities are quite risqué. (Puritans think the program should be called "Inter Lewd".)

The program runs on your Radio Shack or Apple computer. (The explicit full-color ad shows a half-clad girl on satin sheets caressing her Apple.)

The program’s based loosely on Masters-and-Johnson sexual therapy. It interviews each person separately and privately, then recommends a sexual interlude.

During the interview, the computer asks you questions such as:

How long would you like the interlude to last?

You can choose any length of time, from "several seconds" to "several days".

If you choose "several seconds", the computer recommends that while driving home from a party, you put your lover’s finger in your mouth and seductively caress it with your tongue. If you choose "several days", the computer recommends telling your lover to meet somebody at the airport; but when your lover arrives at the airport, make your lover find you there instead, armed with two tickets for a surprise vacation.

The computer also asks questions such as:

Do you like surprises?

You have three choices: you like to give surprises, you like to be surprised, or you don’t like surprises at all. If you like to be surprised, and your lover likes to give surprises, the computer tells you to leave the room; after you’ve left, the computer gives your lover secret hints about the best way to surprise you.

The computer asks which parts of the body you like. (One choice is: "buttocks".) The computer also asks which kinds of accessories you like. (One choice is: "whips and chains".) The computer asks whether you want the interlude to occur "immediately" or "later": if you say "later", the computer recommends buying elaborate props to make the interlude fancier.

Some of the interludes are weird. For example, if you’re a woman and want to surprise your husband, the computer recommends calling his office to invite him home for lunch. When he arrives, he finds all the shades pulled down: you do a nude dance on the table, then sit down to eat.

During the interview, the computer’s questions are often corny. For example, the computer asks:

If your interlude were on TV, what show would it resemble?

Sample choices are "Three’s Company", "Roots", and "a commercial". If you say "Roots", the computer says "heavy!" If you say "a commercial", the computer says "yecch!"

The computer asks how much sex you’d like. If you say "lots!" but your lover says the opposite, the computer will recommend that you take a cold shower, to cool your hot passion.

If you’ve been married for at least twenty years, you’d probably like to change a few things about your sex life but are afraid to tell your spouse that you’ve been less than thrilled. You’d like an intermediary to whom you can express your anxieties and who will pass the message to your spouse gently. The Interlude program acts as that intermediary, in a playful way.

Interlude’s programmer says he created it because he was tired of hearing people wonder what to do with their personal computers. Once you’ve tried the Interlude program, your personal computer will suddenly become very personal!

It’s rated R. To avoid an X rating, it insists on having one man and one woman: it doesn’t permit homosexuality, group sex, or masturbation. Sorry!

The program came out in May, 1980. Within a year, ten thousand copies were sold.

In 1986, an improved version was invented: Interlude 2. It’s available for the IBM PC and the Apple 2 family. You can get it for $45.95 (plus $4.95 shipping and $1.78 for credit-card processing) from Dolphin Computers (309 Judah Street #214, San Francisco, CA 94122, phone 415-566-4400).

Pornopoly (1980)

To have an orgy, try this trick. Invite your friends over for a "game". Tell them it’s a computerized version of Monopoly. When they arrive, surprise them by telling them they’ll play Pornopoly, the computerized version of Monopoly that’s rated X.

Like Monopoly, Pornopoly lets you buy and sell property; but the streets have names such as Bedroom Avenue, Horny Avenue, Hot Jugs Avenue, Jock Strap Place, and Orgasm Railroad. You get penalty cards such as: name 7 four-letter words that rhyme with duck. You might be told to play doctor, and conduct a physical examination of another player… or remove the pants of your favorite player by using only your teeth. When a player lands on a monopoly that you own, the player must take a drink, remove an article of clothing, kiss you, give you a free feel, or strip completely for two turns. At the end of the game, whoever remains dressed is the winner.

This successful program has been featured on national TV. Copies have been requested by Hugh Hefner, Johnny Carson, Rona Barrett, an army chaplain, and a dozen foreign countries. To add your own name to that list, try contacting Computer Consultants of Iowa (Box 427, Marion, Iowa 52302, 319-373-1306, if still in business).

Pornopoly costs $30, but the company doesn’t accept money: it accepts only Master Charge, Visa, and COD. If you’re a kid, tough luck: the company says, "This is an adult party game rated XXX and some people may find it offensive."

Among the offended is a New Orleans grandmother who read an article about the program and wrote this note to the company: "Thanks to you, I intend to start contributing to Moral Majority, something I’ve avoided until now."

The program’s available for Radio Shack, Apple, Commodore, and Atari computers. Infoworld (the microcomputer industry’s scandal sheet) criticizes the Atari version for its poor graphics, vague manual, and occasional bugs. If you try the Radio Shack, Apple, or Commodore version, tell me how you like it. And can I play?

Replace people

Computers can replace people.


Many bar owners don’t trust the bartenders they hire. They claim the bartenders give too many free drinks to friends, steal money from the till, and put too much or too little liquor in the drinks.

To solve the problem, many bars now contain a computer that mixes and pours drinks. The computer mixes each drink precisely. Although it’s operated by the bartender, it keeps an accurate record of how many drinks it makes, so there’s little chance of cheating. It also keeps track of the inventory.

The computers are manufactured and sold by NCR (Dayton, Ohio), Bar Boy Inc. (San Diego, California), Electronic Dispensers International (Concord, California), and Anker-Werke (Germany). Prices range from $600 to $15,000. Holiday Inn has been developing its own model.


If you’re ill, would a computer diagnose your illness more accurately than a human doctor?

During the 1970’s this article appeared in The Times:

A medical diagnostic system designed at Leeds University has proved more accurate than doctors in assessing the most likely cause of acute abdominal pain among patients admitted to the university’s department of surgery.

Last year 304 such patients were admitted to the unit, and the computer’s diagnosis proved correct in 92 percent of the cases, compared with 80 percent accuracy by the most senior doctor to see each case.

After each patient had been seen by the doctor and examined, the doctor’s findings were passed on to a technician, who translated them into language used by the computer. The computer would list the likely diagnoses in order of probability. If the computer and the doctor in charge of the case disagreed, the computer would on request suggest further investigations that might be useful.

In the year-long trial the computer’s diagnoses proved correct in 279 cases. In 15 it was wrong, in 8 the patient’s condition was not included in the diseases considered by the computer, and in 2 no computer diagnosis was made because the doctors concerned with the case disagreed about the findings.

Whereas the computer advised an operation on 6 occasions when it would have proved unnecessary, in practice 30 such operations were carried out on the basis of the surgeon’s own judgment. The computer accurately classified 84 of the 85 patients with appendicitis, compared with 75 by the doctors, and its suggestion that no operation was necessary proved correct on 136 out of 137 occasions.

The computer is reliable only if accurate data are fed into it on the basis of the doctor’s interrogation and examination of the patient.
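
The Leeds system listed diagnoses "in order of probability". Here’s a minimal sketch of that idea using naive-Bayes ranking; the diseases, priors, and likelihoods below are invented toy numbers, not the actual Leeds data:

```python
# Toy probability-ranked diagnosis, in the spirit of the Leeds system
# described above. All numbers here are invented for illustration.
PRIOR = {"appendicitis": 0.3, "non-specific pain": 0.5, "cholecystitis": 0.2}

# P(symptom present | disease) -- hypothetical values
LIKELIHOOD = {
    "appendicitis":      {"right-lower-quadrant pain": 0.8, "nausea": 0.7},
    "non-specific pain": {"right-lower-quadrant pain": 0.2, "nausea": 0.3},
    "cholecystitis":     {"right-lower-quadrant pain": 0.1, "nausea": 0.6},
}

def rank(symptoms):
    """Return diseases sorted by posterior probability, most likely first."""
    scores = {}
    for disease, prior in PRIOR.items():
        p = prior
        for s in symptoms:
            # unknown symptoms contribute a neutral 0.5
            p *= LIKELIHOOD[disease].get(s, 0.5)
        scores[disease] = p
    total = sum(scores.values())
    return sorted(((d, p / total) for d, p in scores.items()),
                  key=lambda pair: -pair[1])
```

Given "right-lower-quadrant pain" and "nausea", this toy table ranks appendicitis first, which matches the clinical intuition the article describes.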

In a 1970’s experiment at another hospital, each patient was given a computer terminal and could use it to communicate with the computer about his illness. Afterwards, most of the patients said they’d rather deal with the computer than with the human doctors.

This report is by James Martin and Adrian Norman:

The Los Angeles County General Hospital Shock Research Unit is concerned with circulatory shock, which results in low blood pressure, low blood flow, and an extremely unstable circulatory system. The proportion of circulatory-shock patients who die is large.

A computer was installed to improve the monitoring and care of these patients. A patient in this ward has many instruments attached to him: six thermistors measure the temperature at various parts of his body; pressures are measured in the circulatory system; catheters are inserted into an artery and a vein; and a urinometer measures liquid waste. The instruments are all directly connected to the computer, and other details about the patient are given to the machine by the nurse.

The computer prints out a log about the patient’s condition, and if any measurement falls outside critical limits set by the doctor for that patient, the computer notifies the ward staff.


Judith Harris, John Swets, and Wallace Feurzeig made the computer run a logic game. The computer thinks of a letter of the alphabet; your goal is to discover which letter it is, as quickly as possible. You can ask the computer questions about the letter’s shape.

In this example, the human begins by asking whether the letter has any curves:

Human: Curves?

Computer: One.

Human: Loose ends?

Computer: Two.

Human: Obliques?

Computer: Zero.

Human: C.

Computer: You don’t have enough information yet to get the right answer. How do you know it isn’t J, for example?

Human: J.

Computer: You don’t have enough information yet. How do you know it isn’t C, for example?

Human: Horizontals?

Computer: Zero. That was a wasted question. It did not eliminate any possibilities.

Human: Verticals?

Computer: Zero. Good question.

Human: G.

Computer: Wrong. Your guess is inconsistent with the information you’ve been given. You must have forgotten that the right letter hasn’t any HORIZONTAL LINES. The letter you guessed can’t be right — it has too many LOOSE ENDS.

Human: Curves?

Computer: The answer is still: one.

Human: X.

Computer: Wrong. Your guess is inconsistent with the information you’ve been given. I think you only made that guess to see what I would say.

Human: C.

Computer: Yes, you’ve done it. Good work. If you want to play again, start right in.

Notice that the human can input either a guess (followed by a period) or a question (followed by a question mark). Possible guesses are A, B, C, D, etc. Possible questions are Curves, Loose ends, etc.; the human is given a list of possible questions before he runs the program.
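
The game’s logic can be sketched in Python. The feature table below is a hypothetical fragment (the real program would cover all 26 letters, and my feature counts are guesses), but it shows the mechanism: track every answer given, keep the set of letters still consistent with those answers, call a question "wasted" if it eliminates nothing, and refuse a guess while more than one letter remains possible:

```python
# Sketch of the letter-guessing logic game. Feature values
# (curves, loose ends, obliques, horizontals, verticals) are
# hypothetical, covering only a few letters for illustration.
FEATURES = {
    "C": dict(curves=1, loose_ends=2, obliques=0, horizontals=0, verticals=0),
    "J": dict(curves=1, loose_ends=2, obliques=0, horizontals=1, verticals=1),
    "G": dict(curves=1, loose_ends=2, obliques=0, horizontals=1, verticals=0),
    "X": dict(curves=0, loose_ends=4, obliques=2, horizontals=0, verticals=0),
    "O": dict(curves=1, loose_ends=0, obliques=0, horizontals=0, verticals=0),
}

class LetterGame:
    def __init__(self, secret):
        self.secret = secret
        self.told = {}              # features revealed so far

    def candidates(self):
        # letters still consistent with every answer given
        return [l for l, f in FEATURES.items()
                if all(f[k] == v for k, v in self.told.items())]

    def ask(self, feature):
        before = len(self.candidates())
        self.told[feature] = FEATURES[self.secret][feature]
        value = self.told[feature]
        if len(self.candidates()) == before:
            return f"{value}. That was a wasted question."
        return f"{value}. Good question."

    def guess(self, letter):
        cands = self.candidates()
        if letter not in cands:
            return "Wrong. Your guess is inconsistent with the information."
        if len(cands) > 1:
            other = next(l for l in cands if l != letter)
            return (f"You don't have enough information yet. "
                    f"How do you know it isn't {other}, for example?")
        return "Yes, you've done it. Good work."
```

With the secret letter C, asking about curves and loose ends narrows the field, asking about obliques is wasted (every remaining candidate has none), and guessing C is rejected until horizontals rule out J and G, just as in the transcript.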

Medicine That logic game was modified to form a lesson about medicine. The medical case was designed by Preston Munter, of the Harvard Health Center; the programming was done by John Swets, Wallace Feurzeig, and Myra Breen. The object of the "game" is to diagnose an illness correctly, by performing sufficient tests. The human can input either a diagnosis (Measles) or a request for a test (X-ray). Before running the program, the human is given a list of 35 possible diagnoses and 40 possible tests.

Here are excerpts from a long RUN:

Computer: Please type your name, the date, and a slash.

Human: Myra Breen, August 1963/

Computer: It is mid-December. Late this afternoon, a patient was admitted to the hospital. He is a well-developed, well-nourished, 34-year-old Negro coal miner. He was brought in by his wife, who reports that he has had "the grippe" for about 4 days. During the past 3 days she has given him fluids and aspirin every 6 hours with no sign of improvement. Today he suddenly became much worse. For the past 2 days he had complained of feeling very warm and of pain in his belly. This morning he had a severe shaking chill that was so bad his teeth rattled.… Begin the physical examination as soon as you are ready.

Human: Physical exam?

Computer: Fine. What in particular would you like to know?

Human: Chest exam?

Computer: It is common practice in the medical profession to begin an examination by checking the patient’s general appearance.

Human: General appearance?

Computer: Acutely prostrated, lying in bed on his right side. Looks acutely ill, sweaty, apprehensive, but alert. Paroxysmal cough, bringing up a little rusty streaked sputum. Breathing with difficulty.

Human: EENT exam?

Computer: Check all of the vital signs first!

Human: Temperature?

Computer: 103.

Human: Blood pressure?

Computer: 150/84.

Human: Pulse?

Computer: 100 a minute. Occasional extra systole.…

Human: Abdominal exam?

Computer: Diffusely tender, some splinting noticed in lower right quadrant and occasionally radiating to right flank area.

Human: The grippe.

Computer: There’s insufficient information for any conclusive diagnosis at this time. You’re not being careful!

Human: Musculo-skeletal system?

Computer: Essentially negative, except for generalized weakness and aching.

Human: Extremities?

Computer: Normal, except nail beds are questionably cyanotic but difficult to determine because of patient’s skin color.

Human: Rectal exam?

Computer: You’re not being careful! The patient is too acutely ill for a rectal examination.

Human: Laboratory reports?

Computer: Fine. Which lab tests would you like to have done?

Human: RBC?

Computer: A red blood count will be done. Results will be sent to you as soon as possible.… Would you like to alert the night nurse to set up the lab tests you’ll want to see tomorrow?

Human: Yes.

Computer: Good. That will speed things up. However, there’s nothing further that can be done tonight to help the patient. Let’s call it quits for now. Perhaps you’ll be in a better position to diagnose the case tomorrow after the first lab reports come in and you have reexamined the patient. Have a good night’s sleep, and when you’re back on duty tell me to "proceed."

Human: Proceed.

Computer: It’s morning and you’re making your rounds. You’ve come to our patient and are given the night nurse’s report: "Temperature remained high but patient spent a reasonably comfortable night. Cough diminished but persists. Occasional traces of blood in sputum. Patient still dyspneic and still has pain in abdomen. The second series of lab tests was initiated." Some of the first series of lab reports have just come in: urine exam, insufficient fluid for analysis; RBC, 5 million; WBC, 15500; differential, shift to left.… You may be prepared to make the diagnosis now. Are you?

Human: No.

Computer: Fine. Nevertheless, we’ve got a fair amount of information — at least enough to raise some, perhaps several, diagnostic possibilities. Have you any in mind?

Human: Yes.

Computer: Name one.

Human: Pulmonary infarction.

Computer: Fine. Pleural pain, dyspnea, fever, and the physical exam signs and symptoms are certainly consistent with pulmonary infarction. Do you think that shaking chills and the presence of rusty sputum further support this diagnosis?

Human: No.

Computer: Right. Though the white count is consistent, isn’t it?

Human: No.

Computer: Wrong. The white count is consistent. This should further support your diagnosis. We’ve considered one possibility. Do you have another diagnosis in mind?

Human: No.

Computer: In that case I’d like to talk about viral pneumonia. The tachycardia, high WBC, elevated respiratory rate, shaking chills, bloody sputum, and severe pleural pain all lend weight to that diagnosis — right?

Human: Yes.

Computer: You’re way off base. There’s not an iota of truth to any part of what I just said; in fact, the opposite of what I said is true. You’d better review some elementary medicine, doctor.…

Human: Pleural effusion.

Computer: A good contender, but let’s be careful. What evidence would rule it in or out?

Human: Positive evidence of another disease.

Computer: Is there such evidence?

Human: Possible evidence though inconclusive.

Computer: Since you aren’t sure, continue the examination until you have conclusive evidence one way or the other.

Human: RBC?

Computer: An RBC is not indicated at this time. It’s an unnecessary expense for a poor patient.

Human: WBC?

Computer: 22500.

Human: Staph pneumonia.

Computer: Fair enough. But where is the guilty organism? When you have isolated staphylococcus aureus in significant numbers in your samples of this patient’s fluids or excretions, let me know.

Human: Pneumococcal pneumonia.

Computer: Are you absolutely sure?

Human: Yes.

Computer: You can’t be, until you have irrefutable evidence. What would help you nail down the diagnosis?

Human: Evidence from sputum smear and culture.

Computer: Good. Anything else?

Human: No.

Computer: I think you might also want evidence from x-ray findings. Go ahead and we’ll see what happens.

Human: X-ray?

Computer: Not ready yet. Try the blood lab.

Human: Differential?

Computer: Marked shift to left.…

Human: X-ray?

Computer: The x-ray shows: area of consolidation, right lower lobe, consistent with pneumonic process. This is a good time to review the total findings and to make another tentative diagnosis — if you like.

Human: Evidence of fluid?

Computer: No evidence of fluid.

Human: Sputum smear?

Computer: Gram-positive rods.

Human: Strep pneumonia.

Computer: Fine, but how can you distinguish this from any other bacterial pneumonia? Wait until you have an irrefutable case.

Human: Viral pneumonia.

Computer: But what about shaking chills, rusty sputum, elevated WBC, etc. — all the evidence of an acute febrile bacterial process? You are in error, doctor.

Human: Sputum culture?

Computer: Presence of pneumococci.

Human: Strep pneumonia.

Computer: Incredible! You have disregarded the positive findings that should clearly have established the correct diagnosis.

Human: Pneumococcal pneumonia.

Computer: Very good. It’s obvious now, though there were some confusing signs and symptoms in the picture. You’ve seen your way through to the correct diagnosis.

Religion Bruce Jones programmed the computer to teach you about religion. Here’s an example:

Computer: This program is designed to teach you some things about the so-called "documentary hypothesis" as formulated by Julius Wellhausen and others. First, what is your name?

Human: George

Computer: All right, George, who wrote the first five books of the Bible, according to tradition?

Human: Moses

Computer: You are off to a good start. Do you know what those first five books are called? They are called…

Human: the Pentateuch

Computer: Right you are. Torah is the traditional Jewish name for them. They are also called the Pentateuch ("penta" = 5). People very early noticed that there was a problem with saying that Moses wrote the Torah. If you already know the basis of their suspicion, tell me.

Human: Some things in the text reflect later conditions, after Moses.

Computer: You are right, of course, but there was an even more obvious problem. Try again.

Human: I don’t know.

Computer: At the end of the Pentateuch the author says, "Moses… died there." Does that seem strange to you?

Human: yes

Computer: Why?

Human: Moses could not have written about his own death.

Computer: O.k. Also, right after the report of Moses’ death and burial (Deut. 34), the text says, "No man knows the place of his burial TO THIS DAY." How do those last three words strike you?

Human: They seem later than the event they are talking about.

Computer: A good observation, George. Many people considered these words to be anachronistic. That is, "this day" is sharply distinguished from the time of the burial. It seems likely, therefore, that the time of writing was much later than the time of burial. Again (so the argument goes), Moses was not the author. Would these observations about the end of Deuteronomy convince you that Moses did not write the Torah?

Human: yes

Computer: Why do you say that?

Human: A man cannot write his own obituary.

Computer: Probably not many people would have been convinced if our evidence were so limited. After all, Moses could have written everything except the last chapter. However, as early as the 18th century, people noticed another problem.…

The computer searched through the human’s input, to see whether he used words indicating a correct answer. The computer never said a flat "Wrong", since religious answers are a matter of personal belief, and since the human might be smarter or weirder than the computer program was prepared for.


In 1962 at MIT, Heinrich Ernst connected the computer to a mechanical hand that could feel. He made the hand build objects out of blocks, and made it put blocks into boxes.

Shakey One of the most famous robots is a guy named "Shakey", built at the Stanford Research Institute (SRI) in 1970. His eye contains a television camera (optical scanner). Instead of legs, he has wheels. Instead of arms, he has antennae (for feeling) and a bumper (for pushing). His brain is a computer: instead of carrying it around with him, he leaves it in another room and communicates with it by wireless methods.

To see how he works, suppose you type this message on his computer’s terminal:

Push the block off the platform.

He begins by looking for the platform. If the platform is not in the room, he goes out to the hall and steers himself through the hall (by looking at the baseboards) until he arrives at the next room. He peers into the room to see whether it contains a platform. If not, he hunts for another room. When he finally finds a room containing a platform with a block on it, he tries to climb onto the platform to push the block off. But before climbing the platform, he checks the platform’s height. If it’s too high to get onto easily, he looks for a device to help him climb it. For example, if a ramp is lying in the room, he pushes the ramp next to the platform and then wheels himself up the ramp. Finally, he pushes the block off.

He can handle unexpected situations. For example, while he’s getting the ramp, suppose you pull the platform to a different place. That doesn’t faze him: he hunts for the platform again, and then pushes the ramp to it.

In 1971, Shakey’s powers were extended, so he can handle commands such as:

Turn on the lightswitch.

If the lightswitch is too high for his bumper to reach, he looks for a device to climb onto, such as a box. If he finds a box that looks helpful, he climbs onto it to check whether it is tall enough; if it is, he climbs off, pushes it to the lightswitch, climbs on it again, and finally flicks the switch.

Another task he can handle is:

Push three boxes together.

He finds the first box and pushes it to the second. Then he finds the third box, and pushes it to the second.

He understands over 100 words. Whatever command you give him becomes his "goal", and he must reason out how to accomplish it. He might discover that to accomplish it, he must accomplish another goal first — for example, to move the block off the platform, he must first find the platform; to do that, he might have to look in another room; to do that, he must leave the room he’s in; to do that, he must turn his wheels.
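
That chain of reasoning — a goal whose preconditions become subgoals of their own — can be sketched with a toy recursive planner. This is not SRI’s actual STRIPS planner; the action table and goal names below are invented to mirror the block-and-platform example:

```python
# Toy sketch of Shakey-style goal decomposition (hypothetical goals,
# not SRI's real planner). Each goal lists the subgoals that must be
# achieved first; goals with no subgoals are primitive actions.
ACTIONS = {
    "push block off platform": ["be at platform", "be on platform"],
    "be at platform":          ["find platform"],
    "be on platform":          ["ramp at platform"],
    "ramp at platform":        ["be at ramp"],
    "find platform":           [],   # primitive: wander rooms and look
    "be at ramp":              [],   # primitive: drive to the ramp
}

def plan(goal, steps=None):
    """Return the goals in execution order: subgoals first, goal last."""
    if steps is None:
        steps = []
    for subgoal in ACTIONS.get(goal, []):
        plan(subgoal, steps)       # recurse: achieve preconditions first
    steps.append(goal)
    return steps
```

Asking this planner for "push block off platform" yields the same order Shakey reasons in: find the platform, get to it, fetch the ramp, push the ramp to the platform, climb on, and only then push the block.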

Simulator One Here’s a picture of a robot named Simulator One:

In the picture, a doctor is taking Simulator One’s blood pressure and pulse. Another doctor is watching the computer console.

Simulator One is a model patient. He can blink, breathe, cough, vomit, respond to drugs, and even die. He’s used in med school, to train doctors how to administer anesthetics during surgery.

Improved robots This report (abridged) is by Bertram Raphael, the director of the SRI Artificial Intelligence Center:

Here’s what robots were capable of doing a few years ago.

At Hitachi Central Research Laboratory, a TV camera was aimed at an engineering plan drawing of a structure. A second camera looked at blocks spread out on a table. The computer "understood" the drawing, reached toward the blocks with its arm, and built the structure.

At MIT, the camera was not shown a plan; instead, it was shown an example of the actual structure desired. The computer figured out how the structure could be constructed, and then built an exact copy.

At Stanford University, the hand obeyed spoken directions. For example, if someone said into the microphone, "Pick up the small block on the left," that is precisely what the arm would do.

In Scotland at the University of Edinburgh, a jumble of parts for two wooden toys was placed on a table. "Freddy," the Edinburgh robot, spread out the parts so that it could see each one clearly, and then, with the help of a vise-like work station at one corner of the table, assembled first the toy car and then the toy boat.

Recently, robot researchers have built robots that can perform truly practical tasks. For example:

At Stanford, the system that used to stack toy blocks can now assemble a real water pump.

At SRI, a computer-controlled arm with touch and force sensors can feel its way as it packs assembled pumps into a case.

At MIT, programs are under development to enable a computer to inspect and repair circuit boards for use in computers, TV sets, and other electronic equipment.

The Beast Not all robots involve computers. Here’s an example of a noncomputerized robot (reported by James Slagle, abridged):

A. George Carlton, John G. Chubbuck, and others at the Applied Physics Laboratory of Johns Hopkins University built a machine called The Beast.

It’s a battery-operated cylinder on wheels that’s 18 inches in diameter. It has tactile, sonar, and optical apparatus. The sonar permits The Beast to find its way down the center of the hall. When its battery becomes sufficiently run-down, The Beast optically looks for an electric outlet and plugs itself in to recharge its battery.

The Beast was often let loose to roam in the halls and offices at the Applied Physics Laboratory in order to see how long it could survive without "starving." Once it survived 40.6 hr. Many a new and unsuspecting secretary has been startled when The Beast entered her office, plugged itself into an electric outlet, and then departed.

When it feels a step down, it knows enough to turn around, so that it doesn’t fall downstairs. But this logic sometimes makes it starve when it encounters a raised threshold. After getting on the threshold, it thinks it’s about to fall, so it turns around. After turning around it again thinks it’s going to fall, so it turns back and forth until it starves.

It also starved when some workmen changed all the outlets from the flush to the projecting type. To cope with the new situation, the researchers changed some of the circuitry.

Japan A newspaper article said that in Japan robots are being used in many practical ways. One robot arc-welds, reducing the time by 90%. Another grasps an object, determines the best way to pack it in a box, and does the packing; it uses television cameras and delicate arms. Another washes windows. Another wiggles a rod to catch a fish, takes the fish off the hook, dumps it into a bin, and returns the line to the water. Another directs traffic. Talking robots are being used instead of kimono-clad females in inns and restaurants.

Commenting on the quality of life in Japan, the article went on to say that people are buying whiffs of oxygen from vending machines.

The article was tacked on the bulletin board at the MIT Artificial Intelligence Laboratory, together with this graffito about how the Japanese robots would act differently if they were as smart as people.…

Human: Weld these parts.

Robot: The steel in those parts is grossly inferior. They must have been made in the U.S. Send them back. Also, have my welding machine tuned up.

Human: Pack those widgets.

Robot: Can I break them in half first?

Human: No.

Robot: "No" is not an allowed answer. I only have small shipping boxes. (Proceeds to break widgets in half and stuff them into boxes.)

Human: Wash those windows.

Robot: What?? And get my precious electrical and mechanical components all wet??

Human: Catch a fish.

Robot: (Proceeds to catch a fish, take it off the hook, and throw it back.) Okay.

Human: What did you throw it back for?

Robot: It was under the size limit. Anyway, it was full of mercury.

Human: Direct that traffic.

Robot: Will I get hazardous duty pay for this work?

Human: Greet those customers.

Robot: Sorry, my shift is over.

Human: (Ready to faint.) Give me some air.

Robot: Use that "machine" over there. (Points to vending machine.)

Be poetic

Programmers have taught the computer to compose its own poetry and other literature.

The Taoist Hiroshima Elevator (1961)

In Italy, Nanni Balestrini created poetry by feeding the computer three passages.

He fed the computer this passage from Lao Tzu’s Tao Te Ching: "While the multitude of things comes into being, I envisage their return. Although things flourish, they all return to their roots."

He fed this passage from Michihito Hachiya’s Hiroshima Diary: "The blinding fireball expands rapidly, thirty times brighter than the sun. When it reaches the stratosphere, the summit of the cloud takes on the well-known mushroom shape."

He fed this passage from Paul Goldwin’s The Mystery of the Elevator: "head pressed on shoulder, hair between lips, lay motionless without speaking, till he moved his fingers slowly, trying to grasp".

He told the computer to mix them up. Here’s the result:

Head pressed on shoulder, thirty times

brighter than the sun, I envisage their return —

until he moved his fingers slowly.… And while the multitude

of things comes into being, at the summit of the cloud

they all return to their roots and take on

the well-known mushroom shape, endeavoring to grasp.

Hair between lips, they all return

to their roots. In the blinding fireball

I envisage their return — until he moved his fingers

slowly — and although things flourish

takes on the well-known mushroom shape, endeavoring

to grasp, while the multitude of things comes into being.

In the blinding fireball I envisage

their return. When it reaches the stratosphere, while the multitude

of things comes into being, head pressed

on shoulder, thirty times brighter than the sun,

they all return to their roots. Hair

between lips takes on the well-known mushroom shape.

They lay motionless without speaking. Thirty times

brighter than the sun, they all return

to their roots. Head pressed on shoulder,

they take on the well-known mushroom shape, endeavoring

to grasp. And although things flourish,

they expand rapidly, hair between lips.

While the multitude of things comes into being in the blinding

fireball, they all return

to their roots. They expand rapidly — until he moved

his fingers slowly, when it reached the stratosphere

and lay motionless without speaking, thirty times

brighter than the sun, endeavoring to grasp.

I envisage their return — until he moved his fingers

slowly in the blinding fireball.

They all return to their roots. Hair

between lips and thirty times brighter than the sun

lay motionless without speaking. They expand,

rapidly endeavoring to grasp the summit.

In Balestrini’s program, Tao Te Ching was represented as four phrases:

while the multitude of things comes into being

I envisage their return

although things flourish

they all return to their roots

Hiroshima Diary was represented as six phrases, and The Mystery of the Elevator as five.

For each verse, the computer was told to choose nine phrases at random, and print them in a random order (never juxtaposing phrases from the same passage), to form six lines of roughly equal metrical length.

Actually, the computer printed the poem in capital letters, without punctuation; Balestrini himself then added the punctuation and polished the grammar. And the whole thing was done in Italian: you’ve been reading Edwin Morgan’s translation, with my punctuation.
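If you want to try Balestrini’s trick yourself, the recombination rule fits in a few lines of Python. This is just a sketch: the phrase cuts below are my own guesses (Balestrini’s program actually used 4, 6, and 5 phrases from the three passages), and the grouping into six lines by metrical length is skipped.

```python
import random

# The three passages, reduced to phrase lists (the phrase cuts are mine).
TAO = ["while the multitude of things comes into being",
       "I envisage their return",
       "although things flourish",
       "they all return to their roots"]
HIROSHIMA = ["the blinding fireball expands rapidly",
             "thirty times brighter than the sun",
             "when it reaches the stratosphere",
             "the summit of the cloud",
             "takes on the well-known mushroom shape"]
ELEVATOR = ["head pressed on shoulder",
            "hair between lips",
            "lay motionless without speaking",
            "until he moved his fingers slowly",
            "trying to grasp"]

def verse(rng):
    """Nine phrases chosen at random, printed in a random order,
    with no two consecutive phrases from the same passage."""
    passages = [TAO, HIROSHIMA, ELEVATOR]
    while True:                      # rejection sampling: retry until legal
        picks = []
        for _ in range(9):
            source = rng.randrange(3)
            picks.append((source, rng.choice(passages[source])))
        rng.shuffle(picks)
        if all(a[0] != b[0] for a, b in zip(picks, picks[1:])):
            return [phrase for _, phrase in picks]
```

Like Balestrini’s program, this emits raw phrases: the punctuation and grammar-polishing still have to be done by hand afterwards.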

Bubbles (1966)

At Northwestern University, programmers made the computer compose nice poetry. To use their program, you type a list of nouns, verbs, and other words. The computer randomly chooses five of your words to be theme words. The computer combines all your words to form sentences, but chooses the theme words more often than the others. It combines the sentences into verses and tries to keep the lengths of the lines approximately equal. It puts a theme word into the title.

In one poem, the computer chose bubble to be a theme word. The title was: ODE TO A BUBBLE. The poem contained phrases such as, "Ah, sweet bubble." The word bubble appeared so often that even the stupidest reader could say:

"Oh, yeah. I really understand this poem. Ya see, it’s about a bubble."

The poem had all the familiar poetic trappings, such as "but alas!", which marked the turning point. (Cynics argue that the poem didn’t really have a turning point, since the computer didn’t have the faintest idea of what it was saying!)

Kids and physics (1968)

In England at Manchester University, Mendoza made the computer write children’s stories. Here’s a story the computer composed:

The sun shone over the woods. Across the fields softly drifted the breeze, while then the clouds, which calmly floated all afternoon, moved across the fields.

Squirrel, who scampered through the trees, quickly ran off; and off noisily ran Little Grey Rabbit. She sniffed at the house; but out of the door noisily hurried Hare, who peered at slowly the flowers. Squirrel quickly scampered over the woods and fields, but Old Grey Owl flew over the woods and fields. Down the path to the woods ran Little Grey Rabbit, who then sniffed at a strawberry pie.

The first paragraph uses these words:

Nouns, scored against each verb:

the clouds 1 1 0 1 0 0 1 0
the sun 0 1 1 1 1 0 1 1
the breeze 1 1 0 1 1 2 0 0
the sky 0 0 0 0 1 0 1 1

Adverbs, scored against each verb:

gently 1 1 1 1 1 1 1 1
quietly 1 1 1 1 1 1 1 1
loudly 1 1 1 1 1 1 1 1
softly 1 1 1 1 1 1 1 1
calmly 1 1 1 1 1 1 1 1
soon 1 1 1 1 1 1 1 1
then 1 1 1 1 1 1 1 1
(no adverb) 2 2 2 2 2 2 2 2

Endings, scored against each verb:

by 1 1 0 1
over the woods 1 1 1 1
across the fields 1 1 1 1
through the trees 1 1 1 1
down 0 0 1 0
for a long time 0 0 1 1
all day 1 1 1 1
all afternoon 1 1 1 1
the grass 1 1 1 1
the leaves of the trees 1 1 1 1
the garden 1 1 1 1
the flowers 1 1 1 1
the little house 1 0 1 1
the old oak tree 1 1 1 1
the treetops 1 1 1 1

ADDITIONAL WORDS: which, and, while, they, it

To construct a sentence, the computer uses that table. Here’s how.

First, the computer randomly chooses a noun. Suppose it chooses the sun.

Then it looks across the row marked the sun, to choose a verb whose score isn’t 0. For example, it’s possible that the sun shone, but not possible that the sun melted. Suppose it chooses shone.

Then it looks down the column marked shone, to choose an adverb and an ending. Notice that the ending can’t be by, since its score is 0. No adverb has a score of 2, whereas gently has a score of 1; that makes no adverb twice as likely as gently.

If the computer chooses no adverb and over the woods, the resulting sentence is: The sun shone over the woods. In fact, that’s the first sentence of the story you just read.

The computer occasionally changes the word order. For example, instead of typing "The breeze drifted softly across the fields", the computer begins the second sentence by typing, "Across the fields softly drifted the breeze".

To combine short sentences into long ones, the computer uses the words at the bottom of the table: which, and, while, they, and it. If two consecutive clauses have the same subject, the computer substitutes a pronoun: they replaces the clouds; it replaces the sun, the trees, and the sky. The program says a which clause can come after a noun (not a pronoun); the which clause must use a different verb than the main clause.
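The table-driven choice is easy to sketch in Python. This toy version fixes the noun and verb to "the sun shone"; the scores below are invented to match the behavior described (by impossible, no adverb twice as likely as gently), since the table’s verb columns aren’t labeled.

```python
import random

# Scores act as weights: 0 means impossible, 2 means twice as likely as 1.
ADVERBS = {"gently": 1, "quietly": 1, "softly": 1, "calmly": 1,
           "then": 1, "": 2}                   # "" means no adverb
ENDINGS = {"by": 0, "over the woods": 1,
           "across the fields": 1, "all afternoon": 1}

def weighted_choice(table, rng):
    # Repeat each word as many times as its score, then choose uniformly.
    return rng.choice([w for w, score in table.items() for _ in range(score)])

def sentence(rng):
    adverb = weighted_choice(ADVERBS, rng)
    ending = weighted_choice(ENDINGS, rng)
    words = ["the sun", "shone"] + ([adverb] if adverb else []) + [ending]
    return " ".join(words).capitalize() + "."
```

Because by has a score of 0, it never appears; because (no adverb) has a score of 2, half the sentences come out bare, like "The sun shone over the woods."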

Here’s the vocabulary for the second paragraph:

Nouns, scored against each verb:

Little Grey Rabbit 0 0 2 3 1 1 0 0
Old Grey Owl 0 3 0 0 1 3 2 2
Squirrel 3 0 1 1 1 1 3 3
Hare 0 0 0 2 1 1 2 2

Adverbs, scored against each verb:

then 0 1 1 1 1 1 0 0
slowly 0 2 0 0 1 1 1 1
quickly 1 1 1 1 0 0 1 1
soon 1 0 1 1 0 0 1 1
happily 1 0 0 1 0 0 1 1
gaily 1 0 0 1 0 0 1 1
noisily 1 0 1 1 0 0 2 3
(no adverb) 5 4 4 5 2 2 5 5

Endings, scored against each verb:

off 1 1 1 1
over the woods and fields 1 1 1 1
through the trees 1 1 1 1
among the treetops 0 1 0 0
into the home 1 0 1 1
out of the door 1 0 1 1
down the path to the woods 1 0 1 1
about the garden 1 1 1 1
the house 1 1 0 0
the hollow tree 1 1 0 0
an old oak tree 1 1 0 0
the flowers 1 1 0 0
two buns 1 1 1 1
a strawberry pie 1 1 1 1
six cabbages 1 1 1 1

ADDITIONAL WORDS: who, and, but, she, he

Here’s another story the program produced:

The breeze drifted by. Across the fields softly moved the clouds; and then the breeze, which calmly touched the treetops, drifted across the fields. Quietly the sun shone over the woods. The sky calmly shone across the fields.

Out of the door ran Squirrel; and off hurried Hare, who munched and crunched two buns happily. Off slowly flew Old Grey Owl, and Squirrel soon ate two buns. Old Grey Owl, who peered at a strawberry pie, munched and crunched two buns; but noisily Little Grey Rabbit, who peered at an old oak tree, slowly ran down the path to the woods. Soon she hurried down the path to the woods, but then she sniffed at two buns. She hurried down the path to the woods.


Why did Mendoza make the computer write those stories? He explains:

This work all began when a well-known scientist joined our physics department. He had spent several years away from academic life and was able to take a long cool look at academic procedures. He soon formed the theory that students never learned any ideas; all they learned was a vocabulary of okay words which they strung together in arbitrary order, relying on the fact that an examiner pressed for time would not actually read what they had written but would scan down the pages looking for these words. I set out to test his hypothesis.

I began by writing "Little Grey Rabbit" stories. I tested these stories out on my very small children; but after some minutes they grew irritable, because nothing actually happened. This shows that even small children of three can measure entropy.

Then I altered the vocabulary and grammar — making the sentences all very dead — to imitate the style of physics textbooks. The endpoint came when a colleague at another university secretly sent me an exam a week before it was given to the students. I wrote vocabularies and copied down what the computer emitted. Using a false name, I slipped my paper in among the genuine ones. Unfortunately, it was marked by a very conscientious man, who eventually stormed into the Director’s office shouting, "Who the hell is this man — why did we ever admit him?" So perhaps my colleague’s hypothesis was wrong, and students are a little better than we think.

Here’s one of the computer’s answers:

In electricity, the unit of resistance is defined by electrolysis; and the unit of charge, which was fixed at the Cavendish lab in Rayleigh’s classic experiments, was measured at the Cavendish lab. Theoretically, the absolute ohm is defined in a self-consistent way. The unit of resistance, which was determined with a coil spinning in a field, was fixed at the Cavendish lab; and this, by definition, is expressed in conceptual experiments. Theoretically the absolute ohm, which was redetermined using combined e.m.u. and e.s.u., is expressed by the intensity at the center of a coil.

Here’s another of the computer’s answers:

In this country, Soddy considered Planck’s hypothesis from a new angle. Einstein 50 years ago asserted quantisation.

At a photocathode, electrons which undergo collisions in the Compton effect as energy packets or quanta are emitted at definite angles; nevertheless, particles in a photocell produce photoelectrons of energy hv=E0. Photons in vacuo transmute into lower frequencies, and light quanta in the Compton effect emit emission currents.

Particles emit current proportional to energy; electrons in vacuo interact with loss of surface energy (work function); nevertheless, particles which are emitted in a photocell with conservation experimentally are conserved with energy hv. The former, at a metal surface, undergo collisions with emission of current; and at a metal surface, electrons produce emission currents.

Einstein assumed the gas of quantum particles; but quite recently Rayleigh, who quite recently solved the problem in an old-fashioned way, considered radiation classically. Planck, who this century assumed the A and B coefficients, explained the gas of quantum particles but before Sommerfield; Rayleigh, who quite recently was puzzled on Boltzmann statistics, tackled the problem with disastrous results.

Planck, who assumed the gas of quantum particles in 1905, this century considered the ultraviolet catastrophe; but quite recently Jeans, who tackled the problem in an old-fashioned way, was puzzled with disastrous results.

Black body radiation that exerts thermodynamic forces in an engine is equivalent to a relativistic system. Out of a black body, a photon that is equivalent to (out of a black body) an assembly of photons is assumed to be a non-conservative system; at the same time, thermodynamically, black body radiation that in a piston is assumed to be a relativistic system exerts quantised forces.

The radiation gas that obeys Wien’s displacement law is considered as a system of energy levels. Quantally, a quantum particle exerts a Doppler-dependent pressure, although this produces equilibrium transition probabilities.

Black body radiation in an engine produces equilibrium transition probabilities.

Aerospace (1968)

In 1968, Raymond Deffrey programmed the computer to write fake reports about the aerospace industry. Shortly afterwards, I improved the program. The improved program contains these lists:

Introductory phrases







for example

in addition

in particular

to some extent

in this regard

on the other hand

for the most part

as a resultant implication

in view of system operation

in respect to specific goals

based on system engineering concepts

utilizing the established hypotheses

based on integral subsystem considerations

considering the postulated interrelationships

Noun phrases

the structural design

the sophisticated hardware

the total system rationale

any discrete configuration made

the fully integrated test program

any associated supporting element

the product configuration baseline

the independent function principle

the preliminary qualification limit

the subsystem compatibility testing

the greater flight-worthiness concept

a constant flow of effective information

the characterization of specific criteria

the anticipated third-generation equipment

initiation of critical subsystem development

the evolution of specifications over a given time

the philosophy of commonality and standardization

the incorporation of additional mission constraints

a consideration of system and/or subsystem technologies

a large portion of the interface coordination communication

Verb phrases

adds explicit performance limits to

effects a significant implementation to

adds overriding performance constraints to

presents extremely interesting challenges to

is further compounded, when taking into account

must utilize and be functionally interwoven with

requires considerable systems analysis to arrive at

necessitates that urgent consideration be applied to

maximizes the probability of success and minimizes time for

recognizes the importance of other systems and necessity for

To produce a typical sentence, the computer prints an introductory phrase, then a noun phrase, then a verb phrase, then a noun phrase. The phrases are chosen randomly.

Each paragraph consists of six such sentences. The computer isn’t allowed to use the same phrase twice within a paragraph. The introductory phrase is omitted from the first sentence of the first paragraph, the second sentence of the second paragraph, etc.; so the report can’t begin with the word furthermore, and the style varies.
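The whole generator fits in a page of Python. This sketch abridges the three phrase lists and simplifies the rule about which sentence omits its introductory phrase (the real program rotated that position from paragraph to paragraph):

```python
import random

# Abridged phrase lists from above.
INTROS = ["for example", "in addition", "in particular", "in this regard",
          "on the other hand", "for the most part",
          "in view of system operation", "utilizing the established hypotheses"]
NOUNS = ["the structural design", "the sophisticated hardware",
         "the total system rationale", "any discrete configuration made",
         "the fully integrated test program", "any associated supporting element",
         "the product configuration baseline", "the independent function principle",
         "the preliminary qualification limit", "the subsystem compatibility testing",
         "the greater flight-worthiness concept",
         "a constant flow of effective information",
         "the characterization of specific criteria",
         "the anticipated third-generation equipment"]
VERBS = ["adds explicit performance limits to",
         "effects a significant implementation to",
         "adds overriding performance constraints to",
         "presents extremely interesting challenges to",
         "must utilize and be functionally interwoven with",
         "requires considerable systems analysis to arrive at",
         "necessitates that urgent consideration be applied to",
         "maximizes the probability of success and minimizes time for"]

def paragraph(rng, first=False):
    """Six sentences of intro + noun phrase + verb phrase + noun phrase,
    with no phrase reused inside the paragraph."""
    used = set()
    def pick(pool):
        phrase = rng.choice([p for p in pool if p not in used])
        used.add(phrase)
        return phrase
    sentences = []
    for i in range(6):
        body = f"{pick(NOUNS)} {pick(VERBS)} {pick(NOUNS)}"
        if first and i == 0:        # the report's first sentence has no intro
            s = body
        else:
            s = f"{pick(INTROS)}, {body}"
        sentences.append(s[0].upper() + s[1:] + ".")
    return " ".join(sentences)
```

Run it a few times and you get paragraphs indistinguishable from the sample report below.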

Here’s the beginning of one such report:

The Economic Considerations of the Aerospace Industry

A large portion of the interface coordination communication necessitates that urgent consideration be applied to the product configuration baseline. For example, the fully integrated test program adds explicit performance limits to the independent function principle. Moreover, the sophisticated hardware presents extremely interesting challenges to the philosophy of commonality and standardization. In view of system operation, a constant flow of effective information must utilize and be functionally interwoven with the preliminary qualification limit. In addition, any discrete configuration made adds overriding performance constraints to any associated supporting element. Thus, the anticipated third-generation equipment maximizes the probability of success and minimizes time for the total system rationale.

Me-Books (1972)

In 1972, Freeman Gosden Jr. started the Me-Books Publishing Company. It published books for kids. But if you bought a Me-Book for your child, you wouldn’t see in it the traditional names "Dick, Jane, and Sally"; instead, you’d see the name of your own child. To order the book, you had to tell the company the names of all your children, and their friends, and pets. Their names appeared in the story.

The story was printed beautifully, in a 32-page hard-covered book with pictures in color. It cost just $3.95.

You could choose from four stories: "My Friendly Giraffe", "My Jungle Holiday", "My Birthday Land Adventure", and "My Special Christmas".

For example, if you lived on Jottings Drive, and your daughter’s name was Shea, and her friend’s name was Douglas, the story "My Friendly Giraffe" included paragraphs such as this:

One morning Shea was playing with Douglas in front of her home. When she looked up, what do you think she saw walking down the middle of Jottings Drive? You guessed it. A giraffe!

Ted Nelson, author of Computer Lib, played a trick. He ordered a copy of "My Friendly Giraffe", but pretended that his child’s name was "Tricky Dick Nixon" who lived on "Pennsylvania Ave." in "Washington". Sure enough, the company sent him "My Friendly Giraffe: A Me-Book for Tricky Dick". Here are some excerpts:

Once upon a time, in a place called Washington, there lived a little boy named Tricky Dick Nixon. Now, Tricky Dick wasn’t just an ordinary little boy. He had adventures that other little boys and girls just dream of. This is the story of one of his adventures. It’s the story of the day that Tricky Dick met a giraffe.…

As the giraffe came closer and closer, Tricky Dick started to wonder how in the world he was going to look him in the eye.…

Tricky Dick knew there were no jungles in Washington. Especially on Pennsylvania Ave. But Tricky Dick wasn’t even a little bit worried. First, because he was a very brave little boy. And second, because he knew that his friend, the giraffe, would never take him anyplace bad.…

Tricky Dick was home. Back in Washington. Back on Pennsylvania Ave. And with a story to tell his friends, that they wouldn’t have believed if they hadn’t seen Tricky Dick riding off on the giraffe’s back. Tricky Dick would long be a hero to those who had seen him that day.…

There would be many other exciting adventures for Tricky Dick and his friends. And maybe, just maybe, if you’re a very good boy, someday we’ll tell you about those, too.

Me-Books were commercially successful: many thousands of them were sold. Teachers liked the books, because the books encouraged students to read: students enjoyed reading books that contained their own names.

Pornography (1973)

I programmed the computer to turn words into sentences. For example, Judith Ann Abrams fed the computer this list of words:

Names: Barbara, Buster, Clyde, Colin, Debbie, Ed Shanahan, Jeff, Keith, Kerri, Liz, Miles, Nicholson 700, Norwalk

Other nouns: bastard, bikini, bitch, boat, cigarette, creep, dress, ethanol, eyelid, fly, gambling, jerk, knee, kneecap, lip, mouth, navel, piece, pornography, shirt, stud, tongue, virgin, whiskey sour, whore

Transitive verbs: bite, caress, castrate, climb, enjoy, kiss, knife, lick, pick, pull, rip, screw, shake, take off, teach, undo

Other verbs: disapprove, fall down, fly, hesitate, sing, sleep, sleepwalk, sneeze, vanish, walk

Adjectives: bitchy, creepy, dead, enthusiastic, evil, gaudy, green, gross, horny, insane, juicy, mixed-up, monstrous, multicolored, neurotic, obscene, paranoid, prickly, profane, provocative, repulsive, sensuous, sexual, sexy, shiny

Exclamations: rats, such language, ugh

A transitive verb is an action you can do to an object. For example, since you can bite and caress an object, bite and caress are transitive verbs.

Here’s what the computer printed:

A Sexy, Profane Creep’s Tongue

by Judith Ann Abrams and the computer

The lip is tasty.


The mouth is enthusiastic. Does the gaudy bastard sleepwalk?

The tongue’s whore caresses the bikini. When Nicholson 700 is screwing the creepy jerk over a piece, Jeff is teaching the gambling. Does paranoic Liz hesitate? Why does the tongue kiss the creep stimulatingly on horny Buster?

Undo! Pull the fly repulsively! A multicolored, enthusiastic cigarette is neurotic. Is a multicolored, weird dress’s navel under the strange bikini? Why is the obscene dress’s cigarette Colin’s ethanol? Rats! Why is the dead, insane, tasty, shiny, knifing creep’s bitch unsettled and strange?

Clyde is over Ed Shanahan. Kissing the bastard, a paranoid ethanol is unsettled. Why is the prickly kneecap’s mouth on the creepy piece? Teaching Norwalk is sneezing paranoidly and grossly under provocative, shiny Debbie. As a sexy, walking tongue’s virgin takes off the bitchy cigarette over the profane stud then castrates enthusiastic Miles’s lip paranoidly, the dead, gaudy knee is stimulating and multicolored. Even though the dead, insane piece licks strangely and neuroticly in the flying, mixed-up eyelid’s knee, the dead, biting, obscene bikini is on a repulsive mouth’s gambling.

The pornography is gaudy. Kerri sleepwalks. Why is the tongue sensuous? Buster is sick. Is Miles monstrous? Debbie is neurotic and paranoid, when a stimulating fly picks the navel’s jerk under Ed Shanahan. Why is the dress succulent? Hesitating, a kneecap sleeps and climbs the dead, bitchy ethanol. As insane Colin’s bastard falls down weirdly in a sensuous dress, green, unsettled Miles’s virgin is strange and sexual.

Is the creepy eyelid provocative? The gambling’s whisky sour teaches a navel.

Is the gambling evil? The bitch walks. Is the virgin profane? Why is the navel sick? Is Liz enthusiastic? Debbie enjoys the creep. Fly! Shaking, green Kerri pulls weird Colin’s fly on a sick navel, then vanishes over Norwalk.

Undo the virgin! While obscene Liz is juicy and sexual, profane, gaudy Jeff’s knee is under a succulent whore’s navel. Tear Keith’s lip bitchily and juicily on sick, weird, multicolored Barbara! Why is Buster insane? The shirt knifes the bikini. Colin shakes the bitch. The whiskey sour hesitates over the green jerk. When a tasty tongue’s ethanol walks, Kerri rips the boat and disapproves under enthusiastic Miles. Such language! Keith sings. Why is Buster bitchy?

Notice that the computer turned her adjectives into adverbs, by adding ly and making other changes. Gross became grossly, and juicy became juicily. Unfortunately, the computer’s method wasn’t perfect: the computer turned stimulating into stimulatingly (a non-existent word), and turned neurotic into neuroticly (instead of neurotically).

It conjugated her verbs. Screw became screwing, and bite became biting (the computer dropped the e). Lick became licks, and teach became teaches (the computer added the e after the ch).

It added ‘s to her nouns. Jeff became Jeff’s. Miles became Miles’s (it should have become Miles’).
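Those morphology rules can be reconstructed in a few lines. These are my guesses at the exact rules, chosen to reproduce both the successes and the flaws just described:

```python
def adverb(adjective):
    """Naive adjective-to-adverb rule: gets 'juicily' and 'grossly' right,
    but also produces the non-words 'stimulatingly' and 'neuroticly'."""
    if adjective.endswith("y"):
        return adjective[:-1] + "ily"      # juicy -> juicily
    return adjective + "ly"                # gross -> grossly

def third_person(verb):
    """Add s, or es after ch: lick -> licks, teach -> teaches."""
    if verb.endswith(("ch", "sh", "s", "x", "z")):
        return verb + "es"
    return verb + "s"

def present_participle(verb):
    """Drop a final e before ing: bite -> biting, screw -> screwing."""
    if verb.endswith("e"):
        return verb[:-1] + "ing"
    return verb + "ing"

def possessive(noun):
    """Add 's everywhere -- hence the mistake Miles's instead of Miles'."""
    return noun + "'s"
```

Notice the rules are purely mechanical string surgery, which is exactly why they stumble on exceptions like neurotically.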

For each sentence, the grammar is chosen randomly. The chance is 10% that the sentence will begin with an exclamation. If the sentence isn’t merely an exclamation, the chance is 18% that the sentence will be a question.

If it’s a question, there’s a 40% chance it will begin with the word why. There’s a 50% chance the main part of the question will have the form does… noun phrase… verb phrase, and a 50% chance it will have this form instead: is… noun phrase… complement.

To construct noun phrases, verb phrases, and complements from her words, the computer uses random numbers.

The program uses a special variable, called W. At the beginning of the composition, W is near zero; but it tends to increase as the composition progresses. It affects the complexity. When W is large, the chance is large that the computer will print adjectives, adverbs, subordinate clauses, and correlative clauses.
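Here’s a sketch of how those probabilities and W could fit together. The branch structure is my reconstruction; the source gives only the percentages, and I’ve let W set just the chance of a subordinate clause:

```python
import random

def sentence_form(w, rng):
    """Choose a sentence's grammatical form using the probabilities
    given in the text. w starts near zero and grows as the
    composition progresses, making complex sentences likelier."""
    if rng.random() < 0.10:
        return "exclamation"
    if rng.random() < 0.18:                          # not an exclamation
        prefix = "why-" if rng.random() < 0.40 else ""
        kind = "does" if rng.random() < 0.50 else "is"
        return prefix + kind + "-question"
    return "complex statement" if rng.random() < w else "statement"
```

With w near 0 you get terse sentences like "The lip is tasty."; as w climbs toward 1, nearly every statement sprouts clauses.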

This sentence was produced by a small W:

The lip is tasty.

This sentence was produced by a large W:

As a sexy, walking tongue’s virgin takes off the bitchy cigarette over the profane stud then castrates enthusiastic Miles’s lip paranoidly, the dead, gaudy knee is stimulating and multicolored.

Poetic images (1973)

One of my students, Toby D’Oench, made the computer create poetic images, such as these:


Silent mists

Billow in creations

Windmills for flames evolve into ethers

Merlin again


Frozen children

Quiver with leaves

Creations with leaves hover over thoughts

Gardens of verse


Lazy fragrances

Waft by ethers

Seas on fragrances billow in sorrow

Rusted pitchforks


Frozen sails

Slumber in fog

Hazes for sails waft by thoughts

Docks — yachts — luxuries of eras gone by

The program contains these lists:

Adjectives: fleeting, crimson, silent, sensate, pliant, gloomy, pallid, inky, frozen, lazy

Prepositions: of, on, under, above, below, in, with, by, for, through

Verbs: billow in, glitter with, flutter by, drift with, flow into, ponder about, waft by, quiver with, hover over, gleam like, wander through, slumber in, dart by, evolve into, sing to

Title… noun… ending

TO REMBRANDT… windmills… A simple brush


THE PROPHET… visions… Then a word

LISTERINE… breaths… Plastic society

NEWPORT… sails… Docks — yachts — luxuries of eras gone by

EXISTENCE… seas… In the beginning?

SUMMER IN WATTS… flames… Tar-street neon — and the night

TO GUINEVERE — LADY OF THE LAKE… mists… Merlin again

NOON IN CALCUTTA… hazes… Emaciated dark forms strewn like garbage

WEST HARBOR… fog… A solitary gull slices through

A NEW ENGLAND BARN… fragrances… Rusted pitchforks

A CHILD’S MICROSCOPE… creations… The wonderful amoeba

A GROUP PORTRAIT… bundles… Christmas

THE MILKY WAY… cosmos… A gooey mess

TOMBSTONE… sorrow… Rubbings

LIFE AT THE END OF A BRANCH… leaves… Swirling to the ground

SEASHELLS AND THINGS… waves… Dribble-dribble-dribble castle

A BEAVER POND… reeds… Thwack

MY MEMORY… children… Gardens of verse

EINSTEIN… thoughts… Somehow through this — an understanding of a superior order

To create a poetic image, the computer fills in this form:


Adjective Noun that goes with the title

Verb Noun

Noun Preposition Noun Verb Noun

Ending that goes with the title
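A Python sketch of that form-filling follows. It abridges the title list to three triples, and it assumes (my guess, consistent with the samples) that the middle lines draw their extra nouns from the same pool of title nouns:

```python
import random

ADJECTIVES = ["fleeting", "crimson", "silent", "sensate", "pliant",
              "gloomy", "pallid", "inky", "frozen", "lazy"]
PREPOSITIONS = ["of", "on", "under", "above", "below",
                "in", "with", "by", "for", "through"]
VERBS = ["billow in", "glitter with", "flutter by", "drift with",
         "flow into", "ponder about", "waft by", "quiver with",
         "hover over", "gleam like"]
# (title, its noun, its ending) -- three of the nineteen triples
TITLES = [("TO GUINEVERE - LADY OF THE LAKE", "mists", "Merlin again"),
          ("A NEW ENGLAND BARN", "fragrances", "Rusted pitchforks"),
          ("MY MEMORY", "children", "Gardens of verse")]

def image(rng):
    title, noun, ending = rng.choice(TITLES)
    nouns = [n for _, n, _ in TITLES]     # assumption: shared noun pool
    lines = [
        f"{rng.choice(ADJECTIVES).capitalize()} {noun}",
        f"{rng.choice(VERBS).capitalize()} {rng.choice(nouns)}",
        f"{rng.choice(nouns).capitalize()} {rng.choice(PREPOSITIONS)} "
        f"{rng.choice(nouns)} {rng.choice(VERBS)} {rng.choice(nouns)}",
        ending,                            # the ending that goes with the title
    ]
    return title, lines
```

Each call yields a titled four-line image in the same mold as the samples above.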

Curses (1978)

Tom Dwyer & Margot Critchfield made the computer curse you. Here are some of the computer’s curses:

May an enraged camel overwhelm your garage.

May an ancient philosopher lay an egg on your dill pickle.

May seven large chickens sing an operatic solo to your love letters.

To invent a curse, the computer fills in the blanks:

May _______ ___________ your ______.

subject verb phrase object

The computer uses these words randomly:

Subjects: an enraged camel, an ancient philosopher, a cocker spaniel, the Eiffel Tower, a cowardly moose, the silent majority, the last picture show, a furious trumpet player, Miss America, seven large chickens

Verb phrases: send a mash note to, get inspiration from, redecorate, become an obsession of, make a salt lick out of, buy an interest in, overwhelm, pour yogurt on, sing an operatic solo to, lay an egg on

Objects: mother-in-law, psychoanalyst, rumpus room, fern, garage, love letters, piggy bank, hamburger, dill pickle, Honda
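The fill-in-the-blanks scheme is simple enough to sketch whole (Dwyer & Critchfield wrote theirs in BASIC; this is the same idea in Python):

```python
import random

SUBJECTS = ["an enraged camel", "an ancient philosopher", "a cocker spaniel",
            "the Eiffel Tower", "a cowardly moose", "the silent majority",
            "the last picture show", "a furious trumpet player",
            "Miss America", "seven large chickens"]
VERB_PHRASES = ["send a mash note to", "get inspiration from", "redecorate",
                "become an obsession of", "make a salt lick out of",
                "buy an interest in", "overwhelm", "pour yogurt on",
                "sing an operatic solo to", "lay an egg on"]
OBJECTS = ["mother-in-law", "psychoanalyst", "rumpus room", "fern",
           "garage", "love letters", "piggy bank", "hamburger",
           "dill pickle", "Honda"]

def curse(rng):
    """Fill in: May <subject> <verb phrase> your <object>."""
    return (f"May {rng.choice(SUBJECTS)} {rng.choice(VERB_PHRASES)} "
            f"your {rng.choice(OBJECTS)}.")
```

Ten words in each blank gives a thousand possible curses.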

You can find that program on page 152 of their book, BASIC and the Personal Computer.

Analyze writing

The computer can analyze what humans write.

English poetry

Can the computer analyze English poetry? From 1957 to 1959 at Cornell University, Stephen Parrish made the computer alphabetize the words in Matthew Arnold’s poetry. Here’s an excerpt:

Each entry shows a line of verse, followed by the page in the book, the poem’s title, and the line number within the poem:

back with the conscious thrill of shame 181 Isolation Marg 19

conscious or not of the past 287 Rugby Chapel 45


the last spark of man's consciousness with words 429 Empedocles II 30

and keep us prisoners of our consciousness 439 Empedocles II 352


Peter his friend with light did consecrate 445 Westmin Abbey 50


which consecrates the ties of blood for these indeed 196 Frag Antigone 31


won consecration from time 281 Haworth Church 46

foreshown thee in thy consecration-hour 446 Westmin Abbey 75

To find out what Matthew Arnold said about love, just look up LOVE. Such an index is called a concordance.

That concordance was the first produced by a computer. Previously, all concordances of poetry were created by hand, using filing cards. For example, in 1870 a group of researchers began creating a concordance to Chaucer by hand. They started at the letter A; 45 years later, they were only up to the letter H!
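Building such an index is straightforward on any computer today. Here’s a minimal sketch, keyed by line number rather than by page and poem title:

```python
from collections import defaultdict

STOP_WORDS = {"a", "an", "and", "the", "of", "with"}  # trivial words to skip

def concordance(lines):
    """Map each non-trivial word to the (line number, full line) pairs
    in which it appears -- the same shape as the Arnold index above."""
    index = defaultdict(list)
    for number, line in enumerate(lines, start=1):
        for word in set(line.lower().replace(",", "").split()):
            if word not in STOP_WORDS:
                index[word].append((number, line))
    return dict(sorted(index.items()))   # alphabetical, as in Parrish's index
```

To find out what a poet said about love, just look up index["love"].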

Did the poet Shelley steal ideas from others? Joseph Raben, at Queens College, believed Shelley borrowed imagery from Milton. To prove it, in 1964 he made the computer produce concordances to Shelley’s Prometheus Unbound and Milton’s Paradise Lost and compare them. The computer found many similarities between Shelley and Milton.

What were Shakespeare’s favorite words? In 1971 at Münster University in Germany, Marvin Spevack fed the computer all the works of Shakespeare, and made it count how often each word occurs. Disregarding trivial words such as a and the, the computer discovered Shakespeare’s favorite word was love: he used it 2,271 times. Next come heart, death, man, life, and hand. He never used the word hero. In Macbeth, the word good occurs more often than any other adjective, noun, or adverb, and more often than most verbs.

By counting words, other researchers made the computer graph the rise and fall of themes in a novel.

American history

Who wrote the Federalist Papers? Historians knew some of the papers were by Alexander Hamilton and others by James Madison, but the authorship of the remaining papers was in dispute.

In 1964, Mosteller and Wallace made the computer compare the literary styles of the papers, by counting the frequency of words such as by, enough, from, to, upon, while, and whilst. It concluded that all the disputed papers were written by Madison, not Hamilton.

The statistical evidence was so strong that historians accept the computer’s finding as fact.
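The raw counting step behind that study is easy to sketch. (Mosteller and Wallace’s actual analysis was Bayesian inference over these rates across many papers, not a bare frequency table.)

```python
MARKERS = ("by", "enough", "from", "to", "upon", "while", "whilst")

def marker_frequencies(text, markers=MARKERS):
    """Rate per 1,000 words for each marker word. Hamilton favored
    'while' and used 'upon' often; Madison favored 'whilst' and
    rarely wrote 'upon' -- so these rates fingerprint the author."""
    words = text.lower().split()
    return {m: 1000 * words.count(m) / len(words) for m in markers}
```

Feed it a disputed paper and compare its profile against papers of known authorship.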

The Bible

Can the computer analyze the Bible? In 1951, Texas clergyman John Ellison made the computer compare 309 Greek manuscripts of the New Testament. Underneath each word of a standard text, the computer printed the variants found in other manuscripts. It classified the manuscripts according to their similarities.

In 1957, he published a concordance to the Revised Standard Bible, and a pair of other researchers (Tasman & Busa) indexed the Dead Sea Scrolls.

Did the apostle Paul really write all those marvelous letters attributed to him in the New Testament? Or were they actually written by somebody else?

In 1964, Scottish clergyman Andrew Morton used the computer to deduce that Paul didn’t write some of those letters.

All Morton did was count how often Paul used the Greek word kai in each sentence. Kai means and. Coming to a conclusion about Biblical authorship by counting just the word and might seem silly, but Morton said he analyzed 20 writers of ancient Greek and found each used kai with a constant frequency. In the "Pauline" letters, the frequency of kai varied a lot, implying some of them were not by Paul.
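The counting itself is trivial; Morton’s whole argument lay in interpreting the variation. A sketch (on transliterated text, for illustration):

```python
def kai_rates(sentences, word="kai"):
    """Occurrences of a marker word in each sentence. Morton's claim:
    for a single ancient Greek author this rate stays roughly constant,
    so large swings between letters suggest different authors."""
    return [s.lower().split().count(word) for s in sentences]

def mean(rates):
    return sum(rates) / len(rates)
```

Compare the mean rate (and its spread) letter by letter: letters whose rates differ sharply from Paul’s undisputed ones were, by Morton’s reasoning, written by someone else.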

Ellison distrusted Morton’s assumption that a man’s literary style must remain constant. He warned: if Morton’s method were applied to the Declaration of Independence and Thomas Jefferson’s letters to his wife, the computer might conclude that either Jefferson didn’t write the Declaration of Independence or another man was writing love letters to Mrs. Jefferson. In 1965, to prove his point, he applied Morton’s method to two of Morton’s own articles on the subject: the computer concluded that Morton could not be the author of both!

Detect forgeries


IBM programmed the computer to detect a forged signature — even if the signature looks correct to the naked eye.

To use the IBM forgery-detection system, write your signature by using IBM’s special pen, attached to the computer. As you write, the computer notices how hard you press the pen against the paper and how fast you move the pen.

If somebody else tries to pretend he’s you, he must sit down at the machine and try to duplicate your signature. If he presses the pen hardest at different points of the signature, or if he accelerates the pen’s motion at different points, the computer says he’s a fake.

The system works well, because the average crook trying to forge your signature will hesitate at the hard parts. His hesitation affects the pen’s pressure and acceleration, which tell the computer he’s faking.

IBM developed the system in 1979 but didn’t start selling it until many years later. Now IBM sells an improved version. Remember: the system works only on signatures written with IBM’s pen.
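Here’s a toy version of that comparison. The feature choice and tolerance are my inventions for illustration; IBM’s actual features and thresholds aren’t given in this account:

```python
def looks_forged(reference, attempt, tolerance=0.2):
    """Compare two equal-length streams of (pressure, speed) samples
    taken as the pen moves. A large mismatch at any point -- pressing
    hardest, or accelerating, in the wrong place -- flags a fake."""
    for (p1, v1), (p2, v2) in zip(reference, attempt):
        if abs(p1 - p2) > tolerance * max(p1, 1e-9):
            return True          # pressed the pen hardest at a different point
        if abs(v1 - v2) > tolerance * max(v1, 1e-9):
            return True          # accelerated the pen at a different point
    return False
```

A forger who hesitates at the hard parts dents the speed stream, even if the finished signature looks perfect.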

Translate Russian

Soon after computers were invented, programmers tried to make them translate Russian into English. They chose Russian instead of Spanish, for three reasons:

1. Few humans could translate Russian. Spanish translators were a dime a dozen.

2. Computer experts love hard problems. Russian is harder than Spanish.

3. Most computers were owned by the Department of Defense, which was very interested in Russia.

Early attempts

In 1954, IBM wrote a program that translated Russian sentences such as:

Gasoline is prepared by chemical methods from crude oil. The price of crude oil is determined by the market. The quality of the crude oil is determined by the calorie content.

Unfortunately, most Russian sentences are not so simple. During the 1960’s, the end of a Russian paper on space biology was fed into an advanced program written by Computer Concepts, Inc. Here’s the translation that came out:

Thus, the examination of some from fundamental RADIOBIOLOGICESKIX problems shows, that in this a field still very much NEREWENNYZ questions. This is clear, since cosmic RADIOBIOLOGI4 is very young RAZDELOM young science efforts of scientific different specialties of the different countries of the world successful PRODOLJENY will be expanded there are.

The computer couldn’t translate the words in capital letters and was stumped by Russian grammar.

The competing program, written by the Air Force, translated the same passage a little better:

Thus, consideration of from basic radio-biological problems shows that in a given region still very many unsolved questions. This and intelligibly, since space radiobiology is very young division of young science — space biology. However, is base to trust that jointly scientists of different specialties of various countries of world/peace radiobiological investigations in outer space will be successfully continued and expanded.

In 1966, a special committee of the National Academy of Sciences concluded that the experience of computer translation was "uniformly discouraging" and that hiring a human translator was cheaper than doing the two-step process of computer translation followed by human editing.

During the last 20 years, computer prices have fallen, and so has the number of Americans who know Russian; even so, the computer’s usefulness is still in doubt. Today, most translations are still done by humans, who use computers to help do the word processing and to search through a dictionary and thesaurus.

Famous errors

If you program the computer to translate an English sentence into Russian, and then the Russian back to English, will you get back the same English sentence you started with?

One programmer tried, "The spirit is willing, but the flesh is weak." The computer translated it into Russian, then back into English, and printed, "The booze is strong, but the meat is rotten."

Another programmer tried, "Out of sight, out of mind." The computer printed, "Blind idiot."

At an engineering conference, a computer was translating scientific papers into English, when it suddenly started talking about "water sheep". Everyone was confused. Finally they figured it out: the computer meant hydraulic rams.

Xerox’s amazing translation machine

In Moscow during the 1960’s, American companies were showing off their products, but none of the Russians were interested in Xerox’s photocopiers — until some Xerox employees put on an amazing demonstration. They "photocopied" some English writing, and — presto! — a beautiful Russian translation of it came out of the machine! The machine was acting as a translator! And the translation was flawless, even though the English text was complex!

The Russians, very excited, ordered hundreds of the amazing translation machine.

But before shipping the machines, the Xerox guys confessed it was just a gag: they had sneaked the Russian version into the machine before beginning the demonstration.

What if Americans had the same sense of humor about nuclear war? "Hello, Gorbachev? This is George Bush, on the hot line. We just fired some nuclear missiles. They’re heading straight for Moscow. Ha, ha! Just kidding."