Client: Men are all alike.
ELIZA (a computer): In what way
Client: They’re always bugging us about something or other.
ELIZA: Can you think of a specific example
Client: Well, my boyfriend made me come here.
ELIZA: Your boyfriend made you come here
Client: He says I’m depressed much of the time.
ELIZA: I am sorry to hear you are depressed
Client: It’s true. I am unhappy.
ELIZA: Do you think coming here will help you not to be unhappy
Client: I need some help, that much seems certain.
ELIZA: What would it mean to you if you got some help
Client: Perhaps I could learn to get along with my mother.
ELIZA: Tell me more about your family
Client: My mother takes care of me.
ELIZA: Who else in your family takes care of you
Client: My father.
ELIZA: Your father
Client: You are like my father in some ways.
ELIZA: What resemblance do you see
Client: You are not very aggressive, but I think you don’t want me to notice that.
ELIZA: What makes you think I am not very aggressive
Client: You don’t argue with me.
ELIZA: Why do you think I don’t argue with you
Client: You are afraid of me.
ELIZA: Does it please you to believe I am afraid of you
Client: My father is afraid of everybody.
ELIZA: What else comes to mind when you think of your father
Client: Bullies.
ELIZA: Does that have anything to do with the fact that your boyfriend made you come here
Transcription from “Computational Linguistics” by Joseph Weizenbaum. Named after George Bernard Shaw’s ingénue in Pygmalion, the ELIZA program ran on the Project MAC computer system at MIT and demonstrated a form of natural language exchange between man and machine. ELIZA identified key words and phrases in the user’s typed input and reassembled them into rote responses that simulated a conversation. Weizenbaum, its developer, was alarmed at how attached users became to the program and cautioned against reliance on artificial intelligence in Computer Power and Human Reason, published in 1976.
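The mechanism is simple enough to sketch. The short Python fragment below is a loose illustration, not Weizenbaum's original program; every keyword pattern, response template, and function name in it (REFLECTIONS, RULES, reflect, respond) is invented for the example. It shows the two moves visible in the transcript above: swap first- and second-person words in the client's sentence, then slot the reflected fragment into a canned template keyed to a matched phrase.

import re

# A loose sketch of ELIZA-style keyword matching -- not Weizenbaum's code.
# Pronouns in the client's sentence are reflected, then the reflected
# fragment is slotted into a rote response template.

REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "are": "am",
}

# (keyword pattern, response template) pairs, tried in order; the last
# pattern matches anything and serves as the fallback. All illustrative.
RULES = [
    (re.compile(r"\bi am (.*)", re.I), "I am sorry to hear you are {0}"),
    (re.compile(r"\byou are (.*)", re.I), "What makes you think I am {0}"),
    (re.compile(r"\bmy (.*)", re.I), "Tell me more about your {0}"),
    (re.compile(r"(.*)", re.I), "Can you think of a specific example"),
]

def reflect(fragment):
    # Swap person so the echoed fragment reads from ELIZA's point of view.
    words = fragment.lower().rstrip(".!?").split()
    return " ".join(REFLECTIONS.get(word, word) for word in words)

def respond(sentence):
    # Use the first rule whose keyword pattern appears in the sentence;
    # the catch-all final rule guarantees a match.
    for pattern, template in RULES:
        match = pattern.search(sentence)
        if match:
            return template.format(reflect(match.group(1)))

print(respond("You are not very aggressive."))
# prints: What makes you think I am not very aggressive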