To anthropomorphize means “to attribute a human form or personality to” (Webster’s, 1913). Never anthropomorphize computers. They hate it.
But where do you draw the line? Take TiePlumb, the computer-aided gauge for necktie straightness. Is it okay to say that TiePlumb can see your necktie on the PC camera, recognize it from previous sessions, and give an opinion on whether the tie needs adjustment? Is it okay to say that TiePlumb wants you to sit straight for the camera?
A computer-based tool can’t want anything, in the sense of being emotionally affected by desire. As a matter of fact, though, a computer-based tool can’t have an opinion or recognize a necktie either, because it has no consciousness. It merely reacts involuntarily to shifting currents of electricity, with no more intellectual involvement than a frog’s detached muscle that is zapped by an electrode in a high-school lab. And can it be said to see anything, if it treats the data from the camera as just a kazillion more numbers to crunch according to rote instructions?
A computer doesn’t mentally compute any more than a basketball mentally plays basketball. But we call it a computer anyway, and although it doesn’t mentally recognize, remember, or calculate data, we say it does because for our purposes it might as well be doing so.
What sounds wrong, in my opinion, is not language that attributes intellect to the computer but language that attributes emotions. However, the dividing line is vague. Anyone would accept the idea that a computer “waits” for input. Can it “expect” input? I would say so, though to expect does seem more intellectually conscious than to wait. As a computer program approaches timeout and displays increasingly emphatic requests, is it becoming “impatient” for input? I’d say that impatience implies too much human personality.
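The escalating requests described above can be sketched in code. This is a hypothetical illustration only: the function name and prompt wording are invented, and a real program would pair this with an actual input timeout.

```python
# Hypothetical sketch: a program "waits" for input and issues
# increasingly emphatic requests as timeout approaches. Whether the
# program is thereby "impatient" is left to the reader.
PROMPTS = [
    "Please enter your name.",
    "Please enter your name now.",
    "Enter your name immediately -- the session is about to time out!",
]

def prompt_for_attempt(attempt):
    """Return the prompt for a given retry attempt (0-based).

    Attempts beyond the list reuse the most emphatic message.
    """
    return PROMPTS[min(attempt, len(PROMPTS) - 1)]
```

Nothing here feels, of course; the escalation is just an index into a list of strings chosen by the writer.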
But what such terms imply can be surprisingly subjective. If we write that a message appears on the screen, is the word “appears” unsettlingly inappropriate, implying that a human or supernatural magician is at work? To me, not at all. And the 1913 Webster’s says that to appear is “to come or be in sight; to be in view; to become visible.” Nothing eerie about that. Yet few topics pop up more often on the technical writers’ mailing lists than how to avoid the supposedly magical-sounding “appear.”
Sometimes the intransitive use of “display” is suggested instead: “An error message displays,” for example. Webster’s of 1913 claims that Shakespeare used a similar meaning intransitively — “To make a display; to act as one making a show or demonstration” — but it doesn’t say where, and I’m far from sure he did. Anyway, the latest online Merriam-Webster calls that meaning obsolete and tells us that either the verb is transitive (which is to say that it has an object: you don’t just display, you display something) or else it means “to make a breeding display” as in “penguins displayed and copulated.” You don’t want penguimorphism in your technical writing.
People can be unreasonable when their gut feelings contradict the dictionary. I know. I’m one of those with unreasonable gut feelings about “whose.” The dictionary says that “whose” not only can mean belonging to the person mentioned earlier but also can mean belonging to the thing mentioned earlier. The 1913 Webster’s quotes Dryden: “The question whose solution I require.” The more modern Merriam-Webster quotes Joseph Wood Krutch: “the first poem whose publication he ever sanctioned.” To me, the word nonetheless recalls “who” too strongly. Merely as a personal preference, not as a matter of right and wrong, I try to avoid using “whose” except to mean someone’s. I wouldn’t say “the computer whose messages are friendly tends to be anthropomorphized by the user.” I’d say “if the computer displays friendly messages, the user tends to anthropomorphize it.” And if the user anthropomorphizes the computer from here to Sunday, that’s fine. The restraint is for the writer only.
Questions and comments are welcome.