Most people who bother with the matter at all would admit that the English language is in a bad way, but it is generally assumed that we cannot by conscious action do anything about it.
George Orwell, “Politics and the English Language”
As a retired educator (English teacher and later school principal) and now a writer, I care deeply about words: their use and abuse. As a reader, I trust you do too. I hope you, like me, are concerned about their abuse, specifically in discussions of technology.
Do you think I am overstating the case when I say that the anthropomorphic language we hear consistently used in discussions of technology is demeaning humanity, debasing our sense of ourselves and our place in the world? What is your reaction to talk of “intelligent machines” and “smart bombs”? Does talk about computers having “minds,” “communicating” with one another, “contracting viruses” and passing them on to other computers, or of computers someday becoming “smarter” than people prompt any visceral response in you? Or do you view the use of words like these to describe “the behavior” of machines as simply a quirk of modern-day speech not to be too concerned about? Well then…
How do you feel when you hear technophiles talking about our brains being “hardwired” and how we have to “program” or “deprogram” them? And what about young people: Do you think they are even conscious of how words traditionally used to describe human characteristics and behavior, now used to describe machine “behavior,” are redefining our view of machines and, more importantly, our view of ourselves?
In Technopoly: The Surrender of Culture to Technology, writer and teacher Neil Postman warns us that the computer and how we talk about it has influenced the way that people interpret the world. He says that the language we use to talk about technology has redefined human beings as “information processors” and nature as simply information to be processed. He forewarns us: “The fundamental metaphorical message of the computer, in short, is that we are machines—thinking machines, to be sure, but machines none the less.” [p.111]
Postman is not a hysterical alarmist, nor am I when I say that the way we have been abusing language when we talk about technology is devaluing our humanity. Here’s one example:
The human mind is not only limited in its storage and processing capacity, but it also has known bugs; it is easily misled, stubborn and even blind to the truth… Intelligent systems, built for computers and communications technology, will someday know more than an individual human being about what is going on in a complex enterprise involving millions of people. (italics mine) [Avron Barr, “AI: Cognition as Computation,” in Machlup, The Study of Information, p. 261]
That’s Avron Barr reducing human minds to nothing more than storage and processing devices, and not very reliable ones at that, while claiming that “intelligent machines” will someday “know” more than a human being and be able to distinguish “the truth” more reliably than a human being. I wonder how he would define knowing and truth.
John McCarthy, who coined the term artificial intelligence, goes even further in blurring the distinction between man and machine. He claims that even simple machines, like his thermostat, have beliefs. What beliefs, you may ask? To this he replies: “It may believe that the room is too hot, or that it is too cold, or that it is ok.”
[http://www-formal.stanford.edu/jmc/little/little.html]
Ray Kurzweil, an AI enthusiast and author of The Age of Intelligent Machines and The Age of Spiritual Machines, doesn’t think much of us as we are now. He believes that by connecting our minds to computers “We’re going to be funnier. We’re going to be sexier. We’re going to be better at expressing loving sentiment.” And if that isn’t enough, Kurzweil also believes, and wants us to believe, that in the future, by connecting our brains to computers, “we’re going to expand the brain’s neocortex and become more godlike.”
In On Becoming a Novelist, writer and teacher John Gardner has warned that “once one has made a strong psychological investment in a certain kind of language, one has trouble understanding that it distorts reality.”
Do you, like me, believe that we have an obligation to recognize how the use of anthropomorphic language in talking about our technology tools is distorting our sense of ourselves and of reality? If you do, let’s take a stand against language that purposely obscures the distinction between ourselves, our minds, and the operations of machinery. Let’s commit to a conscious effort to stop using words that describe human behavior to describe the workings of machinery, and insist that others do as well. The words we use are important; they frame our thoughts and our beliefs, and as Mark Twain reminds us: “The difference between the right word and the almost right word is the difference between lightning and a lightning bug.” Or in this case, the difference between a human being and a machine.