I was prompted by a tweet this morning to dig out my copy of Jacques Vallée’s book “The Network Revolution” (And/Or Press, 1982). This passage has haunted me since I first read it:
One lesson which I am not likely to forget took place at the RCA research labs in Princeton, where most of the discoveries in color television had occurred. I had been invited to give a talk on information retrieval. I ventured into English-language interrogation of data bases, and other applications of artificial intelligence used to converse with computers. A man with intense eyes and bright white hair took me aside. We sat on the benches in the lab next to the lecture hall.
He said, “There is a fundamental fallacy in artificial intelligence, and you’re falling into it.”
“In what respect?” I asked, with the feeling that this discussion was not going to conform to the usual polite exchange of generalities heard at most professional meetings.
“Artificial intelligence is trying to emulate nature; it wants to approximate what Man does.”
“What other inspiration is there?”
“Imitation of nature is bad engineering. For centuries inventors tried to fly by emulating birds, and they have killed themselves uselessly. If you want to make something that flies, flapping your wings is not the way to do it. You bolt a 400-horsepower engine to a barn door, that’s how you fly. You can look at birds forever and never discover this secret. You see, Mother Nature has never developed the Boeing 747. Why not? Because Nature didn’t need anything that would fly at 700 mph at 40,000 feet: how would such an animal feed itself?”
“What does that have to do with artificial intelligence?”
“Simply that it tried to approximate Man. If you take Man as a model and test of artificial intelligence, you’re making the same mistake as the old inventors flapping their wings. You don’t realize that Mother Nature has never needed an intelligent animal and accordingly, has never bothered to develop one.
“So when an intelligent entity is finally built, it will have evolved on principles different from those of Man’s mind, and its level of intelligence will certainly not be measured by the fact that it can beat some chess champion or appear to carry on a conversation in English.”
With his piercing eyes on me, I had a brief vision of what an intelligent machine would be. If Nature has never needed an intelligent animal and hasn’t evolved one, I kept wondering, then who are we? In our feeble attempts to handle the information we call life, and increase its quality, can we trust the creations of our dreams? Are we perhaps nothing more than the process through which another form of intelligence is itself evolving? And in the end, what measure of control do we really have on the technologies we create?