Theodore Sturgeon and AI
In 1957, the science fiction writer Theodore Sturgeon was challenged by a critic who said that "90% of science fiction is crud." Sturgeon did not dispute the critic's conjecture but expanded it, noting that "90% of science fiction is crud, but then, 90% of everything is crud." This thought is now widely referred to as "Sturgeon's Law".
Sturgeon wrote various non-cruddy things, including novels, short stories, a couple of Star Trek episodes, and the ghost-written Ellery Queen novel THE PLAYER ON THE OTHER SIDE. He was also friends with Kurt Vonnegut, who based a recurring character in his novels, an obscure science fiction writer named Kilgore Trout, on Sturgeon. Today, Kilgore Trout is better remembered than Theodore Sturgeon. So it goes.
Sturgeon's Law is still remembered and quoted and I think it contains a powerful caution for today's world, a world where the captains of industry assure us that AI will make everything so much better. I have my doubts.
An important thing to remember about AI is that it is Artificial Intelligence. We tend to gloss over the first word and believe the second one. We want to believe that the code has understanding and insight, but it is an illusion of intelligence. It is pattern recognition and feedback loops processing huge data sets. AI can do amazing things, like scan millions of mammograms and detect previously unrecognized correlations, literally saving lives through early cancer detection. But that is not intelligence.
Now the AI hypsters and true believers (and even the best AI can't distinguish between the two) will tell you that they are very close to achieving AGI (Artificial General Intelligence). AGI will know everything about everything because it has read everything, looked at everything, and found the patterns we puny humans have missed. It will no longer matter that it is artificial because it will be supremely intelligent. I still have my doubts.
Because 90% of everything is crud, these large general data sets contain a lot of crud. And now we have hit the point where much of what is in these data sets has been skimmed from the Internet, and as any child can tell you, the Internet is not exactly a great source of reliable information. But wait, it gets worse. The Internet is now being flooded with AI crud, the stuff that mimics good information. And that is being skimmed and fed back into the models. See the problem? We've built a crud concentrator.
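The feedback loop is easy to see in a toy simulation. The sketch below assumes made-up numbers (a starting crud fraction, a per-generation error rate, and how much model output gets reingested); none of them are measurements, but the direction of the arithmetic is the point: once output feeds back into the training pool, the crud fraction can only go up.

```python
# Toy model of the "crud concentrator": each generation, a model is
# trained on the current pool, emits output that reproduces the pool's
# crud plus some hallucinations of its own, and that output is skimmed
# back into the pool. All parameters are illustrative assumptions.

def next_crud_fraction(pool_crud, model_error=0.05, feedback=0.5):
    """Crud fraction of the pool after one train-emit-reingest cycle."""
    # The model's output mirrors the pool's crud and adds new errors
    # to the portion that would otherwise have been good.
    output_crud = min(1.0, pool_crud + model_error * (1.0 - pool_crud))
    # The new pool blends the old data with the reingested output.
    return (1.0 - feedback) * pool_crud + feedback * output_crud

pool = 0.10  # start optimistically: only 10% crud
for generation in range(10):
    print(f"generation {generation}: {pool:.1%} crud")
    pool = next_crud_fraction(pool)
```

Run it and the crud fraction climbs every generation, never falling, which is the concentrator in miniature.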
This isn't a theoretical future problem. It is already happening. AI companies that have been using AI to write more of their code are finding more and more examples of "hallucinations", the AI industry's clever rebranding of what we used to call errors, bugs, or lies. In several high-profile instances, next-generation AI code has had to be rolled back when it proved less reliable than the previous version. And AI is unfortunately quite good at writing code that is inscrutable to humans. So now we have huge code bases that neither computers nor humans understand. That doesn't seem like a secure foundation for the future. It is the kind of thing science fiction writers have been warning us about for decades. I think it is time for us to listen.