By John Gruber
Science fiction writer Ted Chiang, in an interview with Madhumita Murgia for The Financial Times (Archive.is link):
Chiang’s main objection, a writerly one, is with the words we choose to describe all this. Anthropomorphic language such as “learn”, “understand”, “know” and personal pronouns such as “I” that AI engineers and journalists project on to chatbots such as ChatGPT create an illusion. This hasty shorthand pushes all of us, he says — even those intimately familiar with how these systems work — towards seeing sparks of sentience in AI tools, where there are none.
“There was an exchange on Twitter a while back where someone said, ‘What is artificial intelligence?’ And someone else said, ‘A poor choice of words in 1954’,” he says. “And, you know, they’re right. I think that if we had chosen a different phrase for it, back in the ’50s, we might have avoided a lot of the confusion that we’re having now.”
So if he had to invent a term, what would it be? His answer is instant: applied statistics.
My puerile mind is tempted to make a joke that tacking on “system” would make for a fun acronym, but I shan’t crack that joke, as I think Chiang makes a strong point here. What we have with these LLMs isn’t low-level intelligence but rather high-level applied statistics that creates the powerful illusion of low-level intelligence.
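To make the "applied statistics" framing concrete, here's a toy sketch of my own (not anything Chiang or the FT piece describes): a bigram model that picks each next word purely from observed frequencies can already spit out plausible-looking sentences with nothing you'd call understanding behind it. The corpus and numbers below are made up for illustration; real LLMs estimate vastly richer conditional distributions over tokens, but the basic move — predict the next token from statistics, then sample — is the same in spirit.

```python
import random
from collections import defaultdict

# Toy "applied statistics": count how often each word follows each other
# word in a tiny made-up corpus, then generate text by sampling from those
# frequencies. No knowledge, no understanding -- just counting and sampling.
corpus = (
    "the cat sat on the mat . the dog sat on the rug . "
    "the cat chased the dog . the dog chased the cat ."
).split()

counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_word(prev):
    """Sample the next word in proportion to how often it followed `prev`."""
    candidates = counts[prev]
    words = list(candidates)
    weights = [candidates[w] for w in words]
    return random.choices(words, weights=weights, k=1)[0]

def generate(start="the", length=12):
    out = [start]
    for _ in range(length):
        out.append(next_word(out[-1]))
    return " ".join(out)

print(generate())  # e.g. "the dog sat on the mat . the cat chased the dog ."
```

Scale that counting-and-sampling trick up by many orders of magnitude and the output starts to look like someone is home — which is precisely the illusion Chiang is warning about.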
See also: Chiang’s very short story “What’s Expected of Us”, referenced in the interview.
★ Sunday, 4 June 2023