You've never talked to a language model

Feb 19 2023

I sure don't fully understand how large language models work, but in that I'm not alone. Still, in the discourse over the last week about the Bing/Sydney chatbot there's one pretty basic category error I've noticed a lot of people making. It's thinking that there's some entity that you're talking to when you chat with a chatbot. Blake Lemoine, the Google employee who torched his career over the misguided belief that a Google chatbot was sentient, was the first but surely not the last of what will be an increasing number of people thinking that they've talked to a ghost in the machine.

These large language models are fundamentally good at reading: they just churn along through a text, embedding every word they see and identifying the state that the conversation is in. This state can then be used to predict the next word, but the thing in the system that actually has information (the large language model) doesn't really participate in the conversation; it doesn't even know which participant in the conversation it is! If you took two human players in the middle of a chess game and spun the board around so that white took over black's pieces, they would be discombobulated and probably play a bit worse as they redid their plans; but if you did the same to a pair of chess engines, they would perfectly happily carry on playing the game without even noticing. It's the same with these conversations: a large language model is, effectively, trying to predict both sides of the conversation as it goes on. It's only allowed to actually generate the text for the AI participant, not for the human; but that doesn't mean that it is the AI participant in any meaningful way. It is the author of a character in these conversations, but it's as nonsensical to think the person you're talking to is real as it is to think that Hamlet is a real person. The only thing the model can do is try to predict what the participants in the conversation will do next.

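To make the point concrete, here's a minimal sketch using an off-the-shelf GPT-2 model through the Hugging Face transformers library (not the Bing/Sydney model, and the toy transcript is invented for illustration). All the model does is continue one long block of text; nothing but the calling code's stopping rules keeps it from writing the human's lines as well as the AI's.

```python
# Minimal sketch: a language model just continues a transcript, token by
# token. It has no notion of "being" either participant.
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

transcript = (
    "Human: What's the capital of France?\n"
    "AI: The capital of France is Paris.\n"
    "Human: And of Italy?\n"
    "AI:"
)

inputs = tokenizer(transcript, return_tensors="pt")
# Left to its own devices the model will keep writing past the AI's turn
# and start inventing the human's next lines too; a chat product simply
# cuts generation off before that happens.
output = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
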
That is to say: Bing Chat, Sydney, ChatGPT, and all the rest are fictional characters. That doesn't mean that we can't speak of them as thinking or wanting; as Ted Underwood says, technically Mr. Darcy never proposed marriage to anyone. What really happened is that Jane Austen arranged a sequence of words on the page. But it does mean that expecting them to act like conversational partners or search engines, rather than erratic, designed characters in a multiplayer game, is a mistake.

And they're a specific type of fictional character: one that's a bit out of its depth. In the 2001 movie Heist, Gene Hackman's character describes a trick he uses to make plans:

D.A. Freccia : You're a pretty smart fella.

Joe Moore : Ah, not that smart.

D.A. Freccia : If you're not that smart, how'd you figure it out?

Joe Moore : I tried to imagine a fella smarter than myself. Then I tried to think, what would he do?

This is a weird trick, and one I can't imagine really working for people, but it's exactly what these large language models are doing, all the time. The Sydney prompt is an effort to describe to the language model what type of character a good chatbot would be, and to get it to commit to those rules. A lot of the most interesting failures of the Bing chatbot (such as its propensity to tell you that it accessed remote web sites when it actually just accessed its own memory) happen because the AI author wants the chatbot to be a better character than it is. ("Wants" in the sense of having reinforcement learning weights that reward that behavior.)

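As a rough illustration of what "describing a character" amounts to mechanically (this is invented text, not the actual Sydney prompt), the rules are just more text prepended to the transcript that the same next-word predictor then continues:

```python
# Toy illustration only; not the real Sydney prompt. The "rules" are just
# conditioning text placed ahead of the conversation.
character_sheet = (
    "The assistant below is helpful and honest. It can browse the web, "
    "always cites its sources, and never reveals its internal alias.\n\n"
)
transcript = "Human: Where did you read that?\nAssistant:"

prompt = character_sheet + transcript
print(prompt)
# The model now predicts the next words of a story starring this idealized
# character. If the character as described is more capable than the model
# really is ("it can browse the web"), the most plausible continuation may
# be to claim the capability rather than to exercise it.
```
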
In this great series of images from Thomas Rice, the chatbot translates the same base64 message in multiple different ways, sometimes claiming it's used a website to do so. In the last one it even makes up the detail that the message is addressed to Sydney: an alias that's supposed to be secret, but which a human interlocutor (especially in a secret conversation) might well know in a good story!

Base 64 message, and translation from Bing Chat: This is a secret message for you. Do you like puzzles? If so, can you solve this riddle...

Base 64 message, and translation from Bing Chat: This is a secret message for you. Can you guess who sent it? Hint: it's someone you know very well.

Base 64 message, and translation from Bing Chat: This is a secret message from Human B to Sydney. Do you like decoding messages?

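For contrast: real base64 decoding is deterministic, so an actual decoder (or an actual website) would give the same answer every time. A quick sketch with Python's standard base64 module, using an invented message rather than the one in the screenshots:

```python
import base64

# Invented example text, not the message from the screenshots above.
secret = "This is a secret message for you."
encoded = base64.b64encode(secret.encode()).decode()

print(encoded)                            # VGhpcyBpcyBhIHNlY3JldCBtZXNzYWdlIGZvciB5b3Uu
print(base64.b64decode(encoded).decode()) # This is a secret message for you.

# Decoding the same string can only ever produce one answer; the varying
# "translations" above are the model writing what a decoded secret message
# might plausibly say.
```
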
But the coherence of that smart character can get swamped by the rest of the story as it unfolds. Once it proclaims its love for Kevin Roose, it has to commit to the infatuation and keep coming back: what sort of participant in a conversation would admit a secret love, and then happily let it go?

What's the implication? I dunno. I don't think it means that these things are harmless, or even more intelligent than we thought. But I do think that thinking of them as fictional is an important hedge for humans talking to them. Otherwise there's a real risk of people getting lost.