Transcript: Sean Carroll One of the interesting things was, of course, what you do when you’re out there on the water and there’s another boat that is on a collision course with you, right? Typically, you don’t have direct communication with the other boat. You’re not on the radio. You can’t just say, hey, I’m going to do this. You need to have some rules about how to behave in such a way that the two boats don’t hit each other, okay? And there are such rules. You know, if you’re literally coming head-on, then you’re supposed to turn to the right. You’re supposed to change speed and direction in a decisive way so the other boat can read your implicit boat language, I guess. The point is it works very well, but the reason it works is not only because everyone, you know, the pilots of both boats know the same rules, but because they know that each other knows the rules, right? So if I’m supposed to veer my boat to the right, that works because both boaters know that they’re going to veer to the right and they know the other one is going to veer to the right. So there’s a coordination between them and everyone is perfectly safe. This is an example of what philosophers and game theorists call common knowledge. So common knowledge, as we’ll talk about in the podcast, is a slightly misleading term. It doesn’t just mean knowledge that lots of people have. It means knowledge that lots of people have and they all know each other has. So there’s sort of an infinite regress. I know that you know it and you know that I know you know it and I know that you know that I know you know it, etc., etc. (Time 0:02:07)
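The infinite regress Carroll describes can be made concrete with a toy sketch. This is not from the podcast; the function name and the two-agent setup are hypothetical illustrations of how each extra level of "knows that" nests one statement inside another.

```python
# Toy illustration of the common-knowledge regress between two agents, A and B.
# depth=1 is plain knowledge; higher depths are the nested "I know that you
# know that I know..." levels. Common knowledge requires every depth at once.

def knows(agent, fact, depth):
    """Build the sentence 'agent knows that (other knows that ...) fact',
    alternating between the two agents, nested to the given depth."""
    other = {"A": "B", "B": "A"}
    chain = []
    current = agent
    for _ in range(depth):
        chain.append(current)
        current = other[current]
    # Wrap the fact from the innermost level outward.
    sentence = fact
    for who in reversed(chain):
        sentence = f"{who} knows that {sentence}"
    return sentence

for d in range(1, 4):
    print(knows("A", "the turn-right rule holds", d))
# Prints:
# A knows that the turn-right rule holds
# A knows that B knows that the turn-right rule holds
# A knows that B knows that A knows that the turn-right rule holds
```

Mutual knowledge stops at the first line; the two boaters coordinate safely only because, in principle, every line in this unbounded list holds.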
Transcript: Steven Pinker One of the basic facts about language, known in linguistics for many decades, is that even after we’ve worked out what all the rules of grammar are, what all the meanings of words are, and there could be an algorithm that could deduce the meaning of a sentence from the meaning of its parts and how they’re arranged according to these grammatical algorithms, in practice, people don’t mean what they say. They beat around the bush. They use euphemism. They use innuendo. If you could pass the salt, that would be awesome. The meaning of that is not, if you could pass the salt, that would be awesome. The meaning is, give me the salt. Or, nice store you got there, it would be a real shame if something happened to it. Do you want to come up and see my etchings? Gee, officer, is there some way we could settle this ticket here without going to court and doing all that paperwork? We’re counting on you to show leadership in our campaign for the future. I don’t know, you’ve probably heard that in fundraising dinners. So all of these examples. One of the reasons it took so long to have AI understand language is that if you simply give it the algorithms for figuring out who did what to whom based on the rules of grammar and the meanings of words, it will misjudge people’s intentions. If you say to a chatbot, can you tell me how to get to Harvard Square from here? Literally, it would say, yes, I can tell you how to get to Harvard Square from here, but that’s not what the user wants. The user wants it to just give the answer. (Time 0:07:19)
Transcript: Steven Pinker I have a chapter in the book on that very topic called Reading the Mind of a Mind Reader. And as I hinted at earlier in our conversation, most of the time the common knowledge is generated by a conspicuous or self-evident event that happens in a public place, where you not only see it but you see everyone else seeing it and they can see you seeing it, or something that’s blurted out within earshot of everyone else. Something obvious and conspicuous: that’s the typical route to common knowledge. We can, in some circumstances, engage in the process that I call recursive mentalizing, where to mentalize means to get inside someone’s head. To recursively mentalize means to get inside the head of someone who’s trying to get inside your head or someone else’s head. So sometimes you think about, oh my goodness, he’s probably thinking that he’s probably thinking, carry that to the limit, and we’ve got common knowledge. (Time 0:17:23)
Transcript: Steven Pinker So an example would be, say, a rumor that a bank might be in financial trouble. And so you think, well, gee, if I had reason to think that, probably other people do, and they probably are thinking that other people do, and they’re going to withdraw their money because they’re afraid that other people will withdraw their money, if only out of fear that still other people will withdraw their money. I better withdraw my money while there’s still money to withdraw, because the bank can’t cover the deposits of everyone all at the same time. And so you get a bank run. And the bank run didn’t begin with a conspicuous signal that the bank is experiencing a run or that the bank is in trouble. It comes from an interplay between some bit of news that leaks out and what you then start to extrapolate about what other people might think. (Time 0:18:32)
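The cascade Pinker describes can be sketched as a toy threshold model. This is not from his book; it is a hedged illustration in the spirit of threshold models of collective behavior, where each depositor withdraws once enough others have, and a single rumor-spooked depositor can tip everyone.

```python
# Toy bank-run cascade: depositor i withdraws once the fraction of
# withdrawers reaches thresholds[i]. A small "seed" of worried depositors
# can snowball into a full run, with no conspicuous public signal.

def bank_run(thresholds, seed_worried):
    """Return the set of depositors withdrawing once the cascade settles."""
    n = len(thresholds)
    withdrawing = set(seed_worried)
    changed = True
    while changed:
        changed = False
        frac = len(withdrawing) / n  # fraction withdrawing at this round
        for i in range(n):
            if i not in withdrawing and frac >= thresholds[i]:
                withdrawing.add(i)
                changed = True
    return withdrawing

# Ten depositors with evenly spread thresholds 0.0, 0.1, ..., 0.9.
# One spooked depositor triggers a chain reaction that empties the bank.
thresholds = [i / 10 for i in range(10)]
print(len(bank_run(thresholds, seed_worried=[0])))  # prints 10
```

With a less jumpy population (say, everyone requiring half the bank to move first), the same single seed goes nowhere, which is the point: the run depends on what each depositor extrapolates about the others, not on the bank’s actual condition.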
Transcript: Steven Pinker During COVID, when people hoarded toilet paper, because they thought there’d be a shortage of toilet paper, which they then caused by hoarding the toilet paper, even though there hadn’t been a shortage in the first place. It’s another case of common expectation, where there we really do engage in recursive mentalizing. No one ever said, go out and buy toilet paper, it’s in short supply. People just had to think in their mind’s eye of other people grabbing toilet paper because they were worried about it. And then that snowballed into the common knowledge that there’s a shortage. (Time 0:21:33)
Transcript: Steven Pinker Yes. So that doesn’t literally involve common knowledge, but it does involve recursive mentalizing, that is, thinking about what other people think. So as I recall the cartoon, the caption is three logicians in a bar. And the waitress comes over and says, does everyone want beer? And the first one says, I don’t know. The second one says, I don’t know. The third one says, yes. So that’s a logic puzzle. And you can figure it out. Everyone wants beer is true if each one of them wants beer; it would be false if any one of them didn’t want beer. So if the first one says, I don’t know, she must want beer, because if she didn’t want beer, then she would deduce that everyone wants beer is false. The fact that she didn’t say it’s false meant that she did want beer. The second one goes through the same logic. The third one, knowing that the first one didn’t know and the second one didn’t know, knows that both of them want beer, so since she wants beer herself, she can say yes. (Time 0:25:58)
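The reasoning in the cartoon can be checked by brute force. The sketch below is a hypothetical illustration, not from the podcast: it enumerates all eight ways the three logicians could want or not want beer and keeps only the profiles consistent with the answers "I don’t know", "I don’t know", "Yes".

```python
# Brute-force check of the three-logicians puzzle. Each logician answers
# "No" if she can deduce that not everyone wants beer (i.e., she herself
# doesn't), "I don't know" if she wants beer but can't speak for the rest,
# and "Yes" only when she knows everyone wants beer.

from itertools import product

def consistent_profiles():
    results = []
    for wants in product([True, False], repeat=3):
        # Logician 1: "I don't know" rules out her not wanting beer,
        # since then she'd know the answer is "No".
        if not wants[0]:
            continue
        # Logician 2: same logic.
        if not wants[1]:
            continue
        # Logician 3: she now knows the first two want beer, so she can
        # answer "Yes" only if she wants beer too.
        if not wants[2]:
            continue
        results.append(wants)
    return results

print(consistent_profiles())  # prints [(True, True, True)]
```

Only one profile survives: all three want beer, which is exactly the deduction the third logician performs in her head.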
Transcript: Steven Pinker A linguist, George Lakoff, and a philosopher, Mark Johnson, in a famous little book they published 45 years ago called Metaphors We Live By, noted that language contains lots of metaphors that we don’t even realize are metaphors, which allow us to talk about abstract concepts in concrete terms. And one of the metaphors they discuss is that argument is like war. I demolished his position. He tried to defend it, but I found the weak spot. And we use the language of war in talking about arguments. And just as a kind of whimsical thought experiment, Lakoff and Johnson say, well, do we have to think about argument as war? Why don’t we think of it as like a dance? And as it happens, the sequence of reaching agreement in Aumann’s construction is in some ways more like a dance than like a battle. That is, it’s a random walk, and so you can lurch and weave and bob all over the place before arriving at agreement. So this esoteric mathematical theorem might actually have some insight. (Time 0:38:21)