Soldiers and Scouts with Julia Galef
In this episode of the podcast, Brooke chats with Julia Galef - co-founder of the Center for Applied Rationality and host of the podcast 'Rationally Speaking'. They discuss Julia's book, 'The Scout Mindset', which looks at the underlying motivations that guide our beliefs and behaviors. Some of the things covered include…
- Scout versus soldier mindset - how they differ and why we rely on both, depending on the situation.
- The downsides of soldier mindset and why our tendency to defend our beliefs no matter what can get us into trouble.
- The benefits of adopting an evidence-based mindset and being open to things that challenge our beliefs, aka 'drawing the map in pencil'.
- Practical ways we can embrace a scout mindset in our personal and professional lives.
The conversation continues
Why do mindsets matter?
“Being smart, and clever, and knowledgeable is not sufficient for seeing the world as clearly as possible. You also need to have the right motivation guiding that intelligence and knowledge in the right direction.”
The opposite of soldier mindset: the scout
"It's reasoning that is genuinely directed at, to the best of your abilities, at figuring out what is actually true. And so, just like a scout, in contrast to the soldier, their role is not to attack or defend, but it's to go out, see what's really there and form as accurate a map of an issue or a situation as possible, including all of the uncertainties and unknowns."
How we often switch between the two
"I might be really good at being in scout mindset about, I don't know, my job, and then I might be much more likely to be a soldier in my personal relationships where I refuse to consider other people's perspectives or that I might have been wrong about something. Or some people might be really good at being a scout about science, but not about politics. Things like that. So we're all a mix of both."
A word of caution on anchoring our self-identities
"Try not to let too many things into your identity and try to be self-aware of the things that are part of your identity."
A good scout relies on good evidence!
"You should be adjusting the strength of your beliefs to the strength of the evidence. And that is a messy business."
Brooke: Hello everyone, and welcome to the podcast of The Decision Lab, a socially conscious applied research firm that uses behavioral science to improve outcomes for all of society. My name is Brooke Struck, a research director at TDL, and I'll be your host for the discussion.
My guest today is Julia Galef, co-founder of the Center for Applied Rationality and host of the Rationally Speaking Podcast. In today's episode, we'll be talking about soldiers and scouts, perseverance and adaptation, and reasonable ways to be rational. Julia, thanks for joining us.
Julia: Hey, thanks for the great intro. It's good to be here.
Brooke: You've talked about the difference between these two mindsets, soldiers and scouts. You even teased me a little bit before we started recording the podcast about this distinctive hand motion I made from one side to the other, and you knew exactly what it is that I was describing. What are these two mindsets? Can you talk us through those, and how do they differ from one another?
Julia: Sure. So, one of my core ideas that I'm always on about, and that I wrote a book about, is that being smart, and clever, and knowledgeable is not sufficient for seeing the world as clearly as possible. You also need to have the right motivation guiding that intelligence and knowledge in the right direction.
And so, the scout and the soldier are my metaphors for two very different kinds of motivations that can be guiding your thinking in the background. So, the official term for what I call soldier mindset in the cognitive science literature is directionally motivated reasoning. So, it's reasoning that is unconscious, usually. We're not really aware that we're doing this, but unconsciously we're trying to reason and defend a particular predetermined conclusion. So, we're hunting for arguments in favor of something that we want to believe or that we already believe.
And so, I call it soldier mindset because it's very much like being a soldier on a battlefield, trying to defend the fortress of your beliefs against any evidence that could threaten to weaken or undermine it. And this is evident even in the way we talk about argument, and reasoning, and beliefs. We talk about things like shooting down other ideas or poking holes in, or finding weak points in arguments. And we talk about supporting, or buttressing, or strengthening our arguments with evidence, things like that. So, I call that soldier mindset.
And the alternative to soldier mindset, which I call scout mindset, is more officially known as accuracy-motivated reasoning. So, it's reasoning that is genuinely directed, to the best of your abilities, at figuring out what is actually true. And so, just like a scout, in contrast to the soldier, their role is not to attack or defend, but it's to go out, see what's really there and form as accurate a map of an issue or a situation as possible, including all of the uncertainties and unknowns.
So, figuratively speaking, the scout is drawing their map in pencil, not in pen. The assumption is that you will be adding to it and revising it as you learn more and look at things from different vantage points. And that's fine. That's the process of working as intended. That's not suffering a defeat from the enemy or anything like that. And so, yeah, scout mindset is basically trying to be intellectually honest, and objective, and curious about what's actually true.
Brooke: It strikes me that we typically talk about or perceive something like the scout mindset as being the more rational one. The good scientist is supposed to be a scout. Whereas the soldier often gets derided as like, "Well, you're too caught up in politics, and tribalism, and biases and these kinds of things." But your position, if I understand it correctly, is a little bit different or actually quite different. If I understand right, you're claiming that we actually need both, is that correct?
Julia: Not quite, actually. What I am claiming is that it's very understandable that we are often in soldier mindset. And this is a good point at which to note that it's not like some people are pure scouts and other people are pure soldiers. We're all a mix of both, and at different times we might fluctuate.
I might be really good at being in scout mindset about, I don't know, my job; and then I might be much more likely to be a soldier in my personal relationships where I refuse to consider other people's perspectives or that I might have been wrong about something. Or some people might be really good at being a scout about science, but not about politics. Things like that. So, we're all a mix of both.
And what I do think is important to emphasize is that there are very good reasons why we have this soldier mindset baked into our cognition. We use it for some very important things. We use it to feel good about ourselves. We use it to try to look good to other people. So, for example, we often use soldier mindset to assuage our ego, to try to reinforce narratives in which we're not the villain, we're the hero, or we're the victim. Or that thing that we can't have, or that we feel like we can't have, like popularity, or money, or whatever, we didn't want it anyway. And people who have money aren't good people anyway.
And so, there are a lot of things that we can tell ourselves to feel good about ourselves and our lives or to motivate ourselves to do hard things. We might selectively reinforce beliefs that our business idea is definitely going to succeed in spite of the uncertainty. And then on the feeling good side of things, we use soldier mindset to reinforce beliefs that we think will make us look wise or compassionate to the people around us, to reinforce beliefs that our social circle considers good and virtuous, especially political beliefs or ideological beliefs.
We might convince ourselves that we know what we're talking about even when we don't because we want to appear confident. So, these are all very understandable things that soldier mindset is doing. And so, that might be what you meant when you said that soldier mindset can be rational, but the big caveat to that is that one of the main things I'm arguing in the book is that we rely on soldier mindset a lot more than we need to, to feel good and look good and a lot more than we should just for our own self-interest.
And that soldier mindset comes with a lot of downsides. There are downsides to deceiving yourself into believing things that aren't actually true and in not letting yourself think clearly and honestly about things as a rule. And so, a lot of the trick, I think, is finding ways to feel good and look good without resorting to soldier mindset. And instead, finding ways of thinking about reality, even when it's not flattering or not convenient, thinking about it in a way that you're okay with and you can handle and you don't need to go into denial or rationalization in order to feel good about yourself or to look good to other people and appear confident. And I do think that's quite possible.
Brooke: Yeah. Yeah. One of the examples that you mentioned along the line there is business ideas. And I think that's actually a space where the value of the soldier mindset is perhaps more obvious. So, for instance, when you encounter a setback in building a startup, there's this big question: do you persevere or do you pivot? If we said only the scout mindset is the right one, we'd say, "Well, whenever you encounter seemingly contradictory evidence, you should pivot immediately." But in a business context, that's not what you want, right? You want to persevere when persevering is the thing you should be doing, and you want to pivot when pivoting is the thing you should be doing.
Julia: That's right. So, the goal of scout mindset in a situation like that, where you've got a business idea and you're encountering new evidence and the situation may be changing or it may not, and it's all unclear, the goal of scout mindset is just to think honestly about what actually makes sense. And sometimes that will be pivoting and other times it will be persevering. And other times it will be, look, it doesn't make sense to be constantly reevaluating what I should do. I can't do that every 10 minutes or I'll never get anything done.
So, my honest best guess is that I should be continuing on this path and then revisiting the core premises of my idea every month, or maybe only revisiting it whenever some really big new piece of information comes in or something like that.
Those can all be very legitimate choices regardless of what the business idea is. And the goal of scout mindset is to help you think honestly about which one makes the most sense in your situation and not just rationalize yourself into ignoring evidence because you don't want to admit to yourself that your past idea was wrong or that your past investments have been wasted or something like that.
And also not rationalize yourself into immediately dropping your idea as soon as a challenge comes up, which is another common rationalization that I encounter in the business world. So, the way you characterized what would be scout mindset versus soldier mindset is not quite how I would have. I wouldn't say soldier mindset is always about persevering and scout mindset is always about giving up when you encounter new evidence. I'd say scout mindset is about allowing yourself to think honestly about whether to persevere or pivot, and soldier mindset is about picking one path without thinking honestly about which makes most sense. So, I'm carving up the reality a little bit differently than you are here.
Brooke: Yeah. Yeah. So I wonder, and you mentioned this earlier on. I mean, part of what I like to do on the podcast is be a little bit unfair and push people's positions to extremes to try to dig out the meaty tension in the middle.
Julia: I think that's a good way to think about things.
Brooke: So, one of the things you mentioned earlier, just to bring a little bit of fairness back into the conversation is that people are not entirely scouts or entirely soldiers all the time. And so, I wonder if we can use that to explore a little bit this idea of the scout mindset that you're going out there to map the terrain.
As you start to build that map or as you start to gain partial signals, right? And this is really important in the context of pivoting or persevering, because when you're confronted with that piece of potentially contradictory evidence, the question you're asking yourself is, is this a reliable data point that indicates a trend or is this an anomaly?
In order to come down on that, don't you need to have some position? You can't help but have some hypothesis, can you?
Julia: Oh, absolutely. Yeah. It's like having a map, but it's drawn in pencil. Right?
Brooke: Mm-hmm (affirmative).
Julia: So, in the metaphor, your map is basically, here's my current best guess about what's going on. So, the business idea might be here's my current best guess about what kind of products will do well on the market or here's my current best guess about whether we should try to grow fast or slow or whatever.
And then some guesses are going to be more confident, should be more confident than others. Like I'm more confident in the existence of gravity than I am in the fact that I'm going to go home for Christmas this year or something. I'm pretty confident of that but that could change. I could be wrong about that. It would be more surprising if I was wrong about the existence of gravity.
So, all of your beliefs should be on a spectrum of how much evidence you feel you have that justifies them and how confident you think you can be justifiably in them. So, this is part of what I mean when I say you shouldn't be constantly reevaluating everything, but you should have some beliefs that are uncertain enough that when you encounter evidence that seems to contradict them you go, "Interesting. I wonder if this means I'm wrong. I should maybe reevaluate some of my assumptions here."
And I think people often make this convenient simplification that, well, you can't be certain about anything, and so you might as well not believe in anything. But I would say it's a spectrum and some things are worth questioning more than other things, if that makes sense.
Brooke: Yeah. Yeah. So that, I think, brings a really nice texture and nuance to this scout position that it's not about not being committed to anything and being this purely receptive cartographer. Actually, you do have, as you say, the map in pencil. You do have hypotheses, you do have certain things that you're more committed to than others, certain things that you have more doubt about.
So, it's not about not having a position, it's not about not espousing anything. It's about appreciating the gradient of how much certainty you have about various things, and also about an attitude to how you approach new information.
Julia: Right. Yeah. So, I guess another way to put this is that you should be adjusting the strength of your beliefs to the strength of the evidence. And that is, it's a messy business. There is no single well-defined, well, this is exactly how confident you should be in this scientific hypothesis. This is exactly how confident you should be in your political positions. There's no one clear right way to do that. But being a scout means trying to do that. Trying to have the appropriate amount of confidence in your various beliefs based on the evidence.
And just that act of trying is something that most people don't do by default, at least when it comes to emotionally or ideologically fraught things. Most people will allow that. Yeah, of course you have different degrees of confidence in beliefs about whether it's going to rain tomorrow or whether you're going to get that job you applied for.
Yeah. Of course, you have different amounts of confidence. Not everything's black and white. But they don't usually apply that same way of thinking to beliefs like, my political party's approach to the economy is better than the other political party's approach to the economy, or, the thing that that politician did was obviously corrupt and not an accident. We have very strong, very certain opinions when it comes to politics, or just general views about how the world works, what's good and bad, or our interpretations of other people's behavior.
Those, we tend to hold with certainty, even when pushed. I've had conversations with people about, I don't know, whether Trump would win the election. I remember this conversation back in 2016 and I remember someone insisting there's absolutely zero chance he will win. And I was like, "Zero chance? I could understand thinking it was unlikely, but do you really think there is absolutely no way?" And he was like, "No way." And then Trump won, of course.
But yeah. So punchline, you should be trying to adjust the strength of your beliefs to the strength of the evidence.
Brooke: Right. So, let's dig into that. So, the object of the scout mindset is beliefs. And the thing that we use to inform those beliefs is evidence. Is the object of the soldier mindset really beliefs or is it something else? If I think about some of the list of stuff you talked about before, like the soldier mindset's good to help us feel good about ourselves and to look good to others, to reinforce narratives, to give us a sense of self and a sense of direction and a sense of connection to other people. Are those really beliefs of the same flavor that the scout is trying to build, or are these just fundamentally different projects?
Julia: Well, the parallel would be that the soldier is trying to, it's aiming at certain beliefs and those beliefs are a tool to help you feel good and look good. And then for the scout, the scout is also aiming for certain beliefs, but well, not certain beliefs, the scout is aiming for whatever the true beliefs actually are. Whatever those turn out to be, those are the beliefs the scout wants to hold.
And that can be an end in itself. I think some people do actually just value having accurate beliefs for its own sake. They just really want to know what is true as an end goal, but it's also an instrumental goal because the more accurate your beliefs are, the more accurate your map of reality, the better you can navigate the world and the better decisions you can make.
So, this is sort of, excuse me, this is what I was getting at when I was saying, yes, absolutely, you should try to feel good and look good, but you don't have to use the soldier mindset to do that. That's just our first knee jerk impulse, but scout mindset can also help you feel good and look good often and I think in a more solid and long lasting way than soldier mindset.
So, the soldier mindset approach to feeling good might be pushing out of your mind any doubts that you might have about your project. The scout mindset approach might be to say, "Okay, I am going to consider possibilities that my project might be wrong," and then over time, maybe initially it's going to be painful because I notice like, "Oh, man. I shouldn't have done things that way. I made a mistake. This is going to be a bit of a long road back to a good project idea."
But in the longer run, I'm going to have an idea that's actually more solid that I can be more justifiedly confident in and it's more likely to actually work. Same thing for your view of yourself. The soldier mindset approach might be to try to get yourself to believe that, "Yeah, everyone likes me. Everyone thinks my jokes are funny. Everyone wants to date me," whatever. And that does feel good in the moment.
The scout mindset approach might instead be like, "All right, let's take a clear-eyed look at my strengths and weaknesses. Let's take a clear-eyed look at like, how do people actually perceive me?" And maybe you notice things that don't make you feel very good in the short term, but that means you can work on them and improve, or maybe just target, try to date different kinds of people who actually appreciate those things.
And you're able to do that because you have a more realistic map in your head of what kind of person am I and how do people see me? So, it's usually, I would say, a longer term approach to feeling good and looking good, scout mindset, that is. But I think it is often a more solid approach in the long run.
Brooke: Yeah. So, I think we've stayed in a theoretical realm perhaps long enough. Let's dive into an example. You've talked about some-
Julia: That's how I do.
Brooke: Yeah. Yeah, I'm also prone to that kind of behavior. I can just geek out on ideas all day long and never worry about the fact that my toes haven't grazed the earth. So, let's dive into an example and really try to ground this. So, where do you see the big opportunity to deploy a scout or where do you see one big opportunity or the biggest opportunity to deploy this scout mindset to offset some of the negative stuff that happens from too much soldiering on?
Julia: I mean, I'd say you could deploy it anywhere. You could deploy it in terms of your view of yourself and your strengths and weaknesses, you could deploy it in your career when you're thinking about whether to apply for a job, or ask for a raise, or leave your job, or what career to be in, et cetera.
For example, someone I worked with was in law school and she wasn't that happy, but she also didn't like considering the idea that law school had been a mistake because it was a costly mistake. And so, she was trying to convince herself that this was still the right path for her.
And stepping back and trying to have more of a scout mindset about this and specifically doing the thought experiment of, "Okay, I can come up with justifications for staying in law school. Not sure if they're really valid or if they're just rationalizations. But if I were to make this choice again tomorrow, like if I could tomorrow decide to go to law school," for whatever amount of time she had left. One year, I guess. "Or start a different career, what would I pick?"
Then the answer was clearer to her that I would not choose this if I were making the choice tomorrow. And that jolted her back into realizing, I'd really be better off doing something else, even if it means I have to cut my losses. It's still worth it overall.
So, career choices are definitely a fertile area where I think we tend to rationalize, and often to our long term detriment, and hiring and firing too. That was a big one that came up when I interviewed managers and CEOs in Silicon Valley, which is where I was living when I wrote the book for the most part.
When I would ask them like, looking back, what's something where you can see clearly that you were rationalizing about, you were in soldier mindset where you were trying to convince yourself of something, even if it wasn't really true? One of the most common answers I got was, "Whether or not I needed to fire someone."
So, firing someone is one of the least favorite things a manager ever has to do, but eventually you have to do it. And so, a lot of people had an anecdote where, "I was going back and forth. I tried to convince myself, 'No, no. I don't need to fire them. I'm sure things will get better. They'll improve. It's not that bad.'"
And then in retrospect, they were all like, "Ugh, no, I should have fired them a year before I eventually did." And so, one thing that one of my friends does with his fellow CEOs is he has a support group where each of them can bring a case to the others saying like, "Here's the situation. Here's the employee, should I fire them or not?"
And he said, "9 times out of 10, everyone else's answer is, 'You should fire him. Sorry. I'm sorry you have to do that, but it's pretty clear to an outsider.'" And that's actually something that you can do on your own, even if you don't have this support group: if a friend of mine brought up this case to me asking what they should do, what would I say? And in the case of firing, just imagining yourself as the outsider looking at the situation, as opposed to the manager who has to make the tough choice and actually carry it out, your answer about what you should clearly do can be very different.
So, it's a bunch of applications in your career, I think, and in your personal life. And then, an example that often comes up when talking about scout and soldier mindset is arguments about politics and ideology and how the world works, which is I think an interesting case because it's not as obviously directly useful to you.
It's harder for me to tell you that you will immediately be better off if you try to be a scout about politics, as opposed to a scout about your career or your personal life, which is true. There's no obvious direct benefit to having a more accurate map of how tax policy works or foreign policy or something like that. But I still think it's valuable because scout mindset and soldier mindset are habits of mind. We just unconsciously get in the habit of either flinching away from unpleasant news or reaching for a rebuttal instantly without really thinking about what someone else is saying, or rehearsing arguments in our head rather than questioning them.
These are habits. And so, if you try to be a scout even when it doesn't directly benefit you, like even when you're talking about politics, I think you are cultivating valuable habits of mind that do end up helping you when it comes to more directly personal topics like your career, your self-image, things like that. It also makes you a nicer person to be around and I think is good for the country and the world. But setting that aside, I think it's also good for you just in a less direct way.
Brooke: The examples that you use, both the one about making a business decision about hiring or firing somebody and the one about political views, they raised this really interesting question of like, what determines when we do and when we can take on the scout mindset and when we don't or when we can't. So, the example you gave initially about the CEOs, one of the ideas that came to mind, you mentioned like this is an exercise you can do on your own. You can run through, like if I were presenting this to somebody else or if someone else were presenting this to me, what would I say? My thought is-
Julia: It could go the other way too. You could imagine if I were presenting this to someone else, what would I expect them to say to me? That's another way to try to get yourself thinking about what an objective outsider would say about your situation. Sorry to interrupt.
Brooke: Yeah. It strikes me that certain people are, by disposition, better at suspending their own judgment, their own mental picture in order to walk through this hypothetical. And even for a given individual, like on a day when I'm very tired, I'm probably less good at doing that than a day when I'm feeling fresh and refreshed, than a day that I'm very angry or anxious or whatever it is.
I'm probably less well-positioned to take that on. So, what are the personality traits, and also, what are the situational factors that influence how effectively we're able to take on the scout mindset?
Julia: That's a great question. So, you've already pointed at some great ones. Your energy, or how much mental resources or even emotional resources you have on a given day, is a big one. Also, how tied a particular issue is to your identity is an important factor. So, some issues are very tied to our identities in the sense that we feel that our beliefs about that issue say something important about us.
Like the fact that I believe that the future is going to be bright, that I'm optimistic about technology and progress. That can feel like an important part of who I am. And I can feel proud to be a techno-optimist. I like the idea of being the kind of person who looks optimistically at things and who expects progress to continue and so on, much more than I like being the kind of person who is pessimistic about progress, things like that.
Certainly politics. People feel that their beliefs about politics say something important about them. That the fact that I support immigration shows that I'm a compassionate person and not a closed-minded and bigoted person, for example.
So, it's much harder to think clearly and objectively about topics that are linked to our identity in that way because it's not just a referendum on some empirical question about the relationship between immigration and wages, it's also a referendum on whether you are a good person. So, that's very challenging.
Brooke: The stakes are really high.
Julia: The stakes are really high. Yeah. And so, one approach to that, I know you didn't quite ask, you asked about what determines whether we're in scout or soldier mindset, but to briefly go off into a tangent, one thing you can do about that is to try to keep your identity light, hold your identity lightly.
So, try not to let too many things into your identity and try to be self-aware of the things that are part of your identity. So, like for example, I'm part of a movement called effective altruism, which is about using reason and evidence to try to find ways to do the most good possible. And I think they're great or I wouldn't be working with them. I think I'm happy to be part of the effective altruist movement.
But of course, that means that it's inevitably part of my identity. And I can try to minimize that by keeping my distance and by reminding myself like, look, I just want to pick whatever policy works best, whether or not it's what the effective altruists believe works best. I can do things like that, but still it's going to influence my identity in some ways.
And when people criticize it, I'm going to feel this defensive impulse, but being aware of that can help me work against it and can help me take a step back and be a little more critical of the things that feel self-affirming to my identity as an effective altruist.
So yeah, identity is a big one. And I think related to that, beliefs that you have defended to other people, especially over a long period of time and especially in public, those can be very hard to think about objectively because if you concluded you were wrong about that, then you'd have to admit it to everyone else. Then you'd look stupid, or so you feel.
Often I think we overestimate how terrible it would be if we told the world we were wrong. And usually it goes much better than you think it will, but it feels like it would be a really big, bad thing. And so, that gives us a motivation to be in soldier mindset about ideas like that.
Brooke: So, can we train people in switching? So, some of the advice that you've given here, I really like that notion of holding your identity lightly and being self-aware about what influences and directs your identity. Are there practical exercises, are there steps that we can take to inculcate these habits?
Julia: Yeah. So, the first caveat that I'll add is that I think a lot of people are really excited about the idea of teaching these skills, like in school, for example. And I think it's really hard to teach a motivation. If someone is not that interested in trying to be a better scout, I think it's hard to teach them to do it, or to force them to do it against their will.
My book and my various appearances and so on are an attempt to try to motivate people to want to do it, to try to point out how it can be great and how the downsides that you think are there aren't actually as bad as you think they are and how it can be fun and liberating and freeing to try to hold your identity lightly.
So, I think you can try to motivate people to want to do this. But for the most part, they have to actually want to do it and you can't force them. So, with that said, ways to train yourself to be more of a scout, holding your identity lightly in particular, one exercise that I find valuable is called the ideological Turing test, which was coined by an economist named Bryan Caplan.
And the name is a reference to something called the Turing test, which is a theoretical test proposed by a computer scientist named Alan Turing for how could we tell if a computer program was actually intelligent. And his proposed test was, well, have people interact with both the computer program and with a real human without telling them which is which, and see if they can tell the difference.
And if they can't tell the difference, then there's your proof that the computer program is just as intelligent as a human. So, that's the Turing test. The ideological Turing test Bryan Caplan proposed was, this is a test of how well you understand an opposing argument or an opposing ideology. Can you explain it? I don't know.
Say you hate libertarianism. Can you explain libertarianism and the arguments for why someone maybe should be a libertarian convincingly enough that an outsider couldn't tell whether you actually were a libertarian or not? That's the test.
So, in practice, ideally, it would be great to be able to actually do the test and have people try to evaluate your attempts and so on. But just like a simplified version of that is, I think, a really good training tool for holding your identity lightly. Because you can try to make the case for an idea that you don't agree with or even think is bad or dangerous.
And just to ask yourself, does this sound like something that people from the other side would endorse? Because even that, it sounds like a very low benchmark, but it's something that I think we usually fail to meet, grossly fail to meet. When we try to characterize the views of people who disagree with us, we do so in a really caricatured, straw-man way. Like, oh, the Republicans, they're only Republicans because they hate poor people and minorities. That's what it means to be a Republican.
Maybe. Maybe you could be right. But is that actually how a Republican would describe why they agree with the Republican philosophy? I don't think so. And so, you've not passed the ideological Turing test. And I think this is a good thing to attempt doing partly because it helps you realize, oh, I don't actually understand this view that I hate, but also because it's an emotional training tool.
Because the act of trying to explain a view you hate and not caricature it or mock it, it's like an act of emotional self-control. And it's basically like separating your identity from just the facts about what do people believe and why? And forcing yourself to do that again and again, I think is a really powerful way to get in the habit of just thinking about the ideas without thinking about the people behind those ideas.
Brooke: Yeah. It reminds me of that famous old saying that the mark of an educated mind is the ability to entertain an idea without embracing it.
Julia: It's a good quote. Do you remember who said that?
Brooke: Everyone says Aristotle, but I did a quick Google and apparently it's not Aristotle.
Julia: That can't be right. That doesn't sound like Aristotle's words at all.
Brooke: No, totally not. But yeah. So, I really like this idea that the exercise is about trying to articulate the position of somebody else in a way that is charitable enough that it could pass as someone defending their own position. So, it strikes me that there are two things that are going on there. The first is that you're learning more and more nuance about the opposing position.
And then as that happens, you're just naturally exposed to the internal reasonableness of that thing. Like as you start to get exposed to more and more of why people who believe that thing actually believe it, you start to see that in fact there are very reasonable grounds on which they hold those things.
Julia: Right. And reasonable does not, I know you're not thinking this, but to make sure to emphasize here, you can be reasonable and still wrong.
Brooke: Yeah, totally.
Julia: It can be reasonable and still dangerously wrong or harmful. Seeing a view as reasonable just means you can see how someone who's not an idiot or a villain could hold that view. You can see how they got there, basically. And I do think it's a really important thing to be able to recognize.
Brooke: Yeah, lots of views are reasonable and turn out to be wrong. Remember when we all thought that Newtonian mechanics was the thing and we were never going to supersede that?
Julia: Oh, I remember, yes.
Brooke: Now that is old news.
Julia: I feel like it was yesterday.
Brooke: Yeah, that's right. But the second thing that happens there. So, in addition to being exposed to the internal logic of the position and the internal reasonableness of it is that it trains you in the second part, which you talked about more as a skill, which is this idea of being a bit more distanced from the view.
So, it's not just seeing the others' perspective a little bit more and understanding its internal reason, it's also that experience of holding something gently, which you can then apply more easily to other areas, including your own beliefs.
Julia: Right. Yeah. I didn't use this metaphor in my book because I thought it was silly or weird or something, but the experience of trying to explain a view that you hate without injecting a tone of scorn, or disdain, or caricature into it reminds me of those videos I've seen online of people giving their dog, their golden retriever, an egg, to hold in its mouth without biting down. And the golden retriever just holds the egg in its mouth. That's what it feels like to me to just hold the view lightly and not chomp down on it the way I want to by expressing how much I disagree with it. Just describe it as accurately as possible.
Brooke: Right. There's a marshmallow test-esque feel to this.
Julia: Of self-control and willpower.
Brooke: Yeah, yeah. That's right.
Julia: Yeah, exactly. And yeah, I think to generalize that beyond the ideological Turing test in particular, the goal of holding your beliefs lightly is this: you're going to have labels that accurately describe your beliefs. Like maybe you're a progressive, or a conservative, or an effective altruist or something. And it would be lying to say that those labels don't apply to you, but they should feel just like descriptive labels as opposed to flags that you're waving proudly or badges that you're wearing proudly.
And there should always be, in the back of your mind, a sense that, look, if the facts change or if I realized I had misunderstood or if I'd noticed like, oh, actually I think feminism is causing more harm than good in the world or something, if I decide that, I would give up the label.
The ultimate goal is not the label itself. The ultimate goal is just holding beliefs that are as accurate as possible and doing what I think is actually good. And currently, this label describes my views on that, but that could change in the future. That should be what's in the back of your mind.
Brooke: Yeah. The labels are also partial. They're shorthand. It's a way of taking something that's quite complex and giving it-
Julia: Compressing it.
Brooke: Yeah. That's right. And acknowledging that in that process of compression, something gets lost. So it's like, even the difference between I am a progressive and I espouse or I agree with a lot of progressive ideas.
Julia: Right. Right.
Brooke: Even that is like, it seems so small and yet that's-
Julia: It's meaningful.
Brooke:... where all the space is. Right?
Julia: Yeah, exactly. Yeah. And so, you don't have to go around saying, "No, no. I'm not a progressive. I'm just someone who currently agrees with most progressive positions more than other positions." That's very clunky and I am not going to tell people that they need to do that. But it should be in the back of your mind when you call yourself a progressive. You should feel like that's what you're saying. That yeah, that label currently does a pretty good job of describing my current positions.
Brooke: Yeah. Okay. So, there have been a few, I think, practical tips that we've been talking about in the last few minutes. So, for someone who's listening out there who's like, "Oh my gosh, this just so clearly describes the solution to this problem that I've been grappling with." What can I do to start moving forward towards positioning myself better to adopt a scout mindset and just dial back the soldier mindset?
Not to completely expunge it from my life, but just to not have it yank me around from left to right so intensely. There are some exercises that we talked about around going and exploring someone else's position, the opposing position, and trying to articulate that position yourself in a way that doesn't need to feel like you own it, but feels like a credible and fair representation of that position.
And then moving from there to taking similar distance to the views that you actually do support saying like, well, this is a way that I can articulate and this is a way that I support them. But also I hold it at a bit of a distance as well. I've got that ache in my mouth when I'm not chomping down.
What about at scale? What about someone who's listening to this and says, "Okay, I understand that these approaches are really valuable to dial down the pressure and to not soldier on so hard." In an organization or in a movement, how can we scale that up for lots of people at a time, not just in my own individual life? What can I start doing tomorrow at The Decision Lab to get everyone to stop being such intense soldiers and just go out and do a little more scouting?
Julia: Yeah. So, I definitely, personally speaking, I focus more on the individual level, but I think the societal or institutional level is just as important if not more important, but we need both. So, one important thing I would say is that people really respond to incentives. This was something actually we didn't talk about when we were talking about what determines whether you're in a scout or soldier mindset, a big factor I didn't mention is just the incentives you're getting from the people around you.
Like if someone brings up a counterargument and you pause to think about it, do they smirk at you? Like ha, you've lost now.
Brooke: Caught you.
Julia: Yeah, exactly. Gotcha. That's a strong social incentive to not be in scout mindset in that social environment. And conversely, if you are making an argument and then you go, "Actually, you know what, I'm not sure that makes sense because..." and offer a counterargument, does the person you're talking to go, "Oh, wow. Cool. I love that you noticed that nuance and changed your mind."
That's a strong social incentive to be more of a scout and less of a soldier. And so, on the individual level, I do try to surround myself more with people who reward me for scout mindset and try to avoid the people who punish me for scout mindset, because I think that's...
I'm trying to create more of a tailwind for my efforts to be a better scout and not a headwind. I want to make it easier for myself and not harder. So, that's an individual level intervention, but you can absolutely do that at the group or organizational level as well. Especially if you're more in a position of authority or status in the group. You can just try to make sure that you're rewarding people for being scouts and not for being soldiers and reward people for rewarding scout mindset.
If you see someone praise someone else for acknowledging nuance in their plan or in saying, "You know what guys, I'm sorry. I know I was arguing X last week, but this week I actually think Y makes more sense." You can say, "I appreciate that Bob did that and I also appreciate that Lisa appreciated it."
Brooke: ... contributed well. Yeah. Yeah, yeah.
Julia: Right. Exactly. And didn't make things harder for him. I think that's the right way we need to be going. So, certainly explicitly calling out and appreciating examples of scout mindset is valuable. And then of course, walking the walk yourself as a leader is probably more important than just the explicit talk about what you think people should be doing. Because people will look to you for their cues about how you're supposed to behave in this particular organization.
So, social incentives are the first thing I'd say. And then you can also just have policies at the organizational or institutional level that help reinforce scout mindset. So, this isn't about a particular company, but one promising move I've seen is in academia. Well, social science is a collection of fields, but psychology, and to some extent economics, social psychology, and epidemiology, they've started to move in the direction of requiring people to pre-register their hypotheses.
So, to back up a little bit, even though scientists, we think of scientists as being the archetypal scouts, scientists are human and so often they're going to be in soldier mindset and they're going to want to try to defend their particular hypothesis that will help them get papers published and help their name become more prominent. That's just inevitable.
And so, there are little unconscious things that they do when they're analyzing their data to help them conclude that their hypothesis is true. And so, pre-registering a hypothesis helps guard against that because you state ahead of time, before you even start your analysis, here is the hypothesis I'm testing and here is how I'm going to test it. And then you can't change that later on when you find that the hypothesis isn't true and you are tempted to tweak the method in hopes of getting a more significant result.
So, that's more of a safeguard against soldier mindset that some academic fields have started requiring, and I think that's great. You can also have an official norm in an organization or a field of making explicit predictions. Like here is what I think is going to happen in politics and here's how confident I am in it, to call back to our earlier conversation about adjusting the strength of your belief to the strength of the evidence.
Instead of letting people just proclaim everything with certainty and then sweep it under the rug when they're wrong, it can be a really valuable norm to say, let's make it clear what we're actually predicting and attach a quantitative level of confidence to it. And then over time, we can see who actually has a more accurate track record of getting things right in this field.
And knowing that people are going to see your track record can really incentivize people to try to report what they actually think is true instead of just stating things with certainty because it makes them seem confident or instead of just promoting their pet hypothesis.
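[Editor's note: the quantitative track records Julia describes here are commonly scored with the Brier score, the metric used in forecasting tournaments like the one discussed below. A minimal sketch, assuming a simple list of (stated confidence, outcome) pairs; the function name and example record are illustrative, not from the conversation:]

```python
# Brier score: mean squared difference between stated confidence and the
# actual outcome (1 if the event happened, 0 if not). Lower is better;
# always guessing 50% earns 0.25, so beating that shows real calibration.

def brier_score(predictions):
    """predictions: list of (confidence_in_event, event_happened) pairs."""
    return sum((p - (1.0 if happened else 0.0)) ** 2
               for p, happened in predictions) / len(predictions)

# A forecaster who stated 75% confidence in an event that happened and
# 65% confidence in one that didn't:
record = [(0.75, True), (0.65, False)]
print(round(brier_score(record), 4))  # -> 0.2425
```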
Brooke: Yeah. There's some-
Julia: So, those are three things. Yeah.
Brooke: Yeah. There's some procedural stuff around that as well. So, you mentioned putting in this preregistration step for academic studies. There are analogs that we can bring in the corporate world. So, you talked about explicitly naming our predictions and getting a level of confidence. But following from that, we can also put in more machinery as well around the structure of a conversation about how we're going to reflect on those predictions.
So, one of the kinds of outcomes that we can get from this very explicit prediction tracking and assessment system is, who are the good predictors? But another kind of outcome is, what are the kinds of processes that the good predictors are using to get to those good predictions to then inform how are we going to train other people to also develop good prediction making processes?
Julia: Right. And people should-
Brooke: And that can be really powerful.
Julia: Sorry to interrupt you. I just want to make sure I get to plug the book Superforecasting by a certain scientist named Phil Tetlock. So, this is basically what he did. He entered a team of just amateur political and economic forecasters into a tournament sponsored by the U.S. Government in 2013 or 2014. And they made a bunch of predictions about what would happen politically and economically. Like yes or no, will such and such war in the Middle East have ended? Will a peace treaty have been signed by December 31st, 2015 or something, and with what level of confidence are you making that prediction, et cetera?
And they were competing against professors and against intelligence analysts at the CIA itself. And the best people on Tetlock's team just blew the competition out of the water. They were 30% more accurate than the actual CIA.
Brooke: Professionals. Yeah.
Julia: Right. And so, Tetlock dubbed them the superforecasters and wrote this book with Dan Gardner about what they were doing right. Like what was the secret sauce to superforecasting? It's really interesting and I think unusually rigorous for a social science study. And so, there's a bunch of things in there and you should read it yourself. But one of the things that he noticed superforecasters doing a lot more than other people is adjusting their beliefs incrementally.
So, a lot of people will either never update their view at all, or they'll do a 180 where they're like, "Well, all right, I guess I'll give up on that idea because of the counterevidence." And the superforecasters did a much more subtle thing where they'd have this view and then they'd read some more news articles and they'd go, "Hmm, that makes me a little less confident."
Or some new development would happen and they'd go, "Actually, this makes war more likely." And so, they'd adjust their confidence upwards from 65% to 75% or something like that. So, it was this very careful, delicate process. And by the end of it, the percentage they eventually landed on was much more accurate than the norm.
Brooke: Julia, thanks for sharing these insights with us. I think that, especially these last chunks that you've talked about around superforecasting and the kinds of approaches that organizations can take are a really valuable place to end on here.
Talking about the kinds of attitudes that can be adopted within institutions, but also flowing from those attitudes, the practical steps that they can take such as leaders modeling good behavior and also creating incentives to get people to be a little bit more forthright about what their hypotheses are about the fact that those are hypotheses and clear on what kind of evidence they're actually looking for in order to figure out which hypotheses are true and which ones aren't, and which ones need to be updated.
That's really, really helpful for listeners out there who are thinking about how to do this in their organization. So, thank you so much. It's been a great conversation to unpack some of the nuance of this scout mindset: that we're not just this tabula rasa going out to try to collect pieces of information.
Somewhat soldier-like, but I think this is a point of disagreement between you and me. Even scouts need to have some hypothesis about what they see out there. And that's a necessary element. You need to be committed to something. You can't just hedge your bets internally.
Julia: Well, I do agree with that. So, maybe we're more sympatico than you realize, but that was a great summary. Thank you.
Brooke: Yeah. Thank you. Yeah, thank you for taking your time out of your day today to speak with us and to share these insights with us. You plugged Philip Tetlock's book, which is a very nice thing to do. Would you like to take a moment to plug your own book before we wrap up?
Julia: Sure. Sure. So my book is called The Scout Mindset and it's a more full, fleshed out explanation of scout and soldier mindset, and why we're so often in soldier mindset by default and how we can move towards scout mindset, especially how we can get those things that we really value, like feeling good and looking good using scout mindset and without having to resort to soldier mindset. And it's packed with lots of my favorite examples of scouts in science, politics, history and everyday life.
Brooke: Great. Thank you very much. And I hope we get a chance to speak again soon.
Julia: Likewise. It's been great.
We want to hear from you! If you are enjoying these podcasts, please let us know. Email our editor with your comments, suggestions, recommendations, and thoughts about the discussion.