I've long been a fan of Julia Galef's podcast, Rationally Speaking. So when I heard that she'd released a new book, I jumped at the chance to dive into it.
The Scout Mindset aims to be a practical guide to seeing the world more clearly.
It's a quick read, and I think it contains a lot of practical tips for developing scout mindset. I'd highly recommend it.
Traditionally, we're taught that to be a confident leader, we need total certainty about what will happen. The consensus is that it's better to deceive ourselves and enter a Jobs-ian "reality distortion field" that lets us achieve great things.
Galef terms this method of operation "soldier mindset". In this mode of thinking, we ask ourselves "how can I win?".
She points out that much of our language frames debate in militaristic terms. We "defend" or "shore up" our beliefs. We "shoot down" or "poke holes in" opposing arguments.
Instead, she says we should embrace "scout mindset", which works in the opposite way. Scout mindset asks not "what do I want to see?" but "what's really out there?".
As an aside, I've noticed that having a proper noun for an idea like this is really powerful. It's let me more easily recognize when I'm in scout mindset vs. soldier mindset.
Motivated reasoning is the phenomenon where, once we hold a belief, we tend to accept evidence that supports it and discount evidence that contradicts it. In essence, we are motivated to reach the conclusion we started with!
This is a well-documented feature of the human psyche, and it shows up everywhere. It's a good reminder that whatever belief we begin with, we're more likely to accept points that support it.
There's an inherent question here that I've discussed with a number of early-stage founders: when you present at an all-hands or pitch your company to investors, how do you motivate yourself and your teammates if you think the odds of success are really low?
The first answer here is to consider not the odds of success, but the expected value.
Elon Musk famously rated his odds of success for both SpaceX and Tesla to be somewhere around 10%. He thought it was more likely that each would fail.
Bezos explicitly told investors that he thought there was a 30% chance that Amazon would succeed.
So why did either of them start at all?
If the expected value is high, it's worth doing. Sparking a green revolution and putting millions of electric cars on the road was a worthwhile mission for Musk. So was owning a significant percentage of e-commerce for Bezos.
Some things are hard, and likely to fail. But if they are important enough... they are probably worth doing anyway.
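The expected-value framing above can be sketched in a few lines of code. The numbers here are purely hypothetical illustrations (the book quotes the founders' odds, not their payoff estimates); the point is just that a low probability times a huge payoff can still beat a sure thing.

```python
def expected_value(p_success: float, payoff: float) -> float:
    """Expected value of a venture: probability of success times its payoff."""
    return p_success * payoff

# Hypothetical numbers: a moonshot with a 10% chance of a 100-unit payoff...
moonshot = expected_value(0.10, 100)

# ...versus a sure thing worth 5 units.
safe_bet = expected_value(1.0, 5)

print(moonshot > safe_bet)  # the moonshot wins on expected value
```

This is the sense in which "likely to fail" and "worth doing anyway" are compatible: the decision rides on the product of the two numbers, not the probability alone.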
One distinction I hadn't considered before is epistemic confidence vs. social confidence.
Epistemic confidence refers to uncertainty about the world: "You have a 10% chance the chemotherapy will work" or "There's a 30% chance Amazon will succeed". People are fairly open to epistemic uncertainty on its own. Many patients are more than happy to accept a diagnosis that gives them a range of outcomes.
Where people will doubt a person more is when the uncertainty lies within that person, not within the world.
As a result, people are more likely to trust other people who speak with a confident tone, don't stutter, and are able to clearly articulate their point.
Depending on your point of view, this could be a good or a bad thing. Galef views it as "generally good", because social confidence is something we can practice and improve. Epistemic uncertainty, on the other hand, we can't do much about.
Galef also argues that most of our soldier mindset stems from identity: the tribes we align with and the in-groups we see ourselves through. We're less likely to accept realities that would threaten some part of who we are.
Her advice, borrowed from Paul Graham, is to keep your identity small. The fewer ideologies you consider "a part of yourself", the easier you'll find it to accept uncomfortable realities.
One interesting technique for keeping your identity small, borrowed from Bryan Caplan, is the "Ideological Turing Test".
Traditional Turing tests try to distinguish a human's responses from an AI's. In the ideological Turing test, you try to make an argument that would be indistinguishable from one made by someone who genuinely holds the opposing viewpoint.
An example for vaccine hesitancy might be something like: "governments and health organizations have been wrong before about new medications, and about things like cigarettes." This is a fairly reasonable argument, and engaging with it is far more productive than attacking the much more fringe viewpoints.
In my own arguments, this idea has made me think harder about the strongest point I could make for the other side.
Julia ends the book by advising that instead of leaning in to any sort of "tribal" identity, the way good scouts think of themselves is primarily as, well, good scouts. In that way, they aren't held to any particular belief, but instead are tied to the pursuit of being curious and understanding the world more clearly.
That means that, as a scout, you should enjoy learning. You should enjoy updating your beliefs and correcting a belief you previously held.
In that vein, you should also enjoy being able to hold views different from your peer group's, without feeling ostracized for it.
Here, I also particularly like the term "scout" vs. "rationalist". I get the impression that the scout community and the rationalist community overlap in many ways. But in my mind, scouts seek to understand people without necessarily judging that those people ought to be more rational.
By creating a new word for it, the scout identity has the ability to take on a language and principles all its own.