In a recent article in the A3 journal, psychologist Maria Konnikova's work on decision making was used as an example to highlight the way our brains are hardwired to seek concrete answers. We want a clear-cut answer, a concrete story about what is happening around us and to us. Our brains are hardwired to want to believe the stories we tell ourselves.
Traveling in avalanche country involves observing and integrating an endless amount of data. All that data gets boiled down into an "operational thesis." This thesis is our feel for the mountain snow environment. As we travel we gain more data, and that thesis gets refined and changed. Hopefully.
It begins with a lower-resolution, macro perspective. Before we even head into the mountains, we begin speculating on how the mountain snow environment will present in the field. The avalanche forecast is part of this, but as a skier a lot of it is about surface conditions and snow quality. As we travel and gain more data, our thesis is confirmed or contradicted, and it needs to evolve with that new data.
Our intuition comes into play when we begin to observe data that doesn't fit well into our operational thesis. If we are not able to integrate those observations, and a conflict arises between the new data and what our brains want that data to be, we develop an uneasiness: an intuition.
According to Konnikova, it's part of fundamental psychology: we are predisposed toward data that confirms what we already believe. I recall a very interesting essay on the same topic as it relates to meditation, titled "Sherlock Holmes Is a Buddhist." It essentially suggests that our predisposition toward hearing only what confirms our correctness is a large barrier to meditative insight, and that what makes a good detective makes a good meditator.
In essence, Sherlock Holmes's ability to think critically about a situation without being too attached to the story he wants to tell himself is key to being a good detective, Buddhist and… backcountry traveler.
Some people, like Sherlock Holmes, are better at letting that new data in. They are better at not being attached to their thesis and allowing it to evolve easily with new observations. I might even suggest that ability is a skill that can be trained.
If you find yourself with a bad intuition, ask yourself what observations need integrating and how the thesis can be improved.
A bad intuition, for me, is usually an indicator of uncertainty. I am pretty good at integrating new data and not being attached to my thesis. When I start to get that intuition, it's usually a sign that I have data that doesn't fit, or at least doesn't fit yet. It means there is some uncertainty about what the new observations mean.
The habit should be to question the thesis, not the data, though this is easier said than done. Our psychology drives us to question the disagreeable data first. Train yourself to be more open to integrating new data into your thesis.
Fitting this into the snow safety world's "operational mindsets" framework: intuition really starts to come into play in the mindsets with more certainty. When we are in "stepping back" or "assessment," we don't really have a thesis to be attached to. There is clearly a lot of uncertainty; we acknowledge it and don't step into terrain that could give us a really bad intuition.
When we really start to believe our operational thesis is when we need to be most receptive to new data disrupting what we think we know about the mountain snow environment. This is when listening to the intuition matters most.
We want to train ourselves not to seek the answer but to acknowledge the answerless. It's not about finding what is correct (the mountains are too complex) but about tending to that endless process of reducing uncertainty.
It's pretty simple, in fact: we need to be okay with being wrong. However, this is much easier said than done. Hopefully, with more intentionality, intuition can be one of many tools we harness to improve our decision making.