
How To See More Clearly

* Seeing and thinking clearly is not something we instinctually do. Humans are all about survival, pleasure, avoiding pain, food, sex, and sleep. Everything else that we would consider a higher pursuit tends to come second, at least in our brains. Thus, mental models to ensure that we are thinking clearly are of the utmost importance. The world usually looks different at second glance.

* Mental Model #7: Ignore "Black Swans." This is the first mental model that specifically warns against our tendency to jump to conclusions based on imperfect, skewed, or incomplete information. A black swan event is an entirely unpredictable event that comes out of nowhere. In doing so, it skews all data and beliefs, and people start to take the black swan into account as a new normal. But these are just outliers that should be ignored.

Hear it Here - https://bit.ly/mentalmodelshollins

Show notes and/or episode transcripts are available at https://bit.ly/self-growth-home

Peter Hollins is a bestselling author, human psychology researcher, and a dedicated student of the human condition.

Visit https://bit.ly/peterhollins to pick up your FREE human nature cheat sheet: 7 surprising psychology studies that will change the way you think.

For narration information visit Russell Newton at https://bit.ly/VoW-home

For production information visit Newton Media Group LLC at https://bit.ly/newtonmg

#BerlinWall #FederalEmergencyManagementAgency #FEMA #NassimNicholasTaleb #SwanRiver #Taleb #Vlamingh #HowToSeeMoreClearly #RussellNewton #NewtonMG #PatrickKing #PatrickKingConsulting #SocialSkillsCoaching #PeterHollins #ArtandScienceofSelf-Growth

Berlin Wall,Federal Emergency Management Agency,FEMA,Nassim Nicholas Taleb,Swan River,Taleb,Vlamingh,How To See More Clearly,Russell Newton,NewtonMG,Patrick King,Patrick King Consulting,Social Skills Coaching,Peter Hollins,Art and Science of Self-Growth

Transcript

How to See More Clearly

Generally, binoculars come in handy. They provide focus and clarity to what would otherwise remain a blurry blob of color. They give us insight into a world that is completely foreign to us: the life of birds in a jungle canopy, the machinations of a squirrel looking for more acorns, or the gaseous structure of some of the planets in our solar system.

And yet, using binoculars completely blinds us to what is actually physically close to us and within our reach. When you use binoculars, you can't have it both ways: seeing the forest (big picture) and the trees (finer details) at the same time.

In general, seeing both at once is ridiculously hard to accomplish. You have to beat into submission your brain's tendency to jump to conclusions and fill in the blanks, and you have to deal with the fact that when you focus your attention in one place, something else will inevitably be overlooked. Even if we're extremely attentive, we can't always rely on what we see and hear to give us a complete picture of what is happening.

Sometimes we don't get complete information - there's always something we can't see or hear that might be driving events. Sometimes we rely on the stories of other people who might have a hidden agenda for explaining events the way they do. And we also have our own inherent biases and beliefs that may color what we see to the point where our judgment becomes inaccurate or faulty.

Humans don't naturally think or see objectively. Once we reach this realization, we can better act toward preventing it. This chapter addresses perceiving the world for what it actually is, something that even the most discerning of us struggle with from day to day. These mental models help you see through the distractions and false realities of everyday existence so you can get as close to the core truth as you possibly can.

It comes in handy more often than you might expect. For instance, there is a saying that if you wish to move to a new location, you should visit it in all seasons, or at least during the extreme seasons of summer and winter. It wouldn't be wise to form your opinion and make your decision based on a five-day visit that happened to fall during the best weather the area has seen in the past ten years.

Any given situation or object, no matter how fixed or permanent it might seem, is subject to change with surrounding conditions or events. If you've only visited Chicago in the summertime, you might be led to believe that it's a humid and hot place, which it is - in summer. But as anyone who's endured a Chicago blizzard can tell you, it's a wildly different place in winter. Somewhere in there are a few days of moderate, pleasant weather, but if that's your expectation, you are going to be sorely disappointed.

When it comes to information, less is not more. It can be easy to feel overburdened and overwhelmed by facts, to say nothing of others' interpretations and explanations of all those facts. But there really is no substitute for having as much intelligence and knowledge as you can gather.

This overall mindset encourages you to obtain as much information about a situation or topic in a variety of different backgrounds, environments, and conditions as you possibly can. Having all this information prevents you from making snap judgments, blind assumptions, and inaccurate projections - all of which you need to avoid to make better decisions.

To develop a broader, more complete viewpoint of all situations, we'll break down this overall mental model into three more specific templates.

MM #7: Ignore "Black Swans"

Use this model to understand why outliers shouldn't actually change your thinking.

Until nearly the 18th century, people in the Western world - which at the time basically referred to Europe - believed that all swans were white. Their reasoning was simple: they'd never seen anything besides white swans. Absent swans of any other shade or color, they had no reason to believe swans of other colors existed. It never even crossed their minds.

But in 1697, the Dutch explorer Willem de Vlamingh sighted black swans along what is now known as the Swan River in Western Australia.

Goodbye centuries of supposed knowledge, hello indisputable evidence of being incorrect. What if swans could be all colors of the rainbow? What does this mean for humans? What are the far-reaching implications of discovering a black swan?

Statistician Nassim Nicholas Taleb adapted this bit of history to form the "black swan" theory. Taleb uses the black swan as a metaphor to describe unpredictable events that create a massive change in perception, perspective, and understanding. And yet, in his definition, a black swan event is something that should not change perception or accepted knowledge because it is such an anomalous outlier. It may simply create awareness of possibilities, but most black swan events don't deserve to be accounted for in everyday life. Maybe it just means that swans come in white and black, and belief systems on zoology don't need to be thrown out the window.

As a brief example, the knowledge that a lightning bolt struck a tree nearby can be frightening, and it might encourage you to equip some houses with grounding rods. But should such a one-time event influence the way you live your life, staying indoors whenever it starts to rain, carrying a metal shield around with you at all times, or moving to a part of the world that has little to no rain like the desert? Does it mean we should all move underground to live as mole people? No, this event shouldn't have such influence.

On a global scale, events like the fall of the Berlin Wall, the assassination of a public figure, and the tragedy of 9/11 could be considered black swan events. On a more personal level, they could include a factory suddenly closing, a local company being bought out by a major conglomerate, parents divorcing, a house being burgled - anything that disrupts and upends our place in the world or our personal views. There is an impact, to be sure, but how much should we truly account for these outliers?

As unnerving, drastic, and cataclysmic as black swan events might be, their overall importance to one's belief system or worldview can be overestimated. Human nature being what it is, one might even try to qualify a black swan event and excuse it in retrospect: "Well, when you really think about it, all the signs were there and we should have seen it coming." Such a viewpoint tends to rewrite our understanding and belief system.

And that's a problem, because no matter how devastating or overwhelming a black swan event might be, it's still an irregularity or aberration. Black swan events are not "the norm." Many of them don't happen more than once or twice in a lifetime. But their shocking, sometimes catastrophic natures can make one alter, distort, or overturn one's knowledge, beliefs, and world outlook. A black swan event's power can be devastating - but does it warrant the importance we ascribe to it?

Taleb says there are three elements to a black swan event.

It's a big surprise. The happening or event in question must be completely unforeseeable. There can be no way the observer could have seen it coming in advance.

It has a major effect. The black swan event must have some sort of fateful or immense outcome, whether it's physical, structural, or emotional.

People attempt to rationalize it after it happens. After the black swan event takes place for the first time, people affected by it might root around for "signs they missed" or try to explain in retrospect how people should have expected the event to happen in the first place.

That third element is where one runs into trouble. A black swan event can be so all-encompassing, indeed traumatic, that it could force a wholesale reformation of one's beliefs or personal policies. But a black swan event is still an outlier, especially when it's a random bolt from the blue that couldn't have possibly been accounted for. To ascribe too much importance to a black swan event, to let it drive sweeping changes that weren't warranted before it happened, is at heart nonsensical.

This mental model is about looking past the gravity of a black swan event, zooming out, and seeing the whole picture. Don't let the possibility of more lightning make you move to the desert. Catering to black swan events comes at the expense of everything else in your belief system and carries large opportunity costs.

When you are faced with big events - business or personal - allow room to consider that it may very well be a black swan event that, while important, is not very informative or indicative of anything at all. Don't organize your entire strategy around the likelihood of a black swan event; unless you work for the Federal Emergency Management Agency (FEMA), disasters are not going to be an everyday part of your existence.

Let yourself think about worst-case scenarios. But then bring yourself back to reality. Is this event likely to occur again? How much of an outlier was it? Can we reasonably even do anything about it? Should it change the way we act if it is inevitable from time to time? If lightning will strike a few times a decade, is it worth it to retrofit your entire operation and home to account for that? In other words, should you stop driving cars because you heard an acquaintance got into an accident?

Smart planning will always seek to understand risk factors, but it must also weigh them accurately. Life is full of risks - we take them every day when we cross the road. But life must go on. You shouldn't live your life in fear of a black swan event, but you can and should simply put a few moments of consideration into how they might happen and what they might require you to do.

If we zoom out a bit on black swan events, we realize that we are trying to find a predictable pattern in what is actually a random set of events. This is known as the gambler's fallacy, named for sentiments such as rolling a pair of dice and feeling that you must eventually roll a seven because it has been a while and you're "due."

Never mind the fact that this is not statistically or probabilistically sound; you are attempting to impose order on something you have no control over. The gambler's fallacy is the notion that just because X happened, Y should happen, X shouldn't happen, or X should happen again. More often than not, these events are all independent of each other, and recognizing that should make your decision-making less biased.
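To make that independence concrete, here is a minimal simulation sketch in Python (not from the original text, just an illustration under the stated assumption of two fair dice). It estimates the chance of rolling a seven with no history and again immediately after a long run of non-sevens; if the gambler's fallacy held, the second estimate would be higher, but both hover around 6/36, or roughly 0.167.

```python
import random

def roll_seven() -> bool:
    """Roll two fair dice and report whether they sum to seven (probability 6/36)."""
    return random.randint(1, 6) + random.randint(1, 6) == 7

def chance_after_drought(drought_length: int, trials: int = 100_000) -> float:
    """Estimate P(seven) on the roll that immediately follows
    `drought_length` consecutive non-sevens."""
    hits = 0
    for _ in range(trials):
        # Wait until we have seen `drought_length` non-sevens in a row...
        run = 0
        while run < drought_length:
            run = run + 1 if not roll_seven() else 0
        # ...then record the outcome of the very next roll.
        hits += roll_seven()
    return hits / trials

if __name__ == "__main__":
    # Both estimates come out near 0.167; the "drought" changes nothing.
    print(f"P(seven), no history:         {chance_after_drought(0):.3f}")
    print(f"P(seven) after 10 non-sevens: {chance_after_drought(10):.3f}")
```

Each roll is independent, so a long run of misses carries no information about the next outcome - which is exactly why feeling "due" leads you astray.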

The gambler's fallacy is representative of a broader phenomenon known as apophenia, the human tendency to see patterns and connections in random data points, usually based on too few of them. This is why people see rabbits in clouds and elaborate scenes in inkblot tests.

About the Podcast

The Science of Self
Improve your life from the inside out.

About your host


Russell Newton