For the past few weeks, we have been exploring mental models, what they are, and how common ones can help a team comprehend issues within their organization. By studying mental models, we can become better systems thinkers.

From the fields of cognitive science and behavioral economics, a significant number of shared mental models have emerged, commonly labeled biases or heuristics. I consider these mental models dangerous, or at least limiting. Without recognizing them in practice, and how prevalent they are in our thinking, we are highly prone to mistakes and shallow thinking.

These mental models have been grouped into the famous Cognitive Bias Codex around four major cognitive problems.

1. Too much information
2. Not enough meaning
3. Need to act fast
4. What should we remember?

In our VUCA (volatile, uncertain, complex, and ambiguous) world, these four problems are commonplace in leadership, and each triggers certain biases that shortcut our thinking to save mental energy. In other words, these problems with information often cause us to employ cognitive biases without our knowing it. I explore each problem area below.

Problem 1: Too much information
Think about all of the data an educational leader has to filter through their brain in a day. There is information about students, teachers, situations, and new policies and initiatives, in addition to all of the external information in the world that arrives at a dizzying pace. We have no choice but to filter most of it, and our brains have some interesting tricks to help without our knowing it.

We know that well-documented biases such as the availability heuristic and attentional bias make us notice things already primed in our memory. For example, you may have been thinking all day about a literacy issue, and as you work through your emails, you delete everything that does not deal with literacy. You were biased toward that action.

Likewise, we know that biases like confirmation bias or selective perception lead us to notice details that confirm our existing beliefs. This set of biases can be especially prevalent in teacher evaluation. If you like a teacher, you are more prone to find things that support how good he or she is, no matter their actual skill level.

In sum, because there is too much information in the world, our brains take shortcuts based on our likes and beliefs. This series of shortcuts can limit our ability to learn and to widen our perspectives. It is critical to recognize this issue and develop systems that help us manage the information overload. For instance, I have an email folder labeled "Interesting," where I file emails about subjects I have not encountered before, and I schedule a weekly time to read through them.

Problem 2: Not enough meaning
Our brains are fantastic tools that work to add meaning to everything we perceive and experience. To survive as educational leaders, we have to try to make sense of student actions, teacher lessons, and policy mandates that may not be as clear as we would like them to be. Without our knowing it, this information enters our brains, where we automatically work to connect the dots and fill in missing information.

This meaning-making often happens beneath our level of awareness and is driven by our cognitive biases. We know, for instance, that the recency illusion and insensitivity to sample size lead us to find or make up patterns even when data are sparse.

Alternatively, we see that through stereotyping or automation bias we often fill in characteristics and patterns about people or groups of people. For example, as a change-minded leader you may label certain teachers as resisters because they question the reasons for your changes. By placing this label on them, we may lose the opportunity to surface their ideas and design a stronger change.

In sum, because information is often missing and our brains like complete patterns, we typically fill in the blanks based on our beliefs and experiences. These cognitive patterns can limit our depth of understanding. During change initiatives, it remains critical to have a set of conceptual filters or written principles to keep coming back to so you can add depth of meaning.

Problem 3: Need to act fast
As leaders, decisions and information fly at us all the time, yet we cannot always be certain or take the time to make the best decisions. This problem often leaves us trying to understand the information as best we can, applying it to the situation, and predicting what may happen.

The problem of needing to act fast has been exacerbated over the past decade by the immediacy of our culture. This problem feeds right into many known cognitive biases. For instance, the information bias and belief bias shape our thinking to favor the simplest solution over complex, ambiguous ones, even if the simplest solution may be wrong or incomplete.

Similarly, to avoid making mistakes and preserve our autonomy, we often employ the status quo bias and social comparison bias. These cognitive biases speed up decisions in that they both lead us to reject new ideas almost outright.

In sum, without a clear way to make decisions or a process that slows our thinking down, we often assume we need to make a fast decision. Without realizing it, we dismiss or ignore newer ideas due to time constraints. That is why it is critical as a leader to recognize these biases, explain them, and set reasonable timelines for important decisions. Rushed decisions almost guarantee one of these biases will emerge.

Problem 4: What should we remember?
As human beings and leaders trying to lead schools in the 21st century, there is simply too much to think about and remember. Even when we try to remember things, it feels like our memory banks are overloaded. Our minds and memories have not evolved to meet the memory demands of our times.

Cognitive biases such as stereotyping or implicit stereotyping allow us to discard specifics and form generalities. These biases are a brilliant energy-saving process but lead to many unfortunate results.

To save space, other cognitive biases like suggestibility or the misattribution of memory make us edit and weaken critical details after the fact. These biases are why whole-group professional development is so problematic: we assume everybody has interpreted the same message in the same way.

In sum, without a great deal of intentional effort, we are often not in complete control of what we remember. Many cognitive biases compress memories into a kind of zip file to conserve space and energy in our brains. It is vital, then, as a leader to help your staff develop good memory storage systems that help them remember what is most important.

This problem applies just as much to you as the leader; it is almost as if you need to create a second brain just for storage. Tools like Evernote or Google Drive, when used well, can help solve this problem.

Conclusion
Our brains are amazing tools, yet they evolved during times that were far less information-rich than our own. They evolved to recognize and avoid danger, not to handle hundreds of emails daily.

Many of the biases discussed above evolved to save time and energy for the organ that demands them the most. They act as mental models or processes for handling the four major problems of our knowledge age. It is essential, then, that we recognize these four major areas of bias in our thinking and develop methods to counter them, allowing for deeper reflection and better decision-making.