
by Martin Fellenz • Published July 15, 2025 in Geopolitics • 9 min read • Audio available
Domestic and international political dynamics, technological acceleration, and an era of poly-crisis and perma-crisis create events whose consequences no one can foresee or fully understand. The existing playbooks – economic, strategic, organizational, or interpersonal – that executives use to guide their actions no longer offer sufficient direction. Similarly, the mental models they use to make sense of new realities prove increasingly inadequate. Decision-makers are told to “expect the unexpected”. That may sound good, but it is profoundly unhelpful and logically impossible: as soon as we expect something, it is no longer unexpected. More fundamentally, trying to conceive of frame-breaking events and their consequences is an exercise in prediction, and research shows that humans are not good at prediction. Our brains are designed to find regularity. We (over)simplify, rely on often inaccurate memory, and trust previous experience. All of this leads us to expect and predict stability and continuity.
The more expertise we build up, the more we value our understanding of the world. We trust our intuitive ability to make sense of and understand the world and how it is evolving. At the same time, we have less and less of an incentive to question our existing mental models because doing so challenges our sense of our identity, expertise, and value. In volatile, changing, and complex circumstances, expertise can become a liability that is hard to overcome.
For this reason, I suggest that individuals, teams, and organizations wanting to improve their ability to operate in complex and changing environments learn to “unexpect the expected”. To do this, we must become aware of our existing mental models, develop a different way to perceive and interpret uncertainty, and – maybe most importantly – replace an implicit performance orientation with a deliberate and committed learning orientation in our engagement with the world. The first step is to let go of our need for certainty and actively embrace curiosity.
Letting go of familiar mental models is uncomfortable because it is cognitively demanding and emotionally challenging. It often feels like admitting we are wrong. Suspending our expertise to adopt the mindset of a novice learner can be especially difficult for experienced professionals and requires courage. Leading others through this shift demands strong, empathetic leadership.
In rapidly changing environments, success is less a function of past knowledge and more about how quickly and accurately we can learn. In such conditions, clinging to what we already know is more dangerous than exploring and embracing what we have yet to fully understand. That’s why learning through testing and disconfirmation – actively challenging our assumptions – is essential. So, how can we do that?
One of the most effective ways to challenge entrenched thinking is to deliberately adopt a skeptic’s perspective – a disconfirming stance. This doesn’t mean being negative for its own sake. It means insisting on high standards of evidence before accepting something as true, even things we have learned in the past. In medicine, for example, the approach known as differential diagnosis helps distinguish among multiple plausible diagnoses and indicates tests that can determine the true cause of a patient’s symptoms. In any organization that has developed the capacity for disconfirmation, you will regularly hear people say, “Have you tested this?” or “What does the data say?” This reflects the core logic of the scientific method: insisting on quality data and testing rather than defending ideas. This scientific mindset builds intellectual robustness. In business, we too often neglect this discipline, preferring confirmation over challenge. We must work hard to reverse that tendency.
Diverse teams (across age, background, culture, training, etc.) bring different mental models, which is a strategic advantage, especially when facing novel challenges. However, inclusion is not enough. Minority perspectives must be actively amplified. If they don’t exist, they can be deliberately introduced through roles such as “devil’s advocates” or debates designed to surface dissent before consensus ossifies. These interventions only work if there is psychological safety. Individuals will not voice dissent if they fear it is socially or professionally risky. In addition, when high-status team members signal a preferred view, alternatives will quickly fade. Real learning requires a climate in which speaking up is not just permitted but enabled, expected, and supported. This can be signaled with evidence events to which everyone is invited and expected to bring different and challenging evidence. The more that senior members publicly show they are open to changing their minds, the more such cognitive diversity can take root across the team and organization. This is how diverse thinking can inform collective sensemaking.
Counterfactual reasoning forces us to step outside current assumptions. “What if” questions challenge accepted facts or logic and create space for alternative explanations. For example: “What if our access to a critical resource drops by 30% for the next 18 months?” These provocations may initially seem unrealistic, but the more outlandish they feel, the more they push us to test and revise entrenched models. In real life, we can extend this logic with red-team/blue-team exercises common in cybersecurity, where a red team attacks existing security arrangements to test assumptions and find weaknesses. Alternatively, in new product development, instead of asking, “What can we learn from the needs and wants of our customers?” we ask, “What can we learn from those who choose not to buy from us?”
We often default to linear cause-and-effect reasoning, but new insights can emerge when we reverse that logic. Instead of asking how X causes Y, ask, “What if Y causes X?” For example, instead of assuming overeating causes obesity, ask, “What if obesity leads to overeating?” These inversions don’t need to be correct to be useful because they help expose blind spots, uncover assumptions, and broaden how we interpret complexity. Charlie Munger’s inversion approach, for example, urges us to consider not just how to achieve an objective but how to avoid its opposite. It can also help to ask questions that turn symptoms into causes (as in Toyota’s “five-why” root cause analysis: “Why did the machine fail? A tripped fuse. Why did the fuse trip?”), but fully flipping the causal direction is a more powerful way of challenging automatic thinking patterns and identifying hidden assumptions.
Reframing a problem in a broader context or a longer time horizon can loosen the grip of existing models. Techniques such as scenario planning, future-back reasoning, “multiple states of the world”, and “pre-mortem” analysis all shift the interpretive frame. They reveal hidden assumptions and open space for new interpretations. While pre-mortems can be used for specific issues such as new product launches, projects, or startups, these approaches deliver useful insights for issues of all sizes. Take scenario planning: Event 201 was a comprehensive coronavirus pandemic simulation conducted with various stakeholders at Johns Hopkins University in October 2019, three months before COVID-19 struck. To deploy insights arising from different timeframes, Singapore plans its land use and transportation development with three parallel and interconnected plans on 50-year, 10 to 15-year, and short-term horizons. Each perspective informs the others to maximize the value of investing limited resources. A different way to change the scope is to evaluate a current solution as if from 10 years in the future, or in the present with a very different set of stakeholders. “Zooming out” offers a highly effective means of fundamentally changing perspectives and exploring different truths.
Waiting for outcomes to review what worked is too slow in fast-moving, uncertain environments. We can adopt ongoing-action reviews (OARs) by borrowing from after-action reviews in healthcare and the military. These involve making short-term predictions based on current understanding, tracking specific indicators, and updating mental models as new data emerges. The US Joint Special Operations Command instituted a “Team of Teams” approach to share close-to-real-time information across many otherwise siloed units. This evolved into an interactive forum for flagging problems, learning from mistakes, and identifying high-impact responses and solutions. The focus of such OARs is continuous adjustment, not retrospective analysis. To ensure OARs have the impact needed, some project firms have instituted governance structures for learning, not just for progress and performance review. Yet this is also important for the adaptive work required by non-project units and systems in times of change, complexity, and uncertainty.
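The prediction-tracking loop at the heart of an OAR can be made concrete. Below is a minimal sketch in Python, not a tool the author describes: the team logs short-term probabilistic forecasts for specific indicators (the indicator names and probabilities here are hypothetical), records outcomes as they arrive, and scores its calibration with a Brier score so it can see when its mental model is drifting.

```python
from dataclasses import dataclass, field

@dataclass
class OngoingActionReview:
    """Track short-term predictions against outcomes to surface model drift."""
    records: list = field(default_factory=list)

    def predict(self, indicator: str, probability: float) -> None:
        # Log a forecast before the outcome is known.
        self.records.append({"indicator": indicator, "p": probability, "outcome": None})

    def resolve(self, indicator: str, happened: bool) -> None:
        # Record the outcome for the oldest unresolved forecast on this indicator.
        for rec in self.records:
            if rec["indicator"] == indicator and rec["outcome"] is None:
                rec["outcome"] = happened
                return

    def brier_score(self) -> float:
        # Mean squared forecast error: 0.0 is perfect, 0.25 matches coin-flip guessing.
        resolved = [r for r in self.records if r["outcome"] is not None]
        return sum((r["p"] - float(r["outcome"])) ** 2 for r in resolved) / len(resolved)

oar = OngoingActionReview()
oar.predict("supplier_delay_next_30_days", 0.8)
oar.predict("churn_spike_after_price_change", 0.3)
oar.resolve("supplier_delay_next_30_days", True)
oar.resolve("churn_spike_after_price_change", False)
print(round(oar.brier_score(), 3))  # (0.2**2 + 0.3**2) / 2 = 0.065
```

Reviewing the score over a rolling window turns "are we learning?" from a feeling into a number: a score creeping toward 0.25 signals that the team's current model is no better than guessing.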
Micro-experimentation – running fast, low-risk tests to gather real-time data – is even more dynamic. While not suitable for all strategic questions, this approach is invaluable for probing emerging realities and testing hypotheses in complex systems. Insights may be incremental, but their value compounds quickly. Such an approach allows broad engagement across the organization and provides high-quality data that can be shared to inform decision-making. One large financial services firm that introduced micro-experimentation fundamentally changed decision-making across many parts of the organization. Rather than relying on expertise, tradition, and hierarchical power (the proverbial HiPPO: the highest-paid person’s opinion), micro-experimentation created the expectation that those with the best data should sit at the decision-making table. This led to a democratization of decision-making, hugely energized people across all parts and levels of the organization, and helped create a much more future-oriented, adaptive culture.
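A micro-experiment of the kind described above often reduces to comparing two small samples and asking whether the observed difference could be chance. The sketch below (the conversion numbers are invented for illustration, not drawn from the firm in the text) uses a simple permutation test from the Python standard library, so "what does the data say?" has a concrete answer.

```python
import random

def permutation_test(control: list, variant: list, n_iter: int = 10_000, seed: int = 0) -> float:
    """Two-sided permutation test on the difference in means (e.g., conversion rates)."""
    rng = random.Random(seed)
    observed = abs(sum(variant) / len(variant) - sum(control) / len(control))
    pooled = control + variant
    extreme = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)  # reassign outcomes to arms at random
        a, b = pooled[:len(control)], pooled[len(control):]
        if abs(sum(b) / len(b) - sum(a) / len(a)) >= observed:
            extreme += 1
    return extreme / n_iter  # share of random reshuffles at least as extreme

# Hypothetical micro-experiment: 200 users per arm, 1 = converted, 0 = did not.
control = [1] * 22 + [0] * 178   # 11% conversion under the current design
variant = [1] * 38 + [0] * 162   # 19% conversion under the new design
p_value = permutation_test(control, variant)
print(f"p = {p_value:.3f}")  # a small p-value suggests the lift is not chance
```

The appeal for decision-making is that the test makes no distributional assumptions and is cheap to rerun as each new batch of data arrives, which suits the fast, low-risk cadence of micro-experimentation.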
These approaches share a common goal: to test, refine, and where needed change our mental models, not to reinforce them. Yet knowing how to evolve our mental models is only half the battle.
Decision-makers must shift from interpreting new developments through outdated frameworks to actively questioning those very frameworks in light of what is unfolding. Sensemaking – interpreting uncertain contexts to enable constructive action – is no longer a solitary cognitive task but a core leadership responsibility. Our ability to “give sense” to others depends on our willingness to release outdated assumptions and remain open to the signals that the emerging reality is sending. Doing so collectively demands emotional attunement as well as cognitive agility.
Leaders must be attentive to how others respond to having their thinking challenged. Moments of resistance – emotional, tense, or even hostile reactions – often signal that we are touching deeply held and rigid beliefs. These strong reactions are not primarily driven by facts but by perceived threats to identity or ideology. When people trust the evidence, they rarely react that strongly. When they don’t, the reaction is often a clue that we’ve stumbled onto a core assumption worth exploring further.
Responding to these reactions requires a balance of persistence and empathy. Shifting mental models isn’t just about providing better arguments; it’s about building trust, creating psychological safety, offering a sense of belonging, and strengthening the belief that the person, team, and organization involved can cope and deliver even in hugely complex and dynamic circumstances. Leadership in uncertainty is as much about emotional connection as intellectual clarity.
One of the most effective ways to support this shift is to involve others in the process of joint sensemaking. By modeling openness, curiosity, and a willingness to change our views (in other words, by modeling genuine humility), we invite others to do the same. This is leadership by example, not through providing certainty but through facilitating a learning orientation and shared exploration.
If we are to “unexpect the expected,” we must test not only our assumptions but also support others in doing the same. Outdated mental models must give way to evolving, co-created ways of understanding and acting in the world. The future will not reward those who cling to what they once knew; it will reward those who remain curious, adaptive, humble, and willing to learn together.
Martin Fellenz is Affiliate Professor of Leadership and Organizational Behavior at IMD, where he co-directs the Negotiating Value Creation program. He was previously Full Professor in Organizational Behavior at Trinity College Dublin and has received numerous institutional, national, and international awards for his impactful teaching. He is an expert in leadership development, organizational transformation, cultural change, and organizational design.