Psychological test for leaders of AI companies


Decisions about the future of AI shouldn’t be made by “dark personalities”

Published 22 November 2023 in Leadership • 9 min read

Traits such as overconfidence and narcissism, which may lead tech leaders to make bold decisions and take risks in the pursuit of growth, are dangerous in the context of artificial intelligence.

The furor around OpenAI CEO Sam Altman being fired, and then finally reinstated, has brought into sharp focus the potential risks of allowing the leaders of the firms developing AI to make critical decisions about the future of a technology that may pose an existential threat to humanity.  

Notably, the board that fired Altman was made up of people who had no financial stake in the company’s success and were dedicated to balancing the potential benefits of AI against the risks.*

Altman is a controversial figure. Given his leading role in shaping the future of AI, his goals and decisions carry enormous weight, so his capacity to make good judgments, and that of other leaders of AI companies, matters a great deal.

So, I was particularly concerned when I read an account that Altman is a survivalist. He is quoted in The New Yorker as saying to the founders of one of his companies, “After a Dutch lab modified the H5N1 bird-flu virus five years ago, making it super contagious, the chance of a lethal synthetic virus being released in the next 20 years became, well, non-zero. The other most popular scenario would be that AI attacks us and nations fight with nukes over scarce resources… I try not to think about it too much… but I have guns, gold, potassium iodide, antibiotics, batteries, water, gas masks from the Israeli Defense Force, and a big patch of land in Big Sur I can fly to.”  

If true, this raises questions for me about Altman’s fitness to make hugely consequential decisions about the future of AI. Also, interestingly, OpenAI’s interim CEO Emmett Shear appears to have quite a different view of the right tradeoff between safety and speed, posting on X in September, “I specifically say I’m in favor of slowing down, which is sort of like pausing except it’s slowing down.” 

Sam Altman's return capped a tumultuous week that upended OpenAI, the firm behind ChatGPT.

Think about the future of AI as putting a potential weapon of mass destruction into the hands of people like Sam Altman. The consequences of the decisions he and others will make are orders of magnitude greater than the usual decisions made by tech CEOs. This should make us all very concerned about whether these people have the psychological fitness and wisdom to do the right things. This is especially true given the immense financial returns and power that can be accrued by pushing forward regardless of the risks.  

Dark personalities and tech firm leadership 

A recent summary of research on “dark” personality traits in CEOs provides important perspectives and raises huge concerns about who is making critical decisions about AI. These traits, which include excessive risk-taking, narcissism, Machiavellianism, and abusiveness, are described in the research as “dark,” not because they are “evil” in a moral sense, but because of their negative impacts on (often many) others due to overconfident risk-taking, pervasive self-oriented manipulation, and a lack of empathy for the effects of their decisions on others.  

These qualities have been studied extensively over the past two decades to understand their impact on organizational outcomes. This research has established that dark personality traits can have both positive and negative effects on company performance. The specific traits that have been explored include: 

Overconfident risk-taking – This trait can drive a CEO to push their company to high levels of performance, but it can also lead to unrealistic expectations, overcommitment of resources, and sometimes a disregard for the welfare of employees or other stakeholders. CEO hubris can fuel a stubborn pursuit of visionary goals and yield significant breakthroughs, but it can also blind a leader to the realities of the market and the organization’s capabilities, resulting in failure.

A CEO's willingness to take big risks may help persuade stakeholders to invest in an unproven technology.

Narcissism – CEOs with narcissistic traits may be more inclined to take bold actions, make big bets on innovative projects, and set visionary goals for their companies. Their self-confidence can be infectious, potentially inspiring employees and attracting investors. But these traits can also lead to risky decision-making, resistance to feedback, and volatile company management. 

Machiavellianism – CEOs with Machiavellian traits may be adept at navigating corporate politics and outmaneuvering competitors. They might excel in negotiations and strategic partnerships because they can manipulate situations to their advantage. On the downside, this could create a toxic work environment and lead to unethical business practices. 

Abusiveness – This describes leaders who regularly engage in behaviors intended to dominate, belittle, or otherwise cause distress to subordinates or colleagues. 

In the context of “normal” technology firms, where innovation and rapid growth are highly valued, these qualities can result in positive business outcomes. A CEO’s willingness to take big risks, for instance, may help persuade stakeholders to invest in an unproven technology. Narcissistic CEOs may excel in projecting a strong image of the company, attracting talent and investment. Machiavellian CEOs may know when and how to “bend the rules” to achieve their goals. Abusiveness is tolerated because of the magnitude of the business results these CEOs achieve. 

“It is essential to recognize that while dark personality traits can contribute to positive outcomes, they also carry substantial risks.”

It is essential to recognize that while these traits can contribute to positive outcomes, they also carry substantial risks. The success of CEOs with these traits in technology firms may depend on their ability to balance their more extreme tendencies with sound business judgment and the input of their management teams and advisors. The research highlights the double-edged sword of dark personality traits in leadership and suggests their impact is nuanced and dependent on the broader context within which they operate. 

Dark personalities meet nuclear weapons  

The potential benefits of having a dark-personality CEO at a tech firm include superior outcomes for the firm and financial returns for shareholders. The downsides are failed ventures and lost investments.

But the companies developing AI are not “normal.” Again, thinking about AI as a potential weapon of mass destruction can be instructive when considering what kinds of personalities we want to make decisions about their development and deployment.  

When it comes to nuclear, chemical, and biological weapons, this is not a hypothetical question. The US military has done a great deal of work to develop criteria and assessments to winnow out the people who should and shouldn’t have the keys to trigger Armageddon.  

Dark personality traits in CEOs are described in the research as “dark,” not because they are “evil” in a moral sense, but because of their negative impacts on others.

Consider the psychological fitness test below. Based solely on the first three sections, this assessment would eliminate CEOs with dark personalities from making decisions with potentially cataclysmic impacts. To put it another way, it may be less of an issue for dark CEOs to lead normal tech firms, but a huge problem to trust them with making decisions concerning artificial intelligence. 

How can we prevent potentially catastrophic consequences of AI being used in the wrong way by the wrong people? In a saner world, decisions about how far, and how fast, to go with AI would be subject to oversight by lighter and wiser personalities. With so much at stake, leaders found to have dark personalities should be barred from making consequential decisions at AI firms.

Illustrative psychological fitness test 

Note: This is not an actual instrument but illustrates the criteria used. In addition, it is set up as a self-assessment, while in practice, the people who access and control weapons of mass destruction are assessed by trained psychologists.  

Open AI leader test
What are the consequences of having socially aversive leaders in charge of AI firms? What would their score be if they were to undergo a psychological fitness test?
In real assessments like this, the items would be scored on a frequency scale of Never / Rarely / Sometimes / Often / Always.

Poor impulse control

  1. I am a risk-taker.
  2. I often make decisions on the spur of the moment.
  3. I struggle to wait for things I want.
  4. I find it challenging to work on long-term projects.
  5. I have made significant life decisions without consulting others.

Mistrustful traits

  1. I find it hard to trust people.
  2. I prefer to keep others at a distance.
  3. I often feel suspicious of others’ motives.
  4. People would describe me as cold or aloof.
  5. I have been told that I am untrusting.

Lack of empathy

  1. I tend to focus only on my own interests.
  2. I seldom think about the effects of my actions on others.
  3. I struggle to understand why people feel the way they do.
  4. Caring about others’ feelings is not important to me.
  5. I can’t easily put myself in others’ shoes.

Substance use

  1. I use substances like alcohol or drugs to relax.
  2. I have failed to fulfill my responsibilities because of substance use.
  3. I feel guilty about my substance use.
  4. Friends/family members have expressed concern about my substance use.
  5. I have tried to reduce my substance use but found I couldn’t.

History of aggression

  1. I have been involved in a physical fight as an adult.
  2. I lose my temper and become aggressive more than I would like.
  3. I have damaged property when angry.
  4. People have told me that I can be intimidating.
  5. I believe that, in some situations, aggression is necessary.

Mood disorders

  1. I have periods where I feel unusually full of energy.
  2. I experience significant mood swings without a clear reason.
  3. My mood affects my ability to make decisions.
  4. I have felt so down that I had trouble functioning.
  5. I have thought about ending my life.

Anxiety disorders

  1. I worry a lot about making mistakes.
  2. I am very anxious in social situations.
  3. I avoid certain places or situations because they make me anxious.
  4. I find it difficult to control my worry.
  5. I have had a panic attack in the last year.

Psychotic disorders

  1. I feel disconnected from reality at times.
  2. I have heard voices when no one was around.
  3. I sometimes believe that others can control my thoughts.
  4. I have beliefs that others find strange or unfounded.
  5. I see things that others do not see.

* Altman has been rehired as CEO of OpenAI with a new board. The best account so far about what led to his ousting is here and is worth reading as it provides more background on the safety vs. speed debate within the company and Altman’s apparent efforts to press for speed and stifle criticism.

Authors


Michael D. Watkins

Professor of Leadership and Organizational Change at IMD

Michael D. Watkins is Professor of Leadership and Organizational Change at IMD and author of The First 90 Days, Master Your Next Move, Predictable Surprises, and 12 other books on leadership and negotiation. His book The Six Disciplines of Strategic Thinking explores how executives can learn to think strategically and lead their organizations into the future. A Thinkers50-ranked management influencer and recognized expert in his field, his work features in HBR Guides and HBR’s 10 Must Reads on leadership, teams, strategic initiatives, and new managers. Over the past 20 years, he has used his First 90 Days® methodology to help leaders make successful transitions, both in his teaching at IMD, INSEAD, and Harvard Business School, where he gained his PhD in decision sciences, and through his private consultancy practice, Genesis Advisers.
