Stanley McChrystal says character is the most vital leadership trait in the age of AI and polarization

Amid polarization, AI disruption, and eroding trust in institutions, retired four-star General Stanley McChrystal argues that what leaders need now more than ever is character. Head of the business consulting firm McChrystal Group, he has written a new book on character, drawing from his decades of experience. From AI ethics and modern warfare to hot-button issues like Signalgate and transgender service in the military, McChrystal explains why character is the foundation of lasting leadership. 

This is an abridged transcript of an interview from Rapid Response, hosted by the former editor-in-chief of Fast Company Bob Safian. From the team behind the Masters of Scale podcast, Rapid Response features candid conversations with today’s top business leaders navigating real-time challenges. Subscribe to Rapid Response wherever you get your podcasts to ensure you never miss an episode.

I wanted to ask you about the changes that are afoot in the military and the Department of Defense. Some folks champion the idea of change. Some folks make dire predictions. For you, who have worked with military and military leaders for a long time, what’s your perspective about what’s being attempted?

I would say first, if I go to 30,000 feet and look at it from a big distance, change is needed, change is appropriate. And I think it’s going to mean significant change, adoption of new technologies, changing of force structures, all of those kinds of things. All of that is correct. Even firing generals, if it’s necessary, is a good thing—if you are firing generals because they don’t have the skills or they don’t have the right personalities. So all of those things, I completely sign up for, and I wouldn’t recognize a lot of sacred cows that would be exempt from hard scrutiny.

Now, having said that, I am not aligned with how the current secretary of defense defines some of the current issues and the direction. He talks about the warrior ethos. But the reality is, what we are trying to do is get the best military we can, and that’s not necessarily the strict warrior ethos, because soldiers are a little different. Soldiers are disciplined. They follow the rule of law. When you think about a warrior, it usually carries a looser definition or interpretation.

So that I think is probably a mistake if you start to say, “You can’t have transgender soldiers.” My response would be: One, there aren’t many. And two, if a transgender soldier is really good, we need them. We don’t have that many extra people that are really good that we can afford [to lose]. So I have a much different definition of what a really effective service member might be. I think I do.

And then I also think that if you are judging military leaders on a political ideology, you’re playing with fire. And here’s why. We’ve had this extraordinary couple of centuries of the U.S. military being pretty apolitical, not always perfectly, but generally very apolitical. And although there’s friction between civilian leadership and uniforms, it’s one of the healthiest relationships that you’ve seen on the globe for 200 years.

Once you start to hire and fire senior leaders based upon their political alignment with any particular ideology, you are going to affect younger military leaders. They are going to shape their behavior. They’re smart people. They’ll look up, and they’ll say, “This is what it takes to succeed in this business,” and they will start to represent that behavior. And a decade from now, or two decades from now, we’ll have a very different kind of military, and we won’t like it. It will not be the apolitical, very professional force that I knew and that I think is largely the case today. So I think it’s understanding the danger of that dynamic that is really critical.

The issues around security—this Signalgate scandal using publicly available tools to communicate—is that really a big deal or is that sensationalized? 

One, I do think it’s a big deal. Signal, even though it’s encrypted, is not secure. And so you are transmitting future plans on an unsecured device, which is extraordinarily dangerous for the men and women who are going to go execute that operation. So I do think that was a big deal. It was almost a reflection of amateurism.

Now, the other side of it bothers me far more. We had the mistake. It comes out. Everybody knows it’s a mistake. They know that the information is extraordinarily sensitive, and they get up in front of cameras, and they say the information was not classified. Now they know that that’s not true. They know that’s a lie. It is classified, and yet they look at the camera and say something that maybe most Americans can’t parse. But anybody who’s involved knows that people whose salaries you and I are paying, in positions of great responsibility, consciously and intentionally don’t tell the truth to you and me. That’s a big problem, and that’s the far greater issue here. We can minimize the event that occurred as a mistake, but we can’t minimize the lack of integrity.

I’m curious how you look at AI’s potential impact on the military, and how do we know if we’re ahead or behind, especially in that competition with China?

Yeah, we’ve never had anything quite like this. The closest analogy in my mind would be nuclear power, atomic weapons, and we got them first during World War II; we won the race to produce nuclear weapons and then used them first. And when other countries followed us and developed their own nuclear weapons, we got this sort of balance.

The problem with artificial intelligence, and I’ve had the opportunity to do some work and a big war game on it, is that if somebody achieves artificial general intelligence before their competitors, theoretically they could then sprint ahead in a way that their competitors almost couldn’t catch up. And you could have a dominant superiority, and we’re not even a hundred percent sure what AI will do on the battlefield. We know it will make a lot of things simpler, faster, easier—logistics, planning, all those things—which will make an army more efficient. But as AI starts to do target discernment, autonomous engagement with weapons systems and robotics, we have an incomplete picture. Ukraine’s like a glimpse of the future. We have an incomplete picture of how dominant that will be.

So I don’t think there’s any time, except the pursuit of nuclear weapons, when this idea of losing the race could mean losing the war. And when you think of AI, you have to blur the lines we drew for many years between military power and diplomatic or commercial power. Those things are now so interwoven, because the ability to leverage AI in production and things like that could give a country a decisive advantage that immediately shows itself in the military sphere.

So I think two things have to happen. We need to be pursuing those kinds of regulations and understandings around the world that give us some opportunity to put rules and norms in place for AI. But we’re not close to it. And parallel to that, we need to be developing AI at a breakneck pace. Those seem in tension, in contradiction: it’s as if we were trying to develop new nuclear weapons and at the same time trying to set up rules to limit their use. But if we don’t get parity with AI, then we’re going to be in a position that’s extraordinarily dangerous. And that’s, again, not going to be just the military; it’s going to be this broader national effort.

And the topic of character that you’re so focused on and compelled by—today that applies to AI, too, and how we talk about it, whether in commercial uses or military.

Well, I would argue character becomes more important, because the power of the individual is dramatically greater than it was even 200 years ago. Think of the old saying about Samuel Colt, who created the six-gun: “God made man; Sam Colt made them equal.” He leveled the playing field for people who weren’t as big and strong, because they could have an effective weapon.

AI is going to do that and give extraordinary power not just to nation states, but to individuals. And so those people who have that extraordinary power, and almost all of us will have some form of it, have the ability to do great good or great evil. And so character, I think, is going to become more essential than ever.


Published May 14, 2025, 09:20:03
