Runway’s AI can edit reality. Hollywood is paying attention

AI tools are disrupting creative work of all kinds, and Runway AI is a pioneer in the space—making major waves in Hollywood through partnerships with the likes of Disney and Netflix. Runway’s cofounder & CEO Cristóbal Valenzuela dissects the company’s breakneck growth, the risks and responsibilities of AI tool makers, and how AI is redefining both business expectations and our notion of creativity. 

This is an abridged transcript of an interview from Rapid Response, hosted by the former editor-in-chief of Fast Company Bob Safian. From the team behind the Masters of Scale podcast, Rapid Response features candid conversations with today’s top business leaders navigating real-time challenges. Subscribe to Rapid Response wherever you get your podcasts to ensure you never miss an episode.

You released your Gen-4 model not long ago. You had your Aleph video editing tool come out.

Correct.

And there are these other tools out there too now, Google’s Veo 3 which I see folks using. Of course, there’s OpenAI’s Sora, Midjourney. What’s the difference between all these? I mean, are you all utilizing similar engines, or are all these things popping up now because the compute has reached a certain place?

It’s a combination of things. I mean, we’ve been working on this for almost seven, eight years, so there’s a lot we’ve learned from building this largely on our own. I would say these days it’s becoming more evident to many people that the models are getting pretty good at tackling a lot of different things, and so that becomes interesting for obvious business reasons. All models are different. I think all models are trained for different reasons. We tend to focus on professionals and folks who want to make great videos. This amazing model we released only recently, just a couple of weeks ago, allows you to modify and create video using an existing video. That was never possible before. And so those kinds of breakthroughs are just allowing, I guess, way more people to do much more interesting things.

I saw you in another video use voice prompts to create a video scene. Your tools generate camera angles and change objects. They extend a scene outward, filling in what isn’t there. In one video, we see a cityscape, and then street lamps come on, and the windows of office and apartment buildings start blinking, and the lights are switching on and off in this very choreographed sequence. Can you explain how that was created?

It took us less than an hour to make that video. You start with a scene, an initial video, and then you ask Runway for things you want to change in that video. So if it’s daylight, we can ask the model, “Just show me a night version of that same scene.” What the model will do is understand what’s in that scene and turn down the light, metaphorically but also literally: it will turn day into night while maintaining consistency for pretty much everything else. It might turn on the streetlights. And you can be much more specific. You can say, “Only turn on the lights on the left,” or “Only turn on the streetlights while keeping everything else dark.” You can say, “Now start turning the lights on one by one, from the one on the left to the one on the right.”

So in a way, it’s editing reality. Maybe you can think about it like that. You have an existing piece of content and you’re working through that content with AI, asking it to modify it in whatever way you want, which is really fun, to be honest. It’s something I think we’ve never had the chance to do before, and so it’s really fun to play with.

I’ve played with Runway a little. It’s awesome, but I can’t write a single natural language prompt and get a full film yet. I mean, there is craft and discipline to getting these tools to work at their potential. Are we going to get to the point where all you need to create a film is the idea for it, the vision, and the production itself is all automatic?

I think a key concept in what you mentioned was tools. This is a tool. It’s a very powerful tool, and it allows you to do things that you couldn’t do before. Knowing how to use the tool will always be important, and the tool is not going to do the work on its own if you don’t know how to wield it, how to use it in interesting ways. So for the question of whether we’ll ever get to a point where you can just prompt something and get exactly what you want, I guess the answer is kind of-ish. It depends on how well you know how to use the tool.

I think about what tools people are using today to make films, like a camera. Can a camera help you win an Oscar? Of course. If you have a camera, will you win an Oscar? No. What makes a great filmmaker is knowing where to point the camera, knowing how the camera works and functions, and how you can tell a story with it. And I think that’s no different from how we think about AI tools, and Runway specifically: it can help you go very far. You can do amazing things with it. You just have to learn how to think with it and work with it. And if you do, then you’ll get far.

You mentioned work that you do with studios in Hollywood. I know you’ve partnered with Netflix and Disney and AMC networks and whatever. How are they using Runway’s AI today? Because AI can be a little bit of a dirty secret in Hollywood. People are using it, but they don’t always want to admit it.

Yeah, I think it’s a tool: that’s the answer. The best studios and the best folks in Hollywood have realized that, and they’re using it in their workflows, combining it with other things they know pretty well. The thing is, there are no rules. You can start inventing them right now. I mean, Aleph is only a couple of weeks old, and people are already figuring out ways of using the technology that we never thought possible, and that’s what I enjoy the most. It’s a general-purpose technology. It can be used in ways that are diverse and creative and unique, and if you’re creative enough, you’re going to uncover those things.

At some point in the future, there may be a whole different medium in the way you do it. Right now, I can imagine they take ideas and create essentially a prototype of a film to show, to get ideas greenlit. Is that part of how it’s used?

You can think about it, broadly speaking, in two stages. There’s preproduction and postproduction. Preproduction is writing the script, doing art direction, selecting characters, casting, location scouting, just preparing to make the stuff. And so there are many use cases for Runway in there. Of course, the obvious ones are storyboarding, helping you with writing the script, and helping you with casting characters and seeing how they’re going to behave and what they’re going to do.

And then in post, once you film or record something, there are a lot of visual effects and changes you need to apply to the videos themselves. Take the example we were speaking about before, turning day into night. Let’s say you’ve recorded something, and then someone changed the script so the shot you recorded now has to happen at night. The way you would do it before was to go back and shoot again, spend more time, fly the actors in again, and do the whole thing. Now you can go into Runway and just ask the model to turn that scene into night, and it will do it for you. So it’s less of them coming to Runway and typing, “Get me a multi-award-winning film now, fast and cheap,” and more of, well, I have this problem, it’s very expensive to solve, and I have a tool now that can help me do it faster and better. Can I use it? Will it make my movie? No, but it will help you very much in getting there faster and cheaper.

https://www.fastcompany.com/91393242/runways-ai-can-edit-reality-hollywood-is-paying-attention

Published August 27, 2025, 13:50:11

