Adobe is diving—carefully!—into generative AI

Dall-E 2. Stable Diffusion. Midjourney. These tools all use generative AI to create images based on any text description you can dream up. They’re astounding. But the stuff they churn out can be more fascinatingly weird than wonderful. They’re also provoking ire (and, in the case of Stable Diffusion and Midjourney, legal pushback) from artists who don’t like the idea of AI taking their jobs, being trained on their work, and even mimicking their style.

Enter Adobe, the eternal behemoth of digital-imagery software. Like everyone else in the business of computing, it can hardly choose to avoid the generative AI boom. But as a company whose whole reputation rests on helping creative people produce high-quality results, it also has more to lose than a startup. An Adobe image generator must be capable of rendering imagery usable in a professional context. And if the underlying technology is legally or ethically questionable, it could damage the company’s relationship with its core customers.

Today, at its Adobe Summit conference in Las Vegas, the company is revealing Firefly, its first major piece of functionality based on generative AI. Debuting as a beta web-based service, it will also be integrated into Photoshop, Illustrator, and Adobe Express (and, eventually, appear in all relevant Adobe products).

[Image: courtesy of Adobe]

Like other examples of AI appearing under Adobe’s “Sensei” brand, Firefly is based on home-grown technology. I haven’t yet had the opportunity to try it for myself. But I did get a demo from Adobe CTO for digital media Ely Greenfield, who explained the company’s approach to the technology, which emphasizes thoughtfulness during a period when other AI titans are moving fast, breaking things, and mopping up afterward.

“We’ve been talking a lot to our customers, everyone from the creative pro to the enterprise to the creative communicator, about what they think about [generative AI],” says Greenfield. “And we think it can be incredibly empowering for them.”

To teach Firefly to create pictures, Adobe curated a training set based on its own vast repository of stock imagery as well as public-domain work and other material it knew it had the legal right to ingest. Along with allaying concerns about rights issues and preventing other companies’ brands from showing up in generated images, this process helps Firefly come up with visuals that are professional rather than just entertainingly bizarre. Even so, “it’s still a bit of a rolling-the-dice game,” says Greenfield. “Sometimes you get good stuff, sometimes you don’t. And that’s true with Firefly as well. Yes, we can generate beautiful high-quality content because of what we’re training on, but you’ll get the occasional extra finger or limb.”

[Image: courtesy of Adobe]

While typing stream-of-consciousness prompts into existing tools such as Dall-E is loads of fun, Adobe gave Firefly an interface that emphasizes workaday productivity. Along with typing in a free-form request, such as “highly detailed llama,” you can click on thumbnails to specify elements such as the desired content type (Photo, Graphic, Art), style (Synthwave, Palette Knife, Layered Paper, and beyond), and lighting (such as “dramatic”). There’s also a typography option that lets you request items such as “the letter N made from green and red moss.”

Firefly in its beta form may prove more palatable than its existing competition, but it won’t address every concern artists have on day one. Adobe says it’s working on ways to pay creators whose images are leveraged by Firefly, including contributors to Adobe Stock. It’s also introducing a “do not train” tag that artists can embed in their digital work’s metadata. That could be a step toward reassuring artists who don’t want their creativity sucked up into algorithms, especially if it’s adopted and honored by everyone else who’s training AI.

Firefly is the single most obvious piece of generative AI functionality that Adobe could build. It’s also just a starting point. “This is the first model that we’re delivering,” says Greenfield. “It will be the first of many in the family.” Already in the works: applications of the tech for video, 3D, and—since the company is a major player in marketing technology—ad copy. Even if Adobe isn’t one of the first tech giants that springs to mind when you think about AI’s transformative potential, the long-term impact on its sprawling portfolio of offerings could be just as profound.

https://www.fastcompany.com/90868402/adobe-firefly-generative-ai-photoshop-express-illustrator?partner=rss&utm_source=rss&utm_medium=feed&utm_campaign=rss+fastcompany&utm_content=rss

Published March 23, 2023, 01:20:45
