06.25.24
Kevin Bethune | Essays

Oh My, AI

Adobe was once a trusted partner for creatives across the industry. But when they quietly updated their terms of service earlier this month, requiring users to give the company access to their content, they lost the very thing that fueled their success.

It didn’t have to be this way.

I remember being in graduate school, looking for a way into the world as a digital creator. It was the height of Web 2.0, a time when I could imagine sharing my artistic creations with anyone in any country in the world in an instant. Then I heard about this thing called “Adobe Photoshop” from other creatives active online. They had a pretty singular story to tell: Master this, and you’re on your way.

They were right.

Photoshop did two things for me. First, I learned digital interaction patterns that would help improve my craft — everything from selecting pixels with marching ants to applying cool filters — and second, I found my people. I immersed myself in online communities that would share feedback with Adobe and each other. As a community, we learned together, grew together, and entrusted Adobe to become our industry standard.

It was this big, messy, diverse community of creatives that just delivered Adobe an object lesson in what happens when you treat your customers with contempt.

The controversy began on June 6, after Adobe updated its terms of service to require users to agree to give the company access to their content via “automated and manual methods” to keep using its software.

The oddly worded legalese just hit different, Jess Weatherbed explained in The Verge.

“The update went viral after creatives took Adobe’s vague language to mean that it would use their work to train Firefly — the company’s generative AI model — or access sensitive projects that might be under NDA.”

And in a flash, decades of goodwill hung in the balance.

“I can’t help but feel like this is the final nail for Adobe,” said one Reddit commenter. “They already aren’t in their user base’s good graces. I just don’t understand why every decision they seem to make is bad for the user. Like even visually. Everything is punishment with Adobe.”

Adobe, while not perfect, had long been instrumental in empowering creatives around the world. It operated as a virtuous feedback loop, as the creative community continually provided valuable — sometimes blistering — insights to make Adobe even more successful. Win-win.

I was also an enterprise customer when I worked as a leader in the global footwear product engine at Nike. Adobe’s consulting teams made sure we understood how to use the toolsets that would work for us — the standard ones for everyone to use, and the bespoke ones that met our specific needs. That custom work also helped them think about their future roadmap for enterprises and consumers alike. Again, win-win.

In every case, I was confident that what I created on the screen would be mine — or my client’s — a promise made more meaningful because it’s now so rarely true in tech. We all learned the hard way, with platforms like Google and Facebook, that free-to-use didn’t really mean “free.” We are, in fact, “the product.”

So, I had expected some level of co-creation, feedback, and reassurance to continue as the company trained its generative AI models — the very technology with the potential to negatively impact the community that uses its products. The idea that Adobe might have been leaving open the possibility that future AI features could copy our work without attribution — threatening creative IP, confidentiality, and ownership — was simply too much.

It took a few forward-thinking ethicists and activists to catch the update that few of us were likely to read on our own, and to sound a public alarm.

It cascaded quickly and predictably into a scrum of confusion, misinformation, and anguish. Employees were outraged; subscribers to Adobe’s Creative Cloud service canceled their licenses in protest. “No creator in their right mind can accept this,” posted Sasha Yanshin on X. “You pay a huge monthly subscription and they want to own your content and your entire business as well.”

Then on June 10, Scott Belsky and Dana Rao, the company’s chief strategy officer and chief trust officer, respectively, published a blog post clarifying the terms of service debacle.

“We’ve never trained generative AI on customer content, taken ownership of a customer’s work, or allowed access to customer content beyond legal requirements. Nor were we considering any of those practices as part of the recent Terms of Use update,” they wrote.

But many customers — myself included — remain wary.

And here is where the opportunity lies.

I challenge Adobe to trust the community that made their business what it is by co-creating guiding principles with us that will offer every creator some input and comfort as we collectively navigate the inevitable proliferation of generative AI in all facets of creative practice.

I’ll say it more plainly: Let’s do this together.

Using a customer-led approach, Adobe can become a leading voice in the ethical development of AI. (They’ve made some promising moves so far — see below.)

And y’all, we need one.

The output from this creator-centered alliance can inform everything they do as they roll out new AI capabilities — marketing, sales, onboarding, product development, user experience, and, yes, even a plain language version of their terms of service.

Who will set a new standard and speak up for what’s right? Who will lead the way? That is my question to the tech industry. Adobe’s missteps can be repaired over time if they are prepared to truly support the values and strengths of our creative community.

Adobe, don’t play business as usual — especially on this and especially with creators. There’s just too much at stake.

A version of this essay was originally published in the Equity Observer email newsletter.
