
Ellen McGirt, Jessica Helfand|Audio

September 11, 2024

S11E10: Activism in AI with Google’s Ovetta Sampson

Ovetta Sampson is the Director of User Experience Machine Learning at Google. In December 2023, Business Insider named Ovetta to its AI 100, a list of the 100 most influential people working in artificial intelligence. She is a refreshing addition to that list: Sampson approaches her work with generative AI and machine learning as an activist, with a commitment to humanity and ethics.

In the final interview episode of the 11th season of DB|BD, Jessica Helfand and Ellen McGirt sit down with Ovetta to talk about why awareness of how AI is made is the first step towards holding it, and the people who make it, accountable. 

“There are more ethical ways that we can make AI, but if you don’t know how it’s made, you can’t be an activist for those alternative ways, right?” Sampson says. “And it’s really hard. There are definitely Joy [Buolamwini] and Timnit [Gebru] and some amazing other Black women out there, Black women in AI, who are fighting this fight. And I happen to work at one of the largest tech companies in the world trying to fight this fight internally. But it’s really important that folks on the other end of these model outcomes join us. Join us in really asking questions about your data.”

Sampson also shares more about her mantra “Skynet not yet,” why we all should have an expectation that our data will be used responsibly, and how her dad’s Commodore 64 launched her programming journey.

On this season of DB|BD, co-hosts Jessica Helfand and Ellen McGirt are observing equity by highlighting the “redesigners” — people who are addressing urgent problems by challenging big assumptions about how the world can and should work — and who it should work for. 

This season of DB|BD is powered by Deloitte. 

More from Ovetta Sampson via her website.


Follow The Design of Business | The Business of Design on Spotify, Apple Podcasts, or your favorite podcast app.

Transcript

Ovetta Sampson There are more ethical ways that we can make AI. But if you don’t know how it’s made, you can’t be an activist for those alternative ways, right? There are definitely Joy and Timnit and some amazing other Black women out there who are fighting this fight, but it’s really important that folks on the other end of these model outcomes join us. Join us in really asking questions about your data. 

Ellen McGirt Welcome to the Design of Business. 

Jessica Helfand The Business of Design. 

Ellen McGirt Where we introduce you to people from all over the world, from different industries and disciplines. 

Jessica Helfand Who are here to talk about design, business, civility and the values that govern how we work and live together. 

Ellen McGirt This season, we are observing equity. 

Jessica Helfand I’m Jessica Helfand. 

Ellen McGirt And I’m Ellen McGirt. This episode of The Design of Business | The Business of Design is powered by Deloitte’s DEI Institute. Deloitte believes that bold actions can help drive equitable outcomes, and conversations like this can fuel the change needed to continue to build a more equitable society. Visit Deloitte’s DEI Institute site at deloitte.com slash U.S. DEI Institute for more of their research and perspectives on equity. Later on, we’ll hear from Kwesi Mitchell, Deloitte’s Chief Purpose and DEI Officer. 

Jessica Helfand Ellen, can you believe this is our last episode of the 11th season of DB|BD and a whole season of redesigners at that. We are going to be back in a few weeks with a new episode reflecting on the season, but this will be the final interview episode of season 11. How do you feel about it? 

Ellen McGirt Yeah, I just feel like we’ve been through something together. There have just been so many gems, but I just want to say thank you. You are a wonderful partner, a wonderful friend, a wonderful thinker. And by the way, Alexis Haut, who is listening: thank you as well. 

Jessica Helfand Oh my goodness. The scripts, the help, the research, all the things. It really does take a village, and we have a wonderful village here at Design Observer, led by our incredible producer Alexis. And we have the perfect guest to end this season, someone who really sits at the intersection of business and design, and, I would say, the future. 

Ellen McGirt I agree. She’s Ovetta Sampson, the Director of User Experience Machine Learning at Google. Don’t worry, she’s going to explain exactly what that means. But what you should know up front is that she’s one of the most influential people currently working in AI and machine learning. 

Jessica Helfand And it isn’t just us bestowing her with that honor. In December of 2023, Business Insider named her to their list of the top 100 people in AI. 

Ellen McGirt And honestly, I am so thankful that Ovetta is on that list and you all listening will be too when you hear what she has to say. She’s approaching her work with generative AI and machine learning with a commitment to humanity and ethics that, as we all know, has fallen by the wayside in developing some of these technologies. 

Jessica Helfand Exactly. Ellen, Ovetta’s personal design motto is “to amplify the beauty of humanity with design while avoiding practices that exploit its fragility,” which is beautiful. But I have to say, another line on her website must be quoted, which I just loved: “If your mother says she loves you, get a second source.”

Ellen McGirt I think that makes perfect sense. They’re both gorgeous, and it makes her such a special person in her field that UX is her third or fourth career. That “if your mother says she loves you” line, that’s a journalist talking, right? I’ve almost lost track. But she started her career as a journalist, reporting on the child and family beat, and also around the world. She’s been a design researcher and a web designer. She got a master’s in UX in 2016 because she wanted to build useful products for seniors like her mom and a friend who is hard of hearing. 

Jessica Helfand It’s wild to me, and maybe to you, Ellen, and I’m certain to our listeners, that she’s been in the UX field for less than ten years. She’s already held UX leadership positions at Capital One, Microsoft, and IDEO, and she teaches a series of courses on the integration of machine learning and design at DePaul, where she actually got a master’s degree in 2016. This is somebody who just does not stop. 

Ellen McGirt She really has emerged as a force as the field she’s in has been developing around her, and nobody does it quite like her. She is really honest, and we feel so lucky to close out our season with her. In this episode, you’re going to get to hear her talk about why her mantra is “Skynet, not yet,” why we should all have an expectation that our data will be used ethically, and how her dad’s Commodore 64 launched her programming journey. 

Jessica Helfand Listeners, you are in for a treat. It was a real treat for us as well. Here, with no further ado, is our conversation with Ovetta Sampson, the Director of User Experience Machine Learning at Google. 

Jessica Helfand I want to start by asking: somebody like you, my goodness, where to begin? You have done so many things over the course of your young life. You were a journalist, a researcher, a data scientist, a UX designer who was a major player, let’s be honest, in the AI revolution. An early adopter, an early contributor. It sounds like your love of technology and research and programming started when you were only nine years old, with a Commodore 64, in the basement with your dad. Can you tell us a little bit about that story? 

Ovetta Sampson Yeah, it’s so funny, because I have always thought of my life like I’m that silver ball in a pinball machine that just kind of gets knocked around. And sometimes I get in the hole and score, and other times I just get bounced around. But as I look back at my life, it’s more strategic than that. And I want to give the universe a little bit more credit for my achievements, as well as myself, and my parents, and my family, and my support system. It’s not willy-nilly. And I think of those moments with my father, who was a young dad who went to Vietnam, and he was poor and Black and had to do multiple tours there. And so he was overseas when my mother, my birth mother, died. And so he came back and had this two-year-old he had to raise. And I’ve always been trying to have a connection with my father. And when I was growing up, around fourth grade, he was going to night school, because he had wanted to be a computer scientist, but he had to stop all of that to raise me. And so I would just kind of sneak downstairs into the basement while he was studying computer science, and he was studying Fortran and COBOL. And so I said, I want to study Fortran and COBOL, and I do not recommend it. And so he bought us a Commodore 64 so that we could connect over coding. And, you know, I wasn’t deep into coding, but it was this magical feeling where you could type some syntax into this machine and have this amazing outcome. And I would never really forget that. That felt really powerful, gave me a sense of agency, but also creativity. 

Ellen McGirt And I will say that imagination and creativity are the things that we are now looking to you for, to help us navigate this really strange and terrifying new world, the world of generative AI, which is why your mantra is “Skynet, not yet.” You know, don’t be afraid, we are not going to be replaced, and I appreciate that. But before we get to all of that, let’s go from you and your dad and the Commodore 64 to Google, where you are today. What was that journey like? 

Ovetta Sampson Well, I would say it was semi-designed. I would not say it was holistically charmed. I think there were a lot of obstacles and barriers on the path that I took to Google. Not so much because I did something wrong or whatever, but just because of life circumstances. And that includes institutional racism, sexism, all those things. I mean, the high school that I went to didn’t really have a computer program, so I focused on being a writer. And really, when I became a writer, it was really about activism and really changing the narrative of my neighborhood on the South Side of Chicago. At that time, in the 80s when I was growing up and in the early 90s when I graduated from high school, Chicago had a very violent and kind of negative reputation. But in my sense, it was a lovely, amazing community to grow up in. I think my foray into writing was really about activism, because there was a story in the main paper, the Chicago Tribune, depicting my neighborhood as this gang-infested, violent place. And I was like, wait a minute, that’s not my lived experience. Right? At 13, I kind of wrote a letter, and I was just like, your narrative about my neighborhood is inaccurate. And that got me thinking about data and how, for people of African American descent, and in Chicago, people who live in poverty or poverty-stricken areas, their narrative is written for them by other people. I thought, how can I earn a living and write and be an activist? And journalism came to mind. And so I spent 20, 22 years writing, both internationally and domestically, about people like myself who were struggling with poverty, with sexism and racism, and even, internationally, with war and religious persecution. And I started writing about how humans treat other human beings and how we could treat each other better, right, and in different ways.
How we can solve problems that are perennial with community organizing and community design, by listening to the communities that are affected instead of thinking that we know what they need. And I got to a point where I was like, I’m really tired of writing about problems. How can I solve them? And so I went back to school at age 38 to get my Master’s in Computer Science degree, because I felt like that was a chapter unfinished, you know? And it was me and a bunch of millennials learning about design. And I got my Master’s degree in Computer Science with an emphasis on Human-Computer Interaction from DePaul University. 

Jessica Helfand I want to go back for a minute, if I may, to what you said about being a journalist. Because I think more than just being a writer, what I’ve learned from my friend Ellen McGirt, and others like her, although no one’s really like her, so what I’ve learned from Ellen is this: journalists, good journalists, journalists who are fair and ethical, which is exactly where we are in this conversation, ask penetrating questions, difficult questions, and really honest questions. And I think there’s something about questioning that leads you to the search for data, the search for algorithms, a search for non-biased algorithms, an understanding of how people correlate data to their lived experience, that is so fascinating as a journey. I mean, most people who talk about data analytics, you could just, you know, fall asleep, it’s so boring. You talk about it, coming from your experience and your life, and it illuminates something that is so missing from the conversation. How do you take that experience and apply it to what is really an abstract expertise? Right? So you are looking to solve a specific problem, but the nature of understanding other people’s lived experience, how does that correspond to AI, which is so speculative and so open-ended, and, I think, so confusing for many of us? 

Ovetta Sampson Oh wow. What a great question. So let’s go back to the human algorithm. I’m famous for saying in all my presentations that all data is created by people and all people create data. And so the first step, to your question, Jessica, which is a wonderful question, of applying lived experience to what seems abstract, is to understand that there is no AI without us. 

Jessica Helfand Right. Right 

Ovetta Sampson Data is the love language of AI. 

Jessica Helfand Right. I love that.

Ovetta Sampson And we create data. And this is something that I like when I talk about activism, the first step to activism is awareness. So the first step that people need to understand is without our human behavior, there is no algorithm. Without our human action, there is no AI. Without the data that we create as humans, there is no enterprise AI or problem solving AI or AI that can be generated. 

Jessica Helfand Can I ask a really quick follow-up question? It seems to me that so much of what we read about artificial intelligence is about optimizing an outcome. And optimizing an outcome feels, to me, so quintessentially opposed to what you’re talking about, which is messy, which is idiosyncratic, which is human, which is beautiful and complex. How does the machine, and the machine language and the large language models, how do they reflect the humanity that you’re describing to us so beautifully? 

Ovetta Sampson Yeah. Well, let’s talk about a word that I used before. When I was nine years old, I was really good at writing. 

Jessica Helfand You’re still probably really good at writing. 

Ovetta Sampson I had just discovered my writing gene, and I had just discovered my coding gene. And the word that connected them, you have to remember, is syntax. Coding is just syntax. It’s all writing and communication. It’s just syntax. Right? And so, when you ask how we connect the humanity that I’m talking about to this optimization, this kind of automation culture: we have to be aware of how models are made. The first thing is that there is no model without a person. That’s the data that the model uses. Now, how does that data get collected, transformed, and engineered? Another person, right? A lot of data goes into those really beautiful autonomous vehicles. The Snorkel AI CEO talked about how he hires for meritocracy and all that. Well, he’s also exploiting workers in Kenya and India who are labeling all of that movement, cars, and whether a window is a window or a laptop. Machines just don’t automatically know that. So there’s a lot of human work and effort and labor that goes into this seamless, automatic world that we’re creating. But we can’t forget that. Again, all data is created by people and all people create data. So AI always starts and ends with humanity. Now, whether AI will start and end with humanity on good terms is up to us. 

Ellen McGirt It is up to us to a point. And if you were in every room that these decisions were being made, I would feel better about them. 

Ovetta Sampson Yeah! Because you do not have to exploit human beings to collect data. 

Ellen McGirt You don’t. But this is the business. That’s why I’m shouting capitalism. Yes. That’s right. 

Ovetta Sampson Right. We do. Why? AI is born into the world of institutionalized racism, homophobia, transphobia, patriarchy. All of those frameworks, as well as capitalism. And we all know that capitalism exists within those institutionalized frameworks as it is today. And the only reason why it exists is because it takes the constraints of those frameworks and exploits them and tells us that this is the only way that we all can benefit. And we know that that’s not true. We know that that’s not true. And it doesn’t have to be true for our AI either. There are more ethical ways that we can make AI, but if you don’t know how it’s made, you can’t be an activist for those alternative ways. Right? And it’s really hard. Like, there are definitely Joy and Timnit and some amazing other Black women out there, Black women in AI, who are fighting this fight. And I happen to work at one of the largest tech companies in the world trying to fight this fight internally. But it’s really important that folks on the other end of these model outcomes join us. Join us in really asking questions about your data. About where your data is going and who uses it, and the cost and value of it. 

Ellen McGirt I’m here with Kwesi Mitchell, Deloitte’s Chief Purpose and DEI officer, and good friend, and sponsor of today’s episode. Good to see you, Kwesi. Thank you for joining us. 

Kwesi Mitchell It’s good to be here, Ellen. 

Ellen McGirt And we’re spending this entire season thinking about what it means to redesign these systems. Is there any data to support the fact that DEI efforts are still important to the C-suite? Because I know there’s a lot of conversation out there about that. Ample conversation. 

Kwesi Mitchell In particular, when you look at recent surveys that we’ve done and sponsored as an organization, we continue to find that over 90% of organizations are continuing to commit to DEI because they believe it is good for their business. It is an ample request and desire from talent coming into the organization, and it ultimately leads to a diverse workforce that is truly innovative and nimble. 

Ellen McGirt Very excited about this. But what do you tell people who aren’t sure where to start? And you know, specifically when they’re not coming from a position of power within the organization? 

Kwesi Mitchell First, we’d like to start by dismantling the different aspects of what a position of power to drive equity within an organization really is, right? Fundamentally, we all have to lead from where we stand, and there are individual decisions that we’re making on a daily basis that drive equitable outcomes within our spheres of influence. Now, typically, we talk about having somewhat of a three-pronged approach, thinking, acting, and connecting, using seven distinct practices that range from observing what’s occurring all the way through looking in, addressing the challenge, accepting one’s individual role, and ultimately forming the connections to drive action. So those are the items where we really start to help change that notion that driving equity is something the C-suite does, in comparison to anyone else within the organization, wherever their realm of responsibilities exists. 

Ellen McGirt Kwesi, thank you so much for being here today. 

Kwesi Mitchell Thank you Ellen. It’s been a pleasure. 

Jessica Helfand I wanted to ask, actually, about a specific thing about the day job at Google, which is, if you feel at liberty to share with us and our audience: what is an example of an ethical issue? 

Ovetta Sampson Yeah. So, I do this all the time. Look, Google is really easy to pick on. It’s a huge organization. It’s a multinational corporation, and those are always, like, the hidden villain in every novel. Right? Like, I get it, but it could do a lot worse. I always talk about how, when Walmart decided that they were going to dim the lights in their shopping centers, they did a lot more for environmental climate change by making that decision, and by requiring their vendors to adhere to their climate change policy, than a lot of people standing on the street with their signs. Right. 

Jessica Helfand It’s a great example. Yeah. 

Ovetta Sampson So when Google decided, hey, you know what? Why is it that when you take a photo with our Pixel camera, Black people are, like, erased? Right? Now, I have talked about this for years, because when Kodak, which was the first camera company to go from black and white to color, when they, if you know anything about… 

Jessica Helfand Oh, it’s an incredible story and Polaroid. 

Ovetta Sampson Yeah. It’s an incredible story. If you know anything about photography, right? I used to work on a newspaper, those things that used to come to your door with yesterday’s news. Right. And I had to take the paper to print. Right. And so that means I had to go over all the photos and make sure all the RGBs were on point. So if you’ve ever taken a photo and you go, why is my grandma, who’s darker, not there in the photo, and maybe my cousin, who’s lighter, is definitely there? It’s all about color balance. Color balancing. Right. And so when Kodak went from black and white to color, they used a woman named Shirley to say, this is normal. And what does Shirley look like? 

Ellen McGirt She’s blond. Yeah. 

Ovetta Sampson Yeah. Well, actually she’s brunette, but she was a white woman. And so for years, 50, 60 years, we’ve been taking photos that make us invisible. Us. Right. The industry has been taking photos that make darker-skinned people invisible. And Google engineers were like, why is that? But Google product inclusion folks, activists, right? We were like, we know why. And this is an engineering problem that we should fix. And so they used AI. They got the Monk skin tone color balancing, which has 12 on the spectrum, and they looked at it as an engineering problem and said, how do we fix this? Now, when you take a photo with Pixel, it has all kinds of hues in it, and Black people don’t disappear. That’s one example of change in an industry. So when you have people within a company who say there are wrongs that need to be righted, and we now have the technology to do it with AI, and you have that combination with a willing company, who sees product inclusion as a way to make money but also to right wrongs, which is great, then that combination leads to things like the new Pixel skin tone capability within Google Androids. And that’s just one example of how Google is trying to address, to right, some of these wrongs with the use of AI. 

Ellen McGirt Well, let me ask you this, because there’s so much fear about AI, and I know that a lot of creatives, and the people who listen to this podcast, are afraid that AI is going to replace them as creatives. And we can talk about that. But it’s the things that have been rushed into the marketplace that are dangerous in ways that are not easy to see, like a photo. My favorite example is, of course, the algorithms used by municipalities in criminal justice that seek to predict who is going to offend or re-offend, based on incorrect data that we can’t see and don’t know to advocate against. So this is just where I need a hug, I guess. But I don’t know how to prepare people to advocate about things that we can’t always see or know are in the market. 

Jessica Helfand Right? Technological recidivism isn’t something that we read about in the paper every day, but we should. Or online every day, because the paper came yesterday, as you said. 

Ovetta Sampson So let’s break this down. All people create data. All data is created by people. Data is the love language of machines. For data to be made machine ready, we need people. 

Jessica Helfand Right? That’s a manifesto right there. 

Ellen McGirt Yeah, I want that on a t-shirt. 

Ovetta Sampson Yeah. So we collect data that’s traumatized by our social, institutionalized -isms. That data gets machine-ready. Right? By us. Then we create a model with that data. Right. And then we do something called, what I call, minimizing the human engagement risks. 

Ellen McGirt Okay. 

Ovetta Sampson So once we put that data into a machine learning model, we already know it’s risky. How do we know? Because it’s traumatized data. It’s data that’s been traumatized by our lived experience. Right. And so once we put it into a model, how do we mitigate the human engagement risks there? Well, we put people who don’t look like us in the room, so that they can identify human engagement risks that we may miss. We co-design with those folks. We send the model to those folks first, and they do red teaming for us, right? They point out the things that we do miss, right? We small-launch the model in diverse populations so that we can get diverse mental models and lived experience about how that model engages with us. Right. We incorporate that feedback. We develop principles and frameworks about not doing more harm and adding more trauma with the model. We do not unleash that model like you would a toddler, right, without teaching the model what’s good and bad. The model doesn’t know, right? So we have to make sure that the model has guardrails and constraints on what’s good and bad behavior. And then when we do launch the model, we invite those diverse populations to give us an effective evaluation of how that model is engaging with them. 

Ellen McGirt So the way forward is a full-on stakeholder approach for every large language model. And that can be a voter or a customer, whoever, wherever you are. But if it’s in your community, you get to have a say. If it’s in your hand, you get to have a say. 

Ovetta Sampson Absolutely. And if it uses your data, you most definitely get to have a say. If you are a content creator, you’re on YouTube, you’re on Adobe, you’re on any of these, Figma, all of these things, you definitely get a say in how your data is used, right? Like, you shouldn’t just sit back and hope that they’re using your data in the right way. You should have an expectation of advocating for your data privacy. 

Ellen McGirt So how far away from that world are we right now? And are you taking your vitamins? 

Ovetta Sampson Yeah, girl, it’s been a struggle. But I do feel like it’s activism. I do feel like it is a part of the social justice and equity movement that’s moving us into what I call a new society, right? Not a world-domination society, but a society where people are realizing the value and the goals of their data being used for services and products. And again, the first step to activism is awareness. And I feel like people are getting more aware of what companies are doing with their data, or at least that companies need their data. We’ve been getting it for free, right? OpenAI has been scraping everything, right? Just kind of like basically scraping the internet. And we put everything on the internet because it was like this amazing thing to share. But people are now waking up to say, hey, I’m giving away gold there. And so the capitalism is going to go both ways here, soon. Right? If it’s not already, right? TikTok makes money if I make money as a creator, right? YouTube is sharing that money, right? Like, it’s not like YouTube is the only one making that money. And so, now that more people are becoming aware of the dependencies of AI, I’m hoping that there’ll be more activists out there for their own personal relationship with it. And as for designers, especially human-centered designers, and I do make a distinction: human-centered design is not a framework. It’s not a methodology. It’s an IOS labor definition of a job. And part of that job is to get requirements and desirability from users. And the other part is to do no harm to them physically, psychologically, cognitively. 

Jessica Helfand This is where I get to tell you that when I was teaching years ago, the word user bothered me for just the reasons that we’re discussing. And I once went into the classroom and I drew on the board, in chalk, a woman’s arm and a syringe, and I said to the students, what is this? And after about ten minutes, someone said, user. And I said, exactly. To me, that’s the image. Like branding, where I always see a cattle prod. There are certain words that have become common parlance that really do us a disservice. And when you describe a user, I have no objection. But in many contexts the syntax, as you rightly say, really leads us astray. 

Ovetta Sampson Yeah, I use I’m sorry. I usually say…

Jessica Helfand No, no, no no, no, not at all. But you, you actually make me want to reconsider it. But I think that what you’re talking about has to do with something that that has a great deal, you use words like craft and sensitivity and context and syntax, and these are words that really a lot of engineers and a lot of data scientists don’t use. So I personally applaud your effort to reintroduce into the language, words that actually are meaningful and suggestive of something more than just crunching numbers. 

Ovetta Sampson But if you’re a human-centered designer, your responsibility is to make sure that those products do not do harm. And that includes AI. And we’re the only ones in the room, to your point, Jessica, who have that edict and mission and value as a job description. Right? No other discipline has that. We do. A human-centered designer has both. Right? And so that’s what I try to remind designers of when it comes to AI and design: our job is to stand between people and the harm that technology can do to them, rather than be afraid of not understanding that technology. And our job is to look for the unintended consequences and use design to counteract them. 

Jessica Helfand Ovetta, thank you so much for your time today. 

Ellen McGirt Thank you so much. I love and miss you. 

Ovetta Sampson Thank you ladies. It was so wonderful to meet you. I miss you too. 

Ellen McGirt Jessica, I just want to say that I feel better about the world of technology and AI, knowing that someone as interesting and creative and unusual as she is sits in these rooms, representing a wide swath of humans who ordinarily aren’t part of these conversations. I think that’s really important. 

Jessica Helfand And I would add to those adjectives: fearless and brave. Yeah. And you don’t have the sense that she’s in warrior mode and has her dukes up. You have the sense that she is just wise and thoughtful. And here’s a shout-out to anybody, you know, she’s not 21, but she’s not old, right? This is a shout-out to anybody who’s thinking about an enduring career as a human being. Yeah, because everything she’s done has collectively led to the next thing. And so she has this accumulated wisdom that gives her this calm perspective and this expressive, kind of comprehensive view of the lay of the land. And the lay of the land keeps changing. And she’s, I think, fearless and brave. 

Ellen McGirt She's a real beacon, in that she lets a lot of people know who do not look like they come from central casting in technology that they have a place at the table. And she just says the thing. And I will tell you, going forward, wherever I am, whatever issues I have to navigate to get to the next big breakthrough in my life, I'm going to remember her energy and I'm going to just say the thing. 

Jessica Helfand Say the thing. 

Ellen McGirt I'm just going to say the thing. She says the thing. I love her. 

Jessica Helfand You know what that music means. 

Ellen McGirt It’s Big Swing, Small Wins time. 

Jessica Helfand So what’s our Big Swing for this week, Ellen? 

Ellen McGirt We've got a mini episode coming in which we're going to do a lot of recapping and thinking and thanking everybody really properly, including our amazing sponsor, Deloitte. So I'm just going to take, with your kind permission, a moment to brag on ourselves a little bit. Our Big Swing coming up is going to be our second annual Design Observer 20, the list of the people, ideas, and initiatives who are redesigning the world for good. It publishes every October. We're up to our eyeballs in great ideas and wonderful initiatives, and I'm very excited about getting ready to publish it again. It was the biggest thing we did last year. It's sort of the anniversary of our renewed partnership too, Jessica. So I really love it for that as well. 

Jessica Helfand That's beautifully said. And October is a very special month for Design Observer, and I look forward to seeing what our editors come up with. For our Small Win, we're going to put the camera back on ourselves for a minute and say that after a very long period of just unbelievable work from our tiny but mighty team, we are about to launch our redesigned website for Design Observer. And what's great about this redesign, for anybody who cares to spend any time on this new and improved, and as Ellen likes to say, newly hydrated site, is that it's fast, it's searchable, and it's got everything in it from the last 20 years. It looks like Design Observer, but a brand new, spruced up, ready-for-back-to-school Design Observer, which corresponds to back to school time. It's what the French call la rentrée. So we're very excited. People who are new to the site can discover new things. People who have been longtime contributors and readers of the site will be able to find everything they want. And I'm very proud of and grateful to you, Ellen, for spearheading this enormous effort. I mean, the archive is just vast, as you can imagine, and we hope our listeners and readers will be pleased with what they find. 

Ellen McGirt What did the French say again? 

Jessica Helfand La rentrée. Which means the reentry. Because, of course, if you're French, you take the entire month of August and you go to the beach. 

Ellen McGirt Maybe next time I’ll be French. All right, beautiful people. Thank you for a great season. 

Ellen McGirt The Design of Business|The Business of Design is a podcast from Design Observer. 

Jessica Helfand Our show is written and produced by Alexis Haut. Our theme music is by Warner Meadows. Justin D. Wright of Seaplane Armada mixed and mastered this episode. Thanks to Adina Karp and Focus Forward Podcast Studio in Providence for production support. 

Ellen McGirt And thanks especially to all of you who have listened along with us all season long. We love knowing you're out there. We love getting your comments and we appreciate you very much. For more long form content about the people redesigning our world, please consider subscribing to our newsletters, Equity Observer and The Observatory, at Design Observer dot com. 

Ellen McGirt The Design of Business|The business of design is produced by Design Observer’s editorial team. The views and opinions expressed by podcast speakers and guests are solely their own and do not reflect the opinions of Deloitte or its personnel, nor does Deloitte advocate or endorse any individuals or entities featured on the podcast. 


By Ellen McGirt & Jessica Helfand

Ellen McGirt is an author, podcaster, speaker, community builder, and award-winning business journalist. She is the editor-in-chief of Design Observer, a media company that has maintained the same clear vision for more than two decades: to expand the definition of design in service of a better world. Ellen established the inclusive leadership beat at Fortune in 2016 with raceAhead, an award-winning newsletter on race, culture, and business. The Fortune, Time, Money, and Fast Company alumna has published over twenty magazine cover stories throughout her twenty-year career, exploring the people and ideas changing business for good. Ask her about fly fishing if you get the chance.

Jessica Helfand, a founding editor of Design Observer, is an award-winning graphic designer and writer and a former contributing editor and columnist for Print, Communications Arts and Eye magazines. A member of the Alliance Graphique Internationale and a recent laureate of the Art Director’s Hall of Fame, Helfand received her B.A. and her M.F.A. from Yale University where she has taught since 1994.
