Megan Smith | The 2019 MAKERS Conference
Megan Smith, CEO of shift7, talks about how we can and should all get engaged with AI, which brings both threats and great promise, from The 2019 #MAKERSConference at Monarch Beach Resort.
- Please welcome Megan Smith. [MUSIC PLAYING]

MEGAN SMITH: So I wanted to bring Joy Buolamwini to you guys. Today, she's at Emory giving the provost lecture. She's an extraordinary scientist, computer scientist, and engineer. And when she was a student at MIT, she just noticed face recognition wasn't seeing her. And so she put on a white mask. And it saw her. So this is it. You've taken photographs of people. And if there's a person of color, you have to adjust the light, turn on the flash. This is racist, right?

So we have to know that technology can be good or bad. It's not inherently good or bad. It's whatever we do with it. And, in fact, we were over at the Norman Lear Center. And they had this little piece. And it was, media is good or bad, right? We're humans. And we do what humans do. We do good and bad things.

So what I'm hoping is that in this session, we're going to talk about AI, data science, machine learning, and all the emergence that's happening. People say this is like the beginning of fire, right? These are important technologies. And my hope is that you feel welcomed into this world. This is our world. Everybody owns what's going to happen. It's not about a small group of people who'd like to use it for their interests. It's for everybody.

So in these images, I wanted to bring Joy forward. And I also want to share-- if you bring up the slides-- that Joy is actually on the cover of Time this week. Ava DuVernay is the guest editor. And it's about optimistic art. Joy is one of 34 featured folks. And it launched this morning. So I hope you'll read that article.

And the point of Joy's work is that we can solve this if we foresee it, right? So, weapons of math destruction: if you haven't seen Cathy O'Neil's TED talk, you have to see it. Because you can learn so much. And it helps you not be afraid of code but be invigorated to figure this out. Because we don't want loss of liberty.
We don't like loss of freedom, economic opportunity, all of those things. This is our Congress. So we can use the data to see that Democrats and Republicans used to vote together, from the '50s onward. And now they're completely divided, once we have cable news and this media landscape that divides us, right?

This is some work that I've shown you guys before, which is, who gets to speak in films? Children's television down here; blue is men's lines, red is women's lines. So, men's lines to women's lines across 2,000 films. AI learns on data. So AI may just learn that women are going to speak less in life. No. We can use AI, machine learning, face recognition to actually audit media. This is some work with USC, from the Viterbi School, that we do with Geena Davis and others. So we can use AI for what we want to solve, for equality and justice.

AI is, of course, huge business for these things. But it's also for all of these things. AI, machine learning could be for poverty reduction, for equality, for justice. Why not? And if we knew our history, we'd know that Ida, and Ellen Swallow Richards, and all these women of their time have done that.

This is a picture that shows where the money's going to flow. This is from PricewaterhouseCoopers and some of the Accenture data. But notice that Africa's not even on this map. [INAUDIBLE], right? So this is Google's data, some engineers hooking our search traffic up to a map, just like the Earth from space, right? When we see the light. So wherever there's people, they're searching. Except, look at that. Right? And this is a couple years ago. But still, could we use some of the development money to get broadband to a billion talented people so they could join us in the conversation? Thank you. Let's go. Right?

So how do we use collective genius, all of us, right? That's what we're trying to do. And so I just want to riff on a couple examples. Grace is in 10th grade. She's a computer scientist.
And she's teaching the police chief of New Orleans how to code. Because justice and tech go together.

This is a high school in Houston that suffered, of course, the hurricanes. It's an environmental justice high school that used to have, as a Title I school, a 50% graduation rate. Half the kids left. Now, it's above 90%. Because they work on environmental justice, on real things.

This is an incredible film from Sundance last year, "Inventing Tomorrow." All of these young people are like the Parkland kids. They're STEM kids. And they invent things, like [INAUDIBLE], who saw the river looking like this. This crap blew into her school bus. And she didn't say, oh, no. She said, let's start measuring. And let's start fixing it right now, right? So that's what can happen.

I talked to you about Hōkūleʻa, the young people working with the mentors who can navigate Polynesian-style, right, with no instruments. They went around the world with no instruments. These are just some of the young people doing that. Think about, in the age of AI, that human intelligence is so great.

You know, we went to the United Nations. My colleague Susan Osner is here, who helped with doing the Solutions Summit. We put out a call from the UN: who, already, is solving the goals? And all these people emerged. So how do we get behind these solution-makers on all these topics, whether it's the Native Americans from Standing Rock and Pine Ridge who are already doing solar, and energy, and agriculture things; or whether it's Greta, who is coming forward and really demanding that we pay attention: if our house was on fire, wouldn't we run outside and start doing something? That's what's happening with the planet.

Here's a data point. We're putting the equivalent of three Hiroshima bombs of heat every second into our oceans. Panic! She says, please panic! I hope you will panic! Let's work on that!

On this stage, we had Olivia and Komura. Remember Komura?
She was listening to the room for domestic violence with her app, so we could see if we should call the police. Alexa and Siri aren't doing that. But that's what we could use them for. We could figure out the privacy so it was OK. We could work on that if Komura's on the team.
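The kind of audit Smith describes, checking whether a system works equally well across groups, as in Joy Buolamwini's face-recognition finding, can be sketched in a few lines. This is a minimal illustration, not any real benchmark: the group labels, the records, and the numbers are all invented, and a real audit would run a trained model against a labeled dataset.

```python
# Minimal sketch of a demographic accuracy audit: compare how often a
# system's predictions match ground truth, broken down by group.
# All data below is made up for illustration.
from collections import defaultdict

def accuracy_by_group(records):
    """records: list of (group, predicted, actual) tuples."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted == actual:
            hits[group] += 1
    return {group: hits[group] / totals[group] for group in totals}

# Hypothetical detector outputs: it finds some faces and misses others.
records = [
    ("lighter", "face", "face"), ("lighter", "face", "face"),
    ("lighter", "face", "face"), ("lighter", "none", "face"),
    ("darker", "face", "face"), ("darker", "none", "face"),
    ("darker", "none", "face"), ("darker", "none", "face"),
]
print(accuracy_by_group(records))  # → {'lighter': 0.75, 'darker': 0.25}
```

The same per-group breakdown works for the media audit mentioned above: swap accuracy for a count of dialogue lines per speaker gender and the disparity becomes just as visible.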