From the Whispering Simultaneous Interpretation Study Group (ウィスパリング同時通訳研究会) community: A conversation with YouTube CEO Susan Wojcicki
Susan Wojcicki
CEO, YouTube
Moderator: Frederick Kempe
President & CEO, Atlantic Council
FREDERICK KEMPE: Hello and welcome across the world. I’m Fred Kempe, president and CEO of the Atlantic Council.
I’m delighted to welcome you to Atlantic Council Front Page, our premier ideas platform for global leaders, and to this session with Susan Wojcicki, the CEO of YouTube. You can also follow us online through hashtag #ACFrontPage. On this platform, we’ve hosted heads of state and government, leading lawmakers, chief executives, and innovators in the private sector. President Macron of France recently spoke to our global audience and tomorrow we’ll host Malala Yousafzai, the Pakistani human rights activist and Nobel Prize laureate.
But seldom have I looked forward to one of our Atlantic Council Front Page discussions as much as I do today. And that’s not just because Susan has been a leader in the tech industry for more than twenty years or that she runs the leading video-first platform in the world with two billion users. Pause on that for a moment: two billion users.
It’s not just because of her particular focus on the growth of YouTube’s creator economy. Over the past three years, YouTube has paid more than thirty billion dollars to creators, artists, and media companies, supporting 345,000 jobs. And in this year of COVID-19, we know how important that is to people. We’ll get to that.
We’ll also get to the burdens and responsibilities of running such a company in our world of disinformation, online harm, and growing digital dangers to democracy. The events of January 6 in Washington, DC have brought a reckoning in many respects.
There is also no topic more relevant at this moment than that of technology’s role in society. As more of the world comes online, especially during this year of coronavirus, we are grappling with complex debates over the role of private industry in governing the internet. What’s the meaning of human rights in digital spaces? How do all of these issues shape the world we want to live in?
And in that spirit, join me in welcoming the chief executive officer of YouTube, Susan Wojcicki. A humanities major, she took her first computer class in college as a senior. She sold spice ropes when she was eleven. So, Susan, there’s a reason I tell this story. My thirteen-year-old daughter was not impressed that we were hosting President Macron, but when I told her we were hosting you she was beside herself.
SUSAN WOJCICKI: (Laughs.)
FREDERICK KEMPE: And she is learning animation now because of the inspiration of YouTube.
It was also in Susan’s garage that Larry Page and Sergey Brin spawned what became Google. She was its first marketing manager, and as employee number sixteen, went on to help launch and grow AdSense, Google Analytics, Google Books, Google Images, and even the famous Google Doodle. In 2006, Susan convinced Google to acquire a fast-growing video-sharing service called YouTube. The rest, as they say, is history.
Since taking over as CEO in 2014, she has helped grow the company to its now estimated ninety-billion-dollar valuation, boasting, again, the two billion monthly global users and an estimated ad revenue of fifteen billion dollars in 2019. She is one of only two women leading a company listed in the top ten of most-visited websites. And she has used her position to address the challenges women face in the tech industry, leading to changes in her company and Google, while speaking out publicly for paid family leave at the national level.
So, wow, Susan, that’s really a lot, and that just scratches the surface. We’re all looking forward to getting to know you better.
Before we get started, let me just share a little bit of context about the Atlantic Council’s work in the tech-and-society space through our GeoTech Center, which focuses on harnessing technology for good; through our Cyber Statecraft Initiative, which today began its ninth annual Cyber [9/12] student contest across thirty-five countries and four continents; and in particular through our remarkable Digital Forensic Research Lab, which is hosting this event. So let me salute Graham Brookie, who leads it so capably; and also Rose Jackson, the director of policy initiatives at DFRL.
DFRL has been at the forefront of the effort to better understand, document, and confront online harms, build resilience, and develop a community of practitioners globally—we call them digital Sherlocks. In so doing, our team has extensively researched and documented [misinformation] and disinformation spread in many parts of the world. As they have demonstrated, confronting the very serious and real offline consequences of online harms requires policy, legal, product, and societal changes.
Finally, I’d be remiss if I didn’t acknowledge that as we raise all these questions about technology platforms, we are operating off of them today. You may be watching us on YouTube, commenting about it on Twitter, sharing reactions through Facebook. Plenty of evidence to suggest that social networks have become an essential part of business, government, and life in the world. Indeed, one of the biggest differences between our pandemic now—the worst in a century—and the one that took place a century ago is we had a virtual world to which we could retreat, a digital world that would help sustain economies and so many livelihoods. So setting and advancing the positive vision for tech in the world requires us to have the sort of direct and honest conversation we’ll have today and across industry, governments, and civil society about the rules and incentives that govern online spaces.
So I’m sorry for that long a windup, but I wanted to set the context for what we’ll be doing now. I’m so grateful for Susan’s willingness to join us today to discuss how she views these issues and the role of YouTube. We welcome the audience to ask questions via the Zoom Q&A if you’re on Zoom or by tweeting with #ACFrontPage.
So, Susan, after that introduction, let me set the stage—which I hope sets the stage for the first question—so, how big was the garage where Google was born? And why did you let these clowns—no, that’s not really the first question. I’m really—
SUSAN WOJCICKI: It was small. It was small.
FREDERICK KEMPE: (Laughs.)
SUSAN WOJCICKI: But thank you for having me here. I’m delighted to be here and looking forward to the conversation. But, yeah, their garage was small. They actually entered through the garage. They worked out of a small part of our house. The whole house was tiny.
FREDERICK KEMPE: So do tell us that story. Since you’re on it, tell us.
SUSAN WOJCICKI: (Laughs.)
FREDERICK KEMPE: I mean, did you have any idea what they were—what any of this could have led to and –
SUSAN WOJCICKI: No. No, I can’t say I thought, oh, they have this amazing idea, I’m going to rent them the garage because I want to have equity or anything. I just wanted the rent. (Laughter.) I wanted the rent to make sure I could cover the mortgage of our house that we had just bought. And I had known Sergey beforehand. And so then they moved in. They had one employee. They got up to seven employees in the house and it got pretty crowded at that point.
But I did realize when they were there—they would be there all night and I would come over and talk to them—I did realize what they were working on and how compelling it was. At the time, this wasn’t seen as a very interesting part of the tech market and it didn’t seem like it was really going anywhere, but I realized, like, wow, this is really making a difference and it can help me find information. And I grew up in an academic household, so I knew the importance of being able to find information, and I saw what a good job they were doing, and that’s, ultimately, what convinced me to join them.
FREDERICK KEMPE: So the first lesson of this is if you live in Menlo Park, rent out your garage or parts of your house because it could pay off big time.
So, a question that comes off of that: you said what they were working on wasn’t so recognized at that time, and yet you were convinced enough to join Google as the number-sixteen employee. But why were you so convinced they should acquire YouTube? What was unique about it as a platform, product, and approach that attracted you and, ultimately, convinced them?
SUSAN WOJCICKI: So we were working on a product, a Google video product, and so we had a product that was similar to YouTube. And so as a result of that, we actually saw how people were using it and what they were uploading, and I would say there were two really important insights that we gained from working on it.
And the first is that people want to share their story. So we didn’t know [if people were] going to upload their videos, and what we saw, right away, was millions of videos being uploaded; people [were driven by] just the desire to share their story. And, of course, lots of people wanted to become famous or to talk about what was important to them. But then there was this question of whether everyone wants to hear from everyday people about what’s meaningful to them or about their hobby, and what we saw is that they do.
And there was actually a video I remember in particular that really cemented that for me, which was of these two students in their dorm room. Their roommate is in the background doing homework, and they sing this Backstreet Boys song, and it’s so funny. And that was really the first big hit that we had, and I just realized, wow, you can have hits here in user-generated content. People want to watch it and people want to see [content] from other people like them, and I realized what a powerful medium it could be for entertainment but also for information.
FREDERICK KEMPE: That’s really interesting, and it’s so interesting to talk to somebody that was there so much at the beginning of all of this, which gets to my next question, which is, I think we all saw promise in the internet. We all saw promise in things like Google and YouTube and Twitter.
But did you ever think it would become such an essential part of life, business, [governance] even, all around the world? So I think that’s question one. And then question two: now that it is that, how does that change the way you look at your responsibilities compared to, say, when you started as CEO in 2014?
SUSAN WOJCICKI: Well, I didn’t really see that when we first started. I mean, when I first joined, just to put it in perspective, Google had sixteen people, right, and I was number sixteen. And I did see the promise right away of information and the way people were writing to us and discovering new places to go, new doctors, new treatments, new information, new songs, [and] new music. I saw that up front.
But I couldn’t have anticipated how it was going to grow into the platform that it is today. And, again, it’s been over twenty years. I think most people—it’s hard to see twenty years into the future and predict what that’s going to be. So, I mean, it’s hard for us now, right, to think [about] what’s technology going to be in 2040.
But now that we are where we are and I see the role we play, I’ve been very focused on the responsibility [aspect]. Because I’ve been at Google for over twenty years, [and] because I’ve now been [YouTube’s] CEO for over seven years, I feel the responsibility—because I know how our systems work, I know how to build them, I know how to change them—to take everything I know and make sure we apply it to these hard questions that we’re dealing with as a society, and make sure that we are a responsible platform. And it is some of the hardest work that I’ve ever done.
But I also know that with the combination of consulting with experts, talking with policymakers, and being able to translate that into the right policies and products for YouTube and for Google, I know that we can do that and we’re on a journey to do that. And we’ve made tremendous progress.
And so, really, the responsibility work for me really started around 2016. I mean, of course, we had always talked about doing the right thing for users. That’s been there since day one of Google. Do the right thing for users. But in 2016, we really started talking about responsibility and the role that we played, and I can answer more questions about that because that’s a complex area. What is the right thing to do? What is responsibility and what does that mean for a platform that is global and deals with so many hard issues?
FREDERICK KEMPE: Let’s stay there for a second because—and talk about one specific category of harmful content and that’s [misinformation] and disinformation, linked with everything from people refusing to take the COVID-19 vaccine, to ethnic violence in Myanmar, and to the attack on the Capitol.
YouTube announced… site-wide policies to moderate against extremist content in 2017. Why did you make the decision then, what have you learned through that period of time, and where do you go now?
SUSAN WOJCICKI: So you referenced 2017, and the first category that we worked really hard on was violent extremism. There were a number of events that happened in 2016 and 2017, whether it was the attack in Nice or the London Bridge attack, a number of attacks where we spent a lot of time self-reflecting [about] what role our platform plays and how we can make sure that we are being very careful with regard to violent extremism.
And there was actually, like, a particular event—I believe it was the London Bridge one—where there were some accusations about the role that YouTube had played. And I went home that night and looked at the content that we were being accused of hosting, the content said to lead to violent extremism, and I talked to our reviewers, and there was nothing in it that technically violated our policy.
So there were videos from various—I won’t say who, but various individuals that technically met the bar. But when I started researching, you could see from people who were experts in the field that they felt that this led to violent extremism. And, really, what I discovered was [that there were] a lot of dog whistles that were going on that a normal person wouldn’t hear but yet were potentially problematic.
And at that point, we went and we hired a large number of violent-extremism experts and worked on a plan… We made changes in terms of our policies and approach, and we made significant progress in removing that content. And, again, these were not groups that were on a foreign terrorist organization or proscribed list, right, because if they are, then that’s content that we understand has been clearly marked by a government as not being appropriate to have on the platform.
But it was other content, other messages, and we had to get really, really detailed. And so that was the first [hurdle] that we really tackled and made tremendous progress, and then we realized that we needed to take that same approach and apply it to many, many other areas, whether it was hate, child safety, dangerous tricks, many different areas, and we have done that and I’m really proud of the work that we have done since that time, and ongoing. [There’s] still more work to do.
FREDERICK KEMPE: There are policy decisions and there are design decisions. How do you weigh each of these? Because sometimes it looks like the design may be responsible for a person going from one thing to another and being led inadvertently to content that the person wasn’t necessarily starting with. And then there are also the policy decisions where drawing the line between censorship and dangerous speech and inciting of violence is sometimes a difficult one to draw.
SUSAN WOJCICKI: Well, we need to work on all of them. You need a comprehensive solution spanning your policies, your product, and how your systems work. And so we’ve actually come up with what we call the four R’s of responsibility, and pretty much everything we do is captured in one of these four R’s.
So the first one is “remove.” And that’s generally where we update our policies. That’s a very high-leverage move, to make a new policy, because then [the content] becomes… no longer allowed on our platform. So last quarter, for example, we removed approximately nine million videos. Those were all violations of our policy. We do so very quickly. Ninety percent of those removals are automated. And the vast majority are done within a few views. So that’s “remove.”
I’ll use COVID-19 as an example because you talked about it. We passed ten different policies associated [with] COVID-19, and we removed that content very quickly and were able to make sure that that content was not being viewed as a result.
But then there’s also “raise.” So we wanted to make sure that the authoritative information that came from different health authorities on COVID-19, that we could raise it up, whether that was on searches [or] whether that was on the watch page: People who did any kind of video watching around COVID-19, we [raised] information that came from authoritative sources.
And, by the way, we served over four hundred billion impressions of COVID-19 information that came from authoritative sources, which is probably one of the largest campaigns we’ve ever run.
Then there’s “reduce,” which is content that technically meets the letter of the policy but doesn’t really meet the spirit or is very low-quality content, like aliens are in my backyard or aliens caused COVID-19, right; that’s not content we’re going to promote. So that’s under the “reduce,” meaning it’s not content we’re going to recommend.
And then, lastly, there’s “reward”: how we handle monetization and work with our advertisers. Because once you enable monetization of content, that creates an environment where more and more of that content is created. So saying that you’re not going to monetize it reduces any incentive to create that content. It’s also something advertisers wouldn’t ever want to be on.
So it’s really a combination of our policies, our recommendation systems, our search, how that works, and then, of course, our monetization policies.
and look forward to continuing the conversation.
https://ameblo.jp/shinobinoshu/entry-12662652453.html
