ウィスパリング同時通訳研究会 community: A Conversation with Mark Zuckerberg


Jamie Miller: My name is Jamie Miller and I'm a vice president for public programs at the Aspen Institute. And it's my great pleasure to welcome to our stage Facebook founder, chairman and CEO, Mark Zuckerberg, who will be interviewed by Cass Sunstein, professor from Harvard University.
Speaker 2:  [inaudible].  
Jamie Miller: It's great to have Professor Sunstein back at the Ideas Festival, and we are so pleased and excited to welcome Mr. Zuckerberg to the Ideas Festival for the very first time. Thank you both.
Cass Sunstein: So it's challenging to follow an Emmy Award-winning artist. I do have a song that I'd like to sing that I wrote. Okay, well, we'll wait on the song. My interest in these topics, as Mark knows, came from a book that I wrote called #Republic, which is actually very concerned about democracy and social media. And I found myself on a train getting a call from one of Mark's employees saying, you know, we're also concerned about social media and democracy; might you be willing to talk to us? I found that a great honor, and I've had two occasions to work on specific projects as a consultant with Facebook, and the opportunity to explore. I think our topic is the future of the world. Is that what we agreed to? Let's see what we can fit in 40 minutes. So, lightning round. The first issue, which I think is very salient in many nations, is that governments have been thinking about regulating social media. The United States has a firm tradition of freedom of speech, and it's not exactly usual (I did empirical research, and I can confirm it's not usual) to find the heads of companies calling for regulation of their own companies. But you have done exactly that: called for government regulation. Can you say a little bit about what inspired that, and what areas you'd like to see regulation in?
Mark Zuckerberg: Sure. So I've spent most of the last few years focused on trying to address some of the biggest social issues facing the internet, and our company in particular is at the center of them. The four big ones that we've focused on are election security and preventing election interference; free expression and harmful content; privacy and making sure that we get that right; and one that's gotten a little bit less attention, but I think is as important, which is portability and interoperability, being able to move between services to increase innovation and competition and enable research. Through working through these issues, we've made a lot of progress on each of them. We've built up our systems internally on elections, which I'm sure we'll get into in a bit. There have been many elections around the world over the last couple of years, and the results have been a lot cleaner online due to a lot of the work that we and others have done in partnership.
But one of the things that I've come to after a few years of really spending most of my time working on these issues is that in order to solve them, you get down to some fundamental trade-offs and values that I don't think people would want private companies to be making judgments on by themselves. Questions like: how do you balance the line between free expression on the one hand, and safety from harmful content on the other, and privacy about what people are allowed to say, and human dignity and decency? Those are really hard questions to answer. As a company, we try to do the best that we can. But I think that if, as a society, we were rewriting the rules of the internet from scratch today, it is not at all clear to me that we would want private companies to make so many of these decisions by themselves, in a lot of these areas, for example, what constitutes political speech and what should be acceptable advertising around an election.
I really don't think that as a society we want private companies to be the final word on making these decisions. So where I've come out is: look, in the absence of regulation on some of these things, we're going to do the best that we can and build up very sophisticated systems to be able to handle these issues. But at the end of the day, I don't think that is necessarily the ideal state that we all want to be in. I think we would be better off if we had a more robust democratic process setting the rules on how we want to arbitrate and draw some of the trade-offs between a lot of these values that we hold dear.
Cass Sunstein: Okay. Let's talk about the integrity of the electoral process and foreign interference with elections. A lot of people in the United States are of course concerned about that. Can you say a little bit about what you've done specifically since 2016 to reduce the risk, and what you'd like to see, say, Senator McConnell and Senator Schumer agree on in the next few years on the regulatory front?
Mark Zuckerberg: Yeah, so I think getting election integrity right is probably the highest priority of these issues. There's no single silver bullet, but there are a number of different strategies that we've taken as a company to prevent state actors, like what we've seen Russia do and try to do in the 2016 elections, from being able to do that again in elections around the world, including the 2018 midterms and the upcoming 2020 election. The things that have made the biggest difference: one is building up really sophisticated technical AI systems and hiring a whole lot of people. We have 30,000 people at Facebook who work on content and safety review, to be able to find these networks of bad actors and take them off the systems before they have the opportunity to spread propaganda or misinformation or whatever they're spreading.
We've gotten much more sophisticated at that. It's an arms race. Russia and other folks have also gotten more sophisticated in their tactics, and every election we see new tactics, but through a big investment in this we're able to stay ahead and keep the progress going. We've also upgraded the policies. Now, anyone who wants to run political ads, or issue ads, or run a page that gets a lot of distribution, needs to verify their identity with us with a valid government ID. We've rolled this out in the US, and we've rolled it out across the world. It's quite a large operation to be able to do that, because we have 7 million advertisers overall, not all of whom are trying to run political or issue ads, but that's a big deal.
That prevents people from other countries from being able to advertise in elections where the law might prevent them from doing so. We've also instituted a bunch of transparency requirements: now, anyone who runs political ads, those ads go into an archive that's going to be visible for seven years. That way, anyone who is a journalist or an academic is going to be able to study what every political advertiser did: who they targeted, how much they paid, what they said to different audiences. And that's really important, I think, for keeping people honest, to make it so that not only bad actors, but common actors in the political system, can't say different things to different people without getting called out on it.
And there are a number of other things, including working in partnership with intelligence agencies and election commissions around the world. Again, this isn't an America-only issue. We just had the big elections in the EU, for example, and there was a big election in India. I went and testified in the EU, a similar thing to what I did here in Congress in the US, and the EU Parliament president came out after the EU elections and said that Facebook, basically, was able to deliver on what we said we were going to leading up to the elections, and that he thought it was a relatively clean election because of that. So there are reasons for optimism, but we can't rest on our laurels, because this is certainly an area where the adversaries are sophisticated and have a lot of resources, and we'll just keep on trying to get better and better.
So, from a regulatory perspective, what would I want to see? One is the Honest Ads Act, which I think is a good floor for what should be passed. We're actually already doing all of the things that are in it: a lot of it is verifying political advertisers and transparency around who's advertising. But I wouldn't want those policies to be enforced only on Facebook; I think you want them enforced across the whole internet. So having a bill like that passed as the floor would, I think, be positive. There are other types of laws around the world that I think would be positive as well. For example, and this is not an American example, we had an issue in Ireland in the last year, where there was a referendum on abortion.
Leading up to that referendum, a bunch of pro-life American groups advertised in Ireland to try to influence public opinion there. We went to the Irish and asked folks there: how do you want us to handle this? You have no laws on the books that are relevant for whether we should be allowing this kind of speech in your election, and this really doesn't feel like the kind of thing a private company should be making a decision on. And their response at the time was: we don't currently have a law, so you need to make whatever decision you want to make. We ended up not allowing the ads. But at the end of the day, that feels like the kind of thing, in different democracies around the world, that you'd really want the local countries to be deciding for themselves: what kind of discourse they want and what kind of advertising they want in their elections, not a private company.
So that's kind of a flavor of this. But overall, the laws around election advertising are very out of date. A lot of laws around elections today basically define political ads as ads around a candidate or an election specifically. That's not really what we saw Russia do primarily in 2016; they tried to get people agitated around different issues. So if you want to prevent that, it's actually more important to broaden out the laws to focus on things that are more issue-oriented, not just around elections and candidates specifically. And also, the nature of elections now is that they're kind of permanent campaigns, right? A lot of countries have laws that limit what candidates or other folks can do in the time period before an election, and I just don't think that reflects the modern threats that we see around the world. I think you want something that's more ongoing.

Cass Sunstein: Okay. So my suggestion is that it's worth considering, either for you or for regulators, something that would be a zero-tolerance policy. Transparency is a good first step, but not adequate if people have to do a little work to get what's transparent, or if they've done that work and they've still seen the ad, which suggests that some parts of America are evil and other parts of America are worse than evil, or that some political candidates systematically commit crimes when they actually don't, and this is all coming from someone who's transparent. Wouldn't it be better to have a zero-tolerance policy that specified the categories with particularity, beyond, call it, transparency?
Mark Zuckerberg: Well, I think the challenge here is that a lot of the messages that are being sent are speech that would be acceptable if it were Americans doing it, participating in American elections. A lot of what we saw, for example, was simultaneous running of campaigns on both sides of an issue. Take immigration: there would be a campaign to argue for immigration reform and a campaign to argue against immigration reform, and the campaign for immigration reform might look a lot like what someone in the US advocating for immigration reform would do. But if it's done through a network of fake accounts created outside the country, then that's not allowed in our political discourse, so we then have the means to go and take that down. But a lot of the time it's not actually the content that's harmful, which is why it's so important to act upstream of the content. Sorry, it's not that the content isn't harmful; it's that the content itself wouldn't be prohibited. It's the actors and the way they're engaging that is prohibited, which is why the partnerships with the intelligence community and election commissions are so important. That's how we get the signals to say: hey, there's something going on over here that we should look into, this is a network of fake accounts, let's go take that down before it gets closer to an election or a sensitive time where something might happen.
Cass Sunstein: I'm worried that America's enemies aren't quite worried enough yet. So the idea that we'd have transparency and take down things that were lying about their sources, that's very important. Would it be good for you to have, as part of your standards, a specified list of things, which wouldn't be limited to, you know, ads during campaigns, but would cover, as you say, ongoing campaign efforts to sow division? If it's being done by an American as an effort, let's say, to be pretty provocative about what the other side thinks, that's fine. But if it's an effort by someone who's not particularly friendly to us to pull us apart from each other, like in a Twilight Zone episode, maybe that would be part of the prohibited community standards.
Mark Zuckerberg: So, yeah, the tools are constantly evolving to find the bad actors and take them down, and that's an ongoing partnership with the security community. But I think you're pointing to an interesting point, which is that as a private company, we don't have the tools to make the Russian government stop. We can defend as best as we can, but our government is the one that has the tools to apply pressure to Russia, not us. So one of the mistakes that I worry about is that after 2016, when the government didn't take any kind of counteraction, the signal that was sent to the world was: okay, we're open for business. Countries can try to do this stuff, and our companies will try their best to limit it.

But fundamentally, there isn't going to be a major recourse from the American government. So since then, we've seen increased activity from Iran and other countries, and we are very engaged in ramping up the defenses. The amount that we spend on safety and security now as a company is billions of dollars a year; it is greater than the whole revenue of our company when we went public earlier this decade. So we've ramped up massively on the security side, but there's very little that we can do on our own to change the incentives for nation-states to act. That's something that is a little bit above our pay grade.
Cass Sunstein: Okay. Let me ask a question that is connected with the election interference one, but involves domestic as well as foreign actors, and that is deep fakes. You can think of deep fakes as, literally, the increasingly available technology by which anyone of whom a photograph has been taken can be portrayed, you know, singing a song, or saying that communism is great, or endorsing a political position that they abhor. Or it could be an altered video that portrays someone as drunk or crazed or saying something they abhor. So my question for you, and this is obviously connected with Speaker of the House Nancy Pelosi's altered video: why shouldn't the policy, as of, say, tomorrow, be that if reasonable observers could not know that it's fake, it will be taken down, and that disclosure isn't enough?
Speaker 2: [inaudible]
Cass Sunstein: That's the first time I've ever gotten applause.
