
Duration: 00:56:35

David-Kaye-Edited-Podcast-cj-14h.mp3


Transcript:

0:01

Alright, good afternoon, everyone. So it is a great pleasure to have David Kaye back at the law school and back at UCLA. David's book,

0:11

which is pretty fresh, is for sale right there. I believe he will sign copies, right? Yeah. If you buy one. And what we're going to do today is talk about this book. Let me just say a word about David first. He's going to give a few remarks, we'll sit down, have a brief conversation, then open it up to questions. For those of you who are not familiar with David, he's both, well, a former member of the faculty here, currently at UC Irvine, but most importantly for the purposes of this event, the UN Special Rapporteur for freedom of expression. I'm probably butchering the exact title. But that's very special, especially very, extremely special rapporteur. And these special rapporteur positions are really significant positions, in that although they are, I think it's fair to say, not well funded, they give the holders a great platform to really dig in on a really important issue. And David had the good fortune, I think, to have one of the most interesting and important and current issues of all, but also one of the hardest. And so the book really grapples with that, and I'm sure you'll continue to grapple with that. So with that introduction, I'm going to turn things over to David. He will give some remarks, as I said, and then we will take it from there. Thank you.

1:26

Thank you to Kal, thanks to Alexandra. Alexandra is there, so thank you, Alexandra. And thank you all for being here. So I really want to keep my remarks to maybe 10-12 minutes or so, just sort of as an introduction. Obviously, when we talk about all of the different issues around online speech, we could spend the whole semester here talking about them. And in fact, if you go to UCI Law, I teach a whole class on it. So I won't say too much to spoil that, in case you're thinking of applying to law school. Okay, so I want to start with a recent case. And as anybody who has ever given a talk or written a paper knows, that case that comes down, like, the week before you're going to give a talk can either totally screw up your entire agenda, right, or it can reinforce everything you were going to say. So the case, luckily, that I'm going to talk about reinforces a lot of what's already in the book. Okay, so this is a case that involves a politician, a pretty prominent public figure in Austria, and I had to write her name down because I'm sure I'm going to screw up how you pronounce it anyway.

2:44

To correct

2:46

right. Okay, so Glawischnig-Piesczek, that's the last name, am I [correct]? Sounds good enough. Well, we'll go with that. Okay, so Eva Glawischnig-Piesczek is a Green Party leader in Austria. That's what you should know about her: Green Party, Austria. A couple of years ago, actually 2016, a Facebook user posted a story about the Green Party, of which at the time she was the leader, and she was also a member of the national legislature in Austria. The article was basically suggesting that the Green Party supported a living wage for refugees, which was actually true, but this Facebook user saw this as a really problematic policy, and this user called Eva Glawischnig-Piesczek a traitor. And "oaf," O-A-F, I'm not sure what "oaf" is in German, but called her an oaf, and a member of a fascist party. Okay. I mean, this is the internet, right? That seems pretty tame. And yet, although maybe oaf in German has some deeper meaning, actually, I'm not even sure really what the meaning is in English. So what does she do? She goes to Facebook and says, take this down, right? This user has harassed me, and according to your own rules, Facebook, this is content that should be taken down. And we'll get back to that a little bit later when I wrap this up, but basically, she's calling on Facebook to enforce its own rules, which are pretty extensive around hate speech, around harassment, around all sorts of other issues that are related to misogyny and racism and so forth. And Facebook says, no, we're not going to take this down. Right? No problem. I mean, this isn't really harassing. And I have to footnote this by saying that Facebook wasn't very articulate in saying why they didn't take the content down, which is normal, right?
If anybody has ever complained about Facebook content, you know that you might complain and you'll get very little in response; you might not know the reason why the content was either taken down or left up. Okay, so I'm just going to call her EGP. That's going to save me a lot of hardship here. So EGP brings a suit. She brings a suit in Austrian court. And she says, under Austrian law, this Facebook user, who is identifiable, by the way, defamed me. And the first Austrian court says, yeah, this is defamation. We will put aside questions of Austrian law, I think, for the hour, but basically they say, that's true under our law: this kind of attack, even though it's on a public figure, constitutes a form of defamation. It goes up to the next highest court, an appellate court in Austria. And that court says, we also have concerns about this content. We think that it might be inconsistent with Austrian law. However, there's a broader European issue at stake here, having to do with the nature of regulation of what's called e-commerce in the European Union. So what the highest court in Austria does is basically refer it to the European Court of Justice. This is the top court in the European Union. Okay, so now we're at the European Court of Justice, which is based in Luxembourg, and the European Court of Justice acts with the recommendation of the Advocate General, who is a little bit like an attorney general in our context, although one that basically stays in one place and doesn't go around the world seeking indictments. I'm sorry, I just

7:03

had to say that.

7:03

So the Advocate General says, in essence, you know, the internet is different than other forms of speech, and there are certain kinds of rules that have to apply. And we encourage, and he's using the royal we, we encourage the European Court of Justice to side with EGP. Okay. And so the European Court of Justice, exactly one week ago today, that's why I was bringing this all to this case that actually reinforces the themes of this book, the European Court of Justice says, first, Facebook has to take this content down. Facebook has to observe local law, has to observe the law of Austria. In fact, that's not all that surprising; that isn't super controversial, right? Even Facebook's own rules say, we will have our own rules, but also we will observe local law wherever we have a market, wherever we have users. Okay, so let's say that part wasn't super controversial. But there were two other parts of this decision that are really quite interesting. And I mean, I've only just started to grapple with this, so it's really a question, or a couple of these issues are, for all of us to think about. So one thing that the court says is that Facebook has to block not only this content, right, so the specific user who posted this particular content, but if someone were to copy that content, like share it, that has to be blocked, and any similar content. So maybe not the same exact words, but let's say you took this content and played around with it, and you posted something where you didn't say "oaf" but whatever the other word for oaf might be in German, you used that, that would have to be blocked. And by blocking, they don't mean simply that once it's up, it has to come down. They're basically talking about the need for Facebook to have what are called upload filters, so that it would be impossible even for the user to upload that article with that kind of content along with it.
I mean, think of it in American terms, American legal terms, as a prior restraint, right? They're saying, you can't even say this, if we equate saying with uploading. Okay, so that's one issue. And for any of you technologists in the room, or anyone who thinks a little bit about artificial intelligence and automation and the possibility of recognizing language, right, because it's not just the exact content, it's also content that sounds like it, you know that we're heading into a very difficult situation. Certainly a difficult situation for Facebook, but also for any online entity that's trying to grapple with these kinds of questions: what do we take down? How do we know if something is defamatory or hate or something like that? Okay, so that's one issue. The second issue, and there's very little discussion about this in the case: the European Court of Justice says, not only does Facebook have to take this down for Austrian users, or we could say German-speaking users, but Facebook has to take this content down worldwide. So they have essentially said, right, that Austrian courts have the authority to order Facebook to take down content anywhere in the world, regardless of the jurisdiction. Okay? Very little discussion in the case about it. But this is a big deal. These two issues are actually potentially monumental for the way the internet, at least content online, will be regulated going forward. But this last point, the idea that one court in one country can order the takedown or the restriction of content anywhere in the world, is a step that's pretty far beyond what most courts have said before this, even European courts. It's all been heading in this direction, right? But this is a particularly big deal. Okay, so this case has everything. And I'll wrap up here, so we can talk about a whole bunch of different issues that I think are really current issues around speech regulation online.
But I had four questions, to be a bit didactic about it. One is: who should decide? Who should be deciding these questions of what is legitimate content online? Right. This case gives us the question in a very clear way, right, because the first place that EGP went was to Facebook, assuming that Facebook was the decision maker in this context. Facebook has rules; they should be enforced by Facebook. But there's also the question of the Austrian courts and the European courts. Who should be deciding these kinds of questions? Who should be making the rules as to what's legitimate content online? Should it be private actors, who have so much authority over the public square in many respects around the world? Or should it be governments, who have traditionally had the responsibility and the authority for deciding these kinds of questions? Right. So that's one question: who should decide? A second question is, regardless of who decides, what should the rules be? Right? Should those rules be human rights rules? These are global companies, right? So we probably can't expect them to have a different set of rules for every single jurisdiction where they operate. And we can talk about that; maybe that is the direction that they should go. It's not very feasible at the moment, but that's one possibility. The other possibility is global companies, global norms. That's the argument that I make in the book: that the companies should follow global norms, because of their massive impact on human rights around the world. And by human rights, I don't mean just freedom of expression, right? They also have a massive impact on discrimination, on incitement to violence. We were talking about Myanmar a moment ago. What should the rules be around incitement to genocide? Should those be rules that are based on international law, in particular the Genocide Convention, or should they be something else? Okay.

13:27

Then there's the question of what should be the remedy. Right. So in this particular case, the remedy was a takedown, essentially saying, okay, you have this kind of content that violates national law, and the European court says that's okay, the national court can do this. One possibility is the remedy is just: take it down. But of course, there's a lot of harm that may be caused by different forms of speech. I mean, we in the United States don't usually talk about speech harms, but of course there are torts around defamation and libel; there are certain ways in which speech can cause harm, reputational or otherwise. And in the context of international human rights law, actually, all states that are parties to the International Covenant on Civil and Political Rights, which is about 170 states, although many of them, the United States among them, reserved to this particular provision, are obligated to prohibit, essentially, hate speech: advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence. So there are all sorts of speech harms that are already recognized in human rights law, but what should be the remedies for violations of those rules? And then the final issue that this raises is: should there be special status for public figures? Right? Should the rules apply in general to all users, but we're going to carve out a public figure like EGP so that, you know, she can make a different kind of claim? I mean, here, in fact, and I say this in a very preliminary way, I think that this ruling, since it does not deal with her status as a public figure, is inconsistent with European Court of Human Rights jurisprudence, which actually says that journalists and public figures, particularly politicians, should, I mean, essentially, they should have thicker skin.
Right, they should be able to kind of take it when it comes to the kind of criticism that we're talking about here. But this case doesn't even deal with her status as a public figure. And what's interesting at this particular moment, if you've seen the news over the last couple of weeks: well, Twitter did this maybe four to six months ago. They basically said, you know what, our rules are going to apply regardless of your status as a public figure, right? So if you're a politician and you're using hate speech, we're going to treat you as we would any other user who might use hate speech. In other words, they call it hateful conduct there: if you violate that policy, it doesn't matter whether you're a public figure or a regular user, a white supremacist, whatever you might be, this rule will apply to you. Right? Facebook, just a week or two ago, basically said, you know what, we're not going to apply the rules to public figures in the same way. Maybe in the case of incitement we will, but generally speaking, because of their understanding of the importance of robust public debate, we're not going to take down the content of public figures. So you actually have a little bit of a clash emerging among some of the giants of social media. But again, even answering that question requires us to answer a couple of fundamentals for ourselves. Right. One is, who should be deciding these questions? Like, are we comfortable with companies, private actors, doing this?
And you might have seen the story over the last couple of days about the amount of ads that the Trump campaign has already been buying on Facebook, which is, I think, apparently already over a million dollars, which, you know, is kind of chump change, generally speaking, but on Facebook that goes a long way, right? And they're not applying the same rules around disinformation that they might apply to other users. They're not applying, for example, the signaling and labeling that they might for regular users who might circulate this information. So it's that question of who decides, and the second question of what the rules should be. So those are basically the stories that I try to focus in on in the book. I don't talk about this case in the book, because it was just handed down last week. But hopefully this sort of opens up a conversation for us to think about these fundamental questions as we think about what we expect, and what we want freedom of expression to look like, online moving forward. So, that's perfect timing, that's good. Right. Well, thank you.

18:26

So why don't we start with the case, and it's a really interesting one. You talk about other European efforts in the book; I remember the Yahoo France case from way back. So Europe has always been distinctive in its approach, and I guess it's got reasons, a couple of things. So one is, you know, there's the politics here, which you discuss in the book, the fact that these companies are American companies, overwhelmingly. So you talked about Twitter, talked about Facebook, talked about Google. They're all California companies. And so there's a kind of political dimension to European discomfort with the power of these companies, and then also maybe a deeper philosophical, legal difference, in terms of the way that we in the United States take a very, in a global sense, extreme view about freedom of expression. And you talk in the book about the influence of human rights law, which is quite different. So let's just talk about that US-European difference. So why is Europe so different?

19:23

Yeah, it's a really important question to get out front. I actually think that the differences are not as extreme as we often make them out to be. And I think the fundamental problem right now is a kind of alienation that Europeans have towards these American companies dominating their space. So just to give a couple of examples: you know, for Europe, 2015 is sort of a turning-point moment in many respects, because it's the moment when, you know, sort of the policy regarding Syria had kind of come home to roost, in the sense that you had massive numbers of migrants or refugees fleeing from Syria, usually going through Turkey and coming into Europe. And in the summer of 2015, Angela Merkel says, we're going to accept 1 million refugees, which is a lot of refugees. And in many respects, if you think about German history, it's a remarkable moment. But what happens at that time when all of these refugees are coming in - like 4 million here? - Yeah, right, exactly, it'd be like us absorbing 4 million, which, like in this current moment, I mean, it's really striking when you think about it. So at that moment, at the same time that that was happening, Facebook was becoming, you know, extremely popular in Germany, not quite as popular as it is elsewhere. But if anybody has ever tried to use the internet in Germany, has anybody tried to do that? They have, like, ancient copper wires. I don't know, it's like the most advanced country. But anyway, so the foreign minister, or the Justice Minister at the time, starts getting a lot of emails and, you know, sort of comments and complaints from his constituents who are saying, there's this company, Facebook, that is on the one hand taking down images of nipples, which we don't care about, we're German. And I mean, they care about... anyway.
But on the other hand, this company is not taking down incitement against these migrants. Right. And so, his name is Heiko Maas, so Heiko Maas writes a letter to Facebook, and this letter is really worth people taking a look at, because you can tell how alienated he is, and how his constituents are, from this kind of governance by a foreign company. There's a sense in the letter that if this were a German company, they would be responsive, they would understand the concerns, they would understand German law, they would understand some of the things that you mentioned, like the issues around restrictions on, you know, Nazi paraphernalia, or the different versions of hate speech law in Germany. A company in Germany would understand that. I mean, I think Facebook could understand that, but they didn't care. And that moment for me is really telling, because I think in a lot of ways it's about much more than just a fundamental difference in understanding hate speech. It's a difference between, like, we are Facebook, we can do what we want, which really was their attitude at the time, versus a government that is facing, I wouldn't call it an existential crisis, but a real crisis of identity, and this company not being in a position to address it.

23:00

But doesn't it go further? I'm thinking of the right to be forgotten as well, which you discuss in the book, and which some of you may be familiar with. The case was a Spanish guy who had some bad news about, I think, his debts, and he didn't want it showing up in a Google search. And so, you know, core hate speech issues in a German context, understandable, yeah, there are differences there. Right to be forgotten just feels like a more, I don't know, it's different, it has a different quality. Yeah. And so does that change anything? I mean, it just seems like Europeans are very sensitive about these things, you know, like the example you just gave with this Austrian case. Yeah, very sensitive about these things.

23:40

They definitely are. And right to be forgotten is kind of the counterexample to the German one, in one sense at least. It highlights how much Europeans care about privacy, right. And I think Americans do too, but privacy in a way that American law doesn't reflect in quite the same way. Privacy in European law is not just a fundamental right that has sort of developed over time. I mean, in our system, we see it as developed through constitutional protections; there's no explicit right to privacy. In European law, you have the right to privacy, and not only the right to privacy in this vague sense, but you basically have a human right to data protection, and European law has kind of expanded that over the years. So maybe say a word about GDPR.

24:36

At some point,

24:36

yeah. So right now, this is to the direction of your [question], but I think this also does go to some of the alienation from these American companies. Because, on the one hand, Europeans are very focused on this as a matter of sort of individual experience. Actually, Germany is a perfect example, because this is in some ways most deeply felt in Germany in particular, because of the experience of the Cold War and East German surveillance. But generally speaking, Europeans are very focused on the protection of their data. And that went so far as, in the GDPR, the General Data Protection

25:18

Regulation, you know, went so far as to basically give a priority to data protection over freedom of expression. It doesn't say that explicitly, but in a sense that's what the right to be forgotten case is about. And so, for those who aren't familiar with it, the right to be forgotten case, and this is sort of an underlying theme of the book also, is basically a case where the court, again the European Court of Justice, the same court that issued the decision I was talking about at the beginning, basically says that, you know, where an individual sees content on a search engine when that person has done a search of his or her own name, so, you know, I search David Kaye on Google, and if the first thing that comes up is something that I consider to be no longer relevant, essentially, that's loosely the standard, then I can apply to Google to remove that content. Right. So what's amazing about that, to me, is that the courts are largely out of that discussion. It's basically a legal question that Google is now evaluating. I mean, Google has become the evaluator of these questions. And it forces Google, or any other search engine essentially, to make a decision without really any guidance from the freedom of expression perspective. And what that leads to, in both the GDPR and the right to be forgotten case, is this pressure on the companies to take down content because of privacy concerns, without really integrating any kinds of priorities, or even factors, around freedom of expression. I mean, they're there, but they're minimized to a really significant degree.

27:12

Well, just to underscore that, one of the things you point out in the book that I think is really important to appreciate is that if they fail to take something down, whether it's a right to be forgotten case or some other kind of case, they may face a fine, a sanction, something. And so the incentives are all toward takedown. Yeah. And I'm just restating what you said, but there's no real incentive to keep things up. Yeah. Other than some generalized understanding that content is what keeps their money flowing. Yeah. But any one particular thing doesn't really matter. So there's a huge incentive to constantly take down content,

27:45

right? That's exactly right. And just to broaden it out a little bit more: that has been the pressure across the board, on all sorts of content. So you've got, on the privacy side, right to be forgotten. On the terrorist content side, you have this very strong push, which is understandable given terrorism in Europe over the last several years, but a very strong push to take down content that is, unfortunately, pretty loosely defined as terrorist content or extremist content, which is even worse in terms of its open-endedness. And there was a third area, around hate speech. So Europe, basically the European Union, and several of the largest American companies reached an agreement, it's called the hate speech code of conduct. And this puts, again, not legal pressure, but political pressure on the companies to take down content as hate speech. And then you have German law, something that was adopted a couple of years ago called the NetzDG. Again, I won't say it in German; it's the Network Enforcement Act, which does exactly what Kal's suggesting, which is it puts pressure on the companies to remove content that violates certain specified German laws, and if you fail to do this in a systematic way, you could be subject to fines that reach up to 50 million euros. So with all of that happening, you see all of this pressure. And I guess one of the reasons why I think it's important for Americans to understand this is that these are all companies that work at scale. Right? For them, it doesn't make a whole lot of sense to say, we have these rules that apply in Europe, and we can have different rules in the United States. I mean, I think that probably should be the norm. But the way they operate is: well, if we're being pushed in this really huge market in Europe to evaluate content in this way, we should just do this worldwide.
Right, which means we should take down more content according to, maybe, German law, and we'll just make that our international standard. I mean, you can think about it almost like California emissions standards, right, which are great for us in California. German hate speech standards are like emissions standards: great for Germans, but they may not fit the rest of the world. But that's, I think, the pressure that we see. And that's actually the movement that we're seeing. If you look at the development of the companies' own community standards, their own rules, they're moving in an ever more consistent fashion towards something that looks a lot more like European content standards than American. That might not be a bad thing. I mean, there's nothing inherently wrong with European standards. It's just that they're not the standards that we should expect to apply in every market for these companies around the world. Right.

30:49

And it's ironic, because for so much of the 20th century, in a totally different context, Europe was always very upset about American extraterritorial regulation. Yeah, and now suddenly they are the biggest regulators, completely. Let's pivot to Asia before we open it up. So you don't talk a lot about China in the book. Yeah. And, you know, there are some obvious reasons. But it's interesting to think about whether we are, as some people have suggested, moving towards essentially three kinds of internet. So there's a kind of North American, or whatever, hemispheric one; there's a European one; and then there's a Chinese one that's totally, totally different. But even putting that aside, it would just be interesting to hear you talk a little bit about Chinese governmental control. Many of the American companies don't operate in China; obviously Twitter and others are not there. And China has a whole panoply of very large companies that could compete with American companies. They do obviously have much more control; it's a much more heavily censored internet. So just, how does that fit into your framework? Yeah, what difference does it make that there is this enormous Chinese market?

31:55

Yeah, it's so

31:59

it's complicated, right, and it's complicated for several different reasons. One is one that hasn't actually happened yet, although you started with the Yahoo case, right. So the Yahoo case that Kal referenced was in the late 90s, when basically Yahoo, which had a pretty significant presence in China, received some demands from the Chinese government to hand over user data. Right. So this was data on people who were essentially critical of the Communist Party, and Yahoo turned over that data, and those people were arrested. And in the United States at that moment, that was a pretty big deal. Tom Lantos had initiated some hearings on it, and if you're really interested in going back to this particular moment, that Tom Lantos hearing with Yahoo is fascinating to watch. Because you realize that a lot of the issues that we've been talking about, that I haven't talked about here, but that we've been talking about for the last several years as a matter of public policy, were part of the internet question way back then. And the focus was on China. And in fact, if you look at the development of sort of the movement to impose human rights standards on the companies, initially it wasn't about whether those companies should evaluate the human rights of their own users, in sort of thinking about the rules that apply to content, but rather about what the companies should be doing when they get authoritarian regimes making demands of them that could put people's lives at risk. It was actually a very American kind of thing, because we don't speak human rights in the United States, right? It's just not our vocabulary. But we do speak it when we talk about other countries. That's my little whining hobbyhorse. Okay, so. So China. So I think there are two different problems on China.
So one is the big massive one that I don't deal with in the book much at all, which is that China imposes very strict regulations on what users of, say, WeChat or Weibo or Baidu, or any of the other platforms in China that are Chinese, can say, post, like, etc., right? So they have upload filters. If you're in China and you try to post something about Tiananmen, which is just, you know, the most obvious example, nobody will see that, because the filter will take it down immediately, before it ever becomes publicly visible. Right. So it's a very strict form of censorship with respect to certain kinds of content. I mean, the Chinese internet is also (and I don't speak any Chinese; I'm just taking this from people like Molly Roberts at UC San Diego, who does a lot of great work on this) a robust internet. It's just that if you want to talk about the Communist Party of China, or you want to talk about organizing, forget about it, you can't do it there. So one risk is that, as the American companies get involved in a place like China, they will face the same pressure that the Chinese companies face. So there's that fundamental question of whether they should be kind of collaborating with that kind of censorship. And then, related to that, again going to extraterritoriality, I think you're going to have to update your book and have a European flag on it instead of the American flag. And we should read Kal's book; it's Does the Constitution Follow the Flag? Very good. Yeah. So the issue for Chinese companies is that their regulation isn't territorial, right. So if you are a user of WeChat and you're in the United States, right, WeChat will still take down your content. It might not be as rigorous, but the rules will basically apply wherever you are. And we've seen some of this.
Actually, I haven't seen the primary data on this, but around TikTok there's been some question about the way news about the Hong Kong protests over the last several months has been reported around the world, and what people can and can't access there. It would be great to know that.

36:32

So that's one of them. TikTok is owned-

36:35

It is a Chinese company, right. Exactly.

36:37

It's not, I think, widely appreciated in the US that it is, but it is, yeah.

36:40

So as we start to use that platform, I think it'll be interesting to see how the rules apply. So one problem is what happens as the American companies get into China. The other, and again, I don't deal with this at all in the book, is the question of China basically separating itself off from any kind of discussion of human rights norms. Our discussion in the West is really broad: it's not just the US or North America and Europe, it's also South America, much of the Middle East as well. Many parts of the world are having this discussion around what it means to enjoy rights online. People in China, particularly people in Hong Kong, have that discussion, but it's just a different kind of discussion because of the degree of censorship that is so built into the system. Yeah.

37:42

Okay, good. So we have time for questions. I will start with questions from students. So students, go ahead.

37:51

I was wondering, so you talked about how many of these European countries are very uncomfortable with Americans basically deciding on the regulations for speech. I was wondering if any of them are considering the Chinese route and just blocking Facebook and creating their own separate platforms with their own regulations?

38:09

Yeah, so

38:10

there are European leaders who would love to

38:12

see that. The problem is, as with all of the discussion in the United States around "delete Facebook," that the platform is really popular. I mean, it's funny, right? Facebook is not as popular as it used to be; that's slowly happening among younger users, in schools and things like that. But in Europe, and it really does vary from country to country, it's popular. So while on the one hand there's this pretty serious alienation toward the American companies, at the official level and among activists, the platforms are still popular. And the fact is that there haven't been European companies that have developed a real alternative to Facebook. They exist in some smaller markets in Europe, but not really across the continent. So the European governments can make that argument, but they don't have something to offer in its place.

39:17

Just to follow on that question: do you think, if there were a European Facebook, a different competitor that was popular, with the same problems, the European governments would act differently because they felt it was their company and somehow had a relationship with it? You kind of implied this.

39:35

I think so. I mean, when I was doing some of the interviews for the book, I heard from so many, particularly law enforcement authorities, that they had really no ability to reach out directly to Facebook's law enforcement unit, whereas they knew that American law enforcement, you know, even a small-town police force somewhere in the country, had access that Interpol or the French law enforcement did not have. So I think the assumption is, and it's probably right, that if they had their own, or, you know, we talked a little bit about breaking up Facebook, if there were a global breakup of Facebook, if there were a more localized Facebook, they would have that kind of in with the company.

40:31

Other students? Right here in the back. Can you discuss who you think is most harmed by limiting so-called harmful speech?

40:40

Who is most harmed by limiting harmful speech? Because, yeah, the acronym lady is protected, and that's one argument, but as the equal and opposite reaction, who is harmed by not being

40:49

able to criticize?

40:51

Acronym lady is the Austrian?

40:52

Yeah. I like "acronym lady," that sounds like a superhero.

40:58

So it's a really good question. Who's being harmed? It's also a complicated question, because it depends on a kind of prior. What kind of political debate do we value? What kind of responsibility do we want to put in the hands of companies to monitor that kind of debate? And what kind of rules from national law do we want to apply to these spaces? So there's a whole bunch of layers of issues built into your question. I would say one of my fundamental concerns, and this is based on the development over really the last four or five years of national law, European law, and, you know, community standards, is that the companies, because of the issue that Kal mentioned before, because of the incentive to take down content that will be seen as either hateful or defamatory or whatever it is, will be too aggressive in taking down content. And I think what we've already started to see in the German context is the kind of sweeping up of legitimate content. So legitimate, but maybe robust, even uncomfortable, debate gets all swept up together with real hate speech or incitement. And actually, one of the problems is that the companies are so closed that researchers have a very hard time getting access to the data, so that we can actually look and say, well, what is it that you're sweeping up? Part of that is because the companies in Germany, so Facebook, Twitter and YouTube in particular, are moving all of their enforcement to community standards. So they're saying, oh, there's not that much hate speech under German law, because we just moved that over to community standards. So there won't be any cases. There's no case law. None of that's happening. So explain that a little more. Yeah.
So what's happening is, because you have this NetzDG law, this German law that says companies need to take down content that's illegal under German law, the companies have two different options. One option is, as content comes in, they say, okay, we're going to hire, you know, 50 German lawyers to be like our court, to evaluate German law against this particular content. That's one thing they can do. The easier thing for them to do is to say, oh, we'll just change our community standards and make this all about evaluating our own platform rules, and then we don't really have to worry about German law, because the two now overlap. And whereas under the German law you might develop some kind of case law and transparency, you'd have a decision under German law, maybe somebody would object to it under German law, here you don't have that. Under community standards, they don't really disclose what they're taking down and what they're leaving up, and you don't have a sense of whether it's because of German law or something else. So the fundamental problem is that the pressure to take down content leads to this kind of over-regulation of the public square. And I should emphasize, I don't have a huge problem with a national government, assuming the basics of rule of law and democratic governance, and that's a lot to assume these days, so I get that, but in the ideal world, I don't have a problem with a government saying these are the rules that we expect to apply both offline and online, and our courts will evaluate when you've crossed that line. What I really object to is doing it in a very opaque way and assigning the responsibility to the companies to do it. That, for me, is a real fundamental rule-of-law problem.
And it has all sorts of ramifications that we can also talk about, that I talk about in the book: locking in the power of these big companies, because nobody else can afford it; outsourcing public discussion and public law to private actors; all sorts of problems that are embedded in that.

45:40

That's a long-winded answer to a very good question.

45:44

You do stress a lot in the book the fact that the process you just described basically takes the power that a democratic government ought to exercise, about what's appropriate and inappropriate, and hands it to a company that then does it behind the scenes. Maybe you could just say, you kind of alluded to the issue of breakup, right now we have, you know, Elizabeth Warren and many others across the political spectrum saying we need to break these companies up. Is that a proper solution to the problem that you just pointed to?

46:15

Yeah, I mean, I don't know if it's a solution to this problem. So, to me, it's- Or even just, would it help,

46:21

without being a solution?

46:22

Yeah, see, I don't know if it would help. I really don't know. Because the breakup question, as Chris Hughes laid it out in the New York Times piece maybe six months ago or so, was basically separating Facebook proper from Instagram and WhatsApp, which might make a lot of sense. I mean, Facebook has a ton of power, and the truth is, around the world, Facebook is still popular and still growing, but WhatsApp and Instagram are the real drivers of public debate around the world. WhatsApp especially, and WhatsApp is, you know, end-to-end encrypted and problematic for a lot of different parts of the world, in particular India. But what breaking that up doesn't really get at is that Facebook will still have a pretty large presence in all of those jurisdictions. And so there are two things that struck me when Chris Hughes wrote that. One is, well, how do you deal with the problem of content? Even if you're just thinking about Facebook as the Facebook platform, how do you deal with content in not just the hundred and some odd countries where they have a market, but down at these micro, hyper-local levels, where even the national government doesn't have a good sense of, you know, what the code is around hate speech and so forth? And the second part of the problem, I think, is this: Facebook has been the cash cow, the ATM, basically, for WhatsApp, because WhatsApp is free. WhatsApp is great, I love WhatsApp, and this isn't about its usability or anything like that, but because it's free, it's become a really useful tool, in many respects a fundamental tool, for people around the world. You know, most people around the world that have a smartphone use WhatsApp to communicate, to do commerce, all sorts of things.
If you break these up in the United States, what kind of pressure does that put on WhatsApp to find a different business model, which might be more of the attention model that, you know, Tim Wu has written a whole book about? That model is one of the fundamental problems of this whole economy.

48:38

Great. Other questions?

48:40

Yes. Right here.

48:42

You mentioned something about how Facebook cannot determine whether something crosses the line, right? Oh, yeah. Okay, so my question comes from a country like mine, which is India, and it is multilingual. Facebook exists there in at least 10 languages. And all over the world, I'm not hazarding a guess, but it might be something like fifty to a hundred languages in which Facebook is transmitted. So how can Facebook determine all those dynamics? Yeah, you should have read my book.

49:21

This is a fundamental question, right? It's a global company. The same is true for YouTube, less so for Twitter, which is important in the US but doesn't really have the same kind of global presence, even though it may be important. But you're asking exactly the right question, I think. You have a global company, and even if you have global rules, which I think they should have, rules that apply across the platform, and those rules should be human rights standards, because that would put them in the mindset of thinking about protecting user rights and protecting the public, how do they do this in more than 50 or 100 languages? Because people also post all sorts of content by kind of tweaking the API of Facebook itself, or just by doing, you know, screenshots of language and posting those. There's all sorts of ways to circumvent the basic linguistic limitations of Facebook. In fact, this was a very serious problem in Myanmar. It gets a little bit technical in a way that I can't even describe very well, but basically Burmese is a very difficult language for the platform to moderate. Companies say moderate, governments say regulate; it's the same basic thing, right? So it's very difficult for the companies to do that. So imagine a place like India, and imagine they had kind of perfect insight into the language. I think I heard Mark Zuckerberg say there are 30,000 content moderators right now, that is, humans working as subcontractors around the world who look at content that gets flagged to them, and they say either delete it or ignore it. So let's say there's 30,000 globally. Maybe that's, I don't know, 500 in India, for at least one language. So the first question is, what can they understand of the language itself?
Maybe they could do that. They could probably train their algorithm to flag certain kinds of content, particularly around imagery, that is problematic. But how do they get down to the level of understanding the kind of code that people use? Because Facebook is very colloquial, and it's hard to train for that. There's been some research on how hard it is to train algorithms, to train the machine learning, to be responsive to actual hate speech, just because of the variety and the changeability of hate speech. So what do you do in those circumstances? It can't be that an algorithm alone does the work of cleaning the platform of hate speech. You have to find some way to provide some kind of local ownership of the platform. And that, to my mind, is the big challenge for the platforms moving forward, because they need to figure out exactly the problem that you're describing. How do you solve that problem? Frankly, they may not be able to solve it. And that leads to questions about their role. It's a more fundamental question than just regulation: what do we expect the future of these platforms to be over the long run?

52:55

Yes, hi. I want to go back to Asia. I'm kind of interested in your thoughts on some of the trends in some of the other Asian countries with regard to freedom of speech, privacy, and mass surveillance. I'm thinking about some of the recent laws passed in Thailand, Cambodia, Vietnam, and Pakistan, which are really just cybersecurity... Yes. ...or cybercrime laws that are written to do sort of the opposite of what you're trying to do. Right. So what's been happening is, most of the book is really about democratic countries and the struggle to figure out some basics, like who makes the rules and what the rules should be. But I do talk in one chapter about authoritarians, particularly countries that Freedom House puts in the category of partly free. So a country like Kenya, for example, or you could think of some of the countries in Southeast Asia. Singapore is a good example, where there's a certain amount of public freedom, but it famously has some pretty harsh speech rules. And Singapore actually just adopted a law earlier this year that says, basically, if you publish disinformation or misinformation on a platform, any member of the government can order the company to take that content down, and there can be significant fines. It's quite a draconian new law. So you see that happening. And then you see exactly what you're describing, which is that cybersecurity is like, you know, the blob. It's kind of covering everything. And cybersecurity there isn't what we think of as cybersecurity in the United States or in Europe, which we tend to think of as protecting public processes, protecting individuals against the theft of private data, stuff like that.
I mean, really, the cybersecurity laws have been encompassing hate speech and extremism, and they're giving governments the power to order the companies to take down content readily. So it's a completely different dynamic, and it's extremely problematic. There aren't a lot of voices external to those countries, like other governments, that are calling them on it. Occasionally it comes up through the human rights system at the UN, but not very often. The one thing I would say is that the situation for users in a lot of these countries is really different from the situation in the United States. So whereas we've got the hashtag, end Facebook or delete Facebook, in a lot of the countries that you mentioned there's state media, and so Facebook is the place where people get a diverse set of sources of information. And actually, there was a story in the New York Times just this morning, kind of buried in the business section, about how New York Times journalists use Facebook and Twitter and others as a tool to understand what's happening in the countries where they operate, particularly in authoritarian places. So the dynamic is totally different. The governments are pushing hard, and activists and users are much more committed to using these platforms than in the United States or in Europe.

56:23

I think that's a good place to stop. Unfortunately, we're at the end of the hour, but you're going to stay, yeah, to sign books and maybe answer some questions.

Transcribed by https://otter.ai