Herald: I'm very happy to have these guests here. We have Thomas Lohninger from epicenter.works and Chloé from EDRi, and they are talking about content takedowns: Who cleans the Internet? The EU plans to sweep our freedom of expression under the carpet. I'm very happy to announce that they're here. Give a big applause to Chloé and Thomas Lohninger! *applause* Thomas: Thank you and welcome, everybody. Yeah, let's get started. We're going to talk about content takedowns and content moderation, but also a wider debate you can call platform regulation. The next 40 minutes will be about what, in my prediction, after doing digital rights for 10 years, is probably the biggest digital rights debate we will have in the European Union. But before we get into that, we have to start with the basics. And the basics in this case are a weird, complicated term called intermediary liability. What are intermediaries? You can think of that if you go back to dead trees, to classical media. If you publish a newspaper, you are liable for every article that somebody writes in there. So a publisher has to take responsibility for the content that they are putting in their paper. For classical media that works, and media and liability law around these things has developed over decades and centuries for radio, for television, for all types of media. But that logic, of course, did not work when the Internet came about, and there have been several cases in the EU in the 90s where the police thought an ISP is just like a newspaper: they have to be held liable and responsible for the content of their users on their service. And so several ISPs, in Italy, to the left, and in Austria, to the right, were raided and their servers were confiscated. Thousands of websites of companies were offline because the police thought: "OK, we have to arrest the Internet" and took all the servers away. There was even a famous case in Germany around CompuServe, where
the CEO of an ISP and host was actually charged with criminal charges over pornography, because one of their customers made content available that was illegal. All of these cases, of course, created a huge uproar. In Austria, the Internet was even shut down for a day. And then the EU reacted to that. Chloé: As a result of this legal uncertainty, rules were adopted at the European level, and they were formalized into what we now call the e-commerce directive, which was adopted in 2000. It basically gave Internet companies legal protection against illegal activities that are taking place on their systems. The rationale at the time was to ensure that a unified digital market would develop and expand in Europe, at a time when very few people had access to the Internet and the global digital corporations that we know nowadays didn't exist. One of the key provisions of this directive, Article 14, says that companies are not responsible for illegal content that their users are generating, unless *emphasis* unless they obtain knowledge of it. In such cases, they have to act as quickly as possible to remove it. And that was, until now, the European model. Thomas: Yeah, and you can also think about intermediary liability as a sword and a shield. That analogy mostly holds true for the U.S.; in Europe it's a little bit more complicated, but the basic principle still holds. So, intermediary liability protections, these safe harbors that Chloé just explained in the e-commerce directive, act as a shield: you are not liable, your users can do whatever they want. When a problem comes up, you have to deal with it, of course. But you don't have to think through every eventuality of bad things that could happen when you start a new service. That shield was very influential and important for creating the diversity and innovative capacity that we have witnessed over the past decades in the open Internet. But intermediary liability also gives companies a sword.
They can moderate on their own. They can proactively moderate, they can moderate based on laws or terms of service, and they can do that to their own choosing. In the U.S., the debate right now is a little bit: if you don't use your sword more often, then we'll go after your shield. Now, we also want to show you a brief video that summarizes this concept of intermediary liability with the most famous and most painful example that we had in Europe: the copyright directive. This is the video we created with Alexander Lehmann for the Pledge2019 campaign. Video voiceover: The open Internet we all know and love, where everyone can participate, will soon no longer exist. EU politicians are about to pass a law which they say is supposed to combat unauthorized copies. In pursuit of that goal, they're about to make a fundamental change that would affect all of us. It all comes down to the main question: who is liable for files illegally uploaded on the Internet? So far, the person uploading illegal copies is also the one responsible and liable for the content. The app or website they do this on is innocent, unless they're made aware of the copy and do nothing about it. It's like when a crime is planned over the phone: that doesn't make the phone company responsible. Today's legislation makes sense. Nowadays, everyone can communicate and share with the whole world. It's simply impossible for websites to manually review each one of the billions of images, videos, texts and audio files we post online. But that's exactly what the politicians want to change. In the future, as soon as something goes online, the site would be just as liable as the person who posted it. So, you may be thinking: "Why is that a problem?" Because then the only way for websites to operate legally would be to verify every single post by every single user, ensuring that it doesn't infringe any copyright worldwide. If they cannot guarantee that, it cannot go online. But it gets worse.
This new law is not only demanding something technically impossible. It's also threatening one of our fundamental rights, namely freedom of expression without unjustified censorship. No program can tell for sure whether a parody, commentary or remix is legal or infringing on copyright. Making these decisions today takes lawyers, judges and long trials. And yet websites will be expected to somehow make them automatically, millions of times a day. To avoid massive fines, the platforms will have to filter extremely strictly, and a lot of perfectly legal content will get caught up in these filters. But unfortunately, technical feasibility and censorship are not the only problems. Before this bill has even passed, the EU Commission has already presented an even further reaching law. This one would require filters for so-called extremist content, with each post having to be checked with law enforcement agencies. We're talking about nothing less than EU-wide Internet censorship machines. What could possibly go wrong? If we don't act now, we could find the Internet in Europe scrubbed clean of anything challenging, surprising, weird or enlightening. The big winners of this new law would be multi-billion dollar companies like Google or Facebook, as their budgets would still allow them to implement the new guidelines. All smaller platforms would only be able to keep offering their services if they used the filter systems provided by the big corporations. The smaller companies would have to trade in their data, which would make the big players even more powerful. All of us would be the losers of this new legislation, because the structures that make the Internet so diverse today would die tomorrow as a result of this law. If you want to stop this dull and miserable filter net, you have to take action now. These websites tell you how. Call your representatives in the European Parliament today and tell them that you will only vote for them in the 2019 European elections if they vote against upload filters.
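[Editor's note] The video's claim that no program can tell parody from infringement comes down to a simple point: a matching filter only sees bytes, never context. A deliberately naive sketch (an editorial illustration in Python, not something shown in the talk; all snippets are hypothetical) makes the over- and under-blocking trade-off concrete:

```python
# Toy "upload filter": block any upload containing a protected snippet
# verbatim. Real filters use audio/video fingerprints rather than byte
# matching, but the contextual blindness is the same.
# All data here is hypothetical.

PROTECTED_SNIPPETS = [b"famous song chorus"]

def naive_filter(upload: bytes) -> str:
    """Return 'blocked' if any protected snippet appears in the upload."""
    if any(snippet in upload for snippet in PROTECTED_SNIPPETS):
        return "blocked"
    return "published"

# The intended case: a verbatim re-upload is blocked.
assert naive_filter(b"listen: famous song chorus") == "blocked"

# Over-blocking: a (legal) parody quotes the same chorus and is
# indistinguishable from infringement at the byte level.
assert naive_filter(b'my parody of the "famous song chorus"') == "blocked"

# Under-blocking: a trivially altered copy slips through.
assert naive_filter(b"famous song ch0rus") == "published"
```

The filter cannot represent the legal questions at all (parody, quotation, commentary); it can only answer "does this match", which is exactly why strict liability pushes platforms toward over-blocking.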
Thomas: Yeah, that vote, of course, we unfortunately lost, by five votes. Five MEPs we could not convince, and henceforth the copyright directive was adopted and is now on the way to being transposed by EU member states. But the discussion doesn't stop there. Before we come to that, though... Chloé: Just to summarize again what you just said: there are some problems now in the online ecosystem that make certain people believe that the old rules, the rules from the e-commerce directive, need to be fixed. And why is that? Today's intermediaries are very different from the ones of the 90s, obviously. We are now witnessing what we call the platform economy, which is a centralization of the net around a few players. Most of what people do online today is mediated through intermediaries, and those intermediaries are a few giant corporations that dominate the whole ecosystem. And the reality is, there are very few possibilities for credible challengers to enter this market. The second problem is that there are many, many more people online today than there used to be. Everybody can post content online 24/7 and can potentially reach a global audience, and that for free. And this is quite overwhelming. This is how it translates into numbers, and this is how the online communications landscape looks today: millions, even billions of users whose online activities are mediated by a few platforms, and who post an enormous amount of content per day, which makes it impossible even for those powerful companies to control and assess each piece of content that is posted. And that's a big problem: these billions of users have their entire communication subjected to monolithic speech regulation policies, and it is impossible for a single set of rules to encompass and accommodate the diversity of cultural norms in the world. What's more, those platforms not only host content, they actually curate it: they push it, they delist it, they demote it. They decide, for their entire user base, which voices get heard, which viewpoints get visibility and which do not. And because they host most of our communication, their content regulation rules, the so-called community guidelines, become some sort of constitution that actually regulates the speech of a quarter of the world's population. That's a huge problem. And they start behaving like they were actually the ones making the law. Recently, Facebook created an oversight board, which is supposed to make decisions on content moderation cases. In this way, they are acting more or less like a Supreme Court that decides and interprets Facebook's terms of service. Thomas: And on that point, it is important to understand the core concept here. The laws that we are accustomed to, which regulate which speech is acceptable or not, are always contentious. The case law of Europe's highest courts even says that fundamental rights protections, freedom of speech protections, are particularly important for challenging speech that is still legal: you know, very hateful statements, even statements that create an uproar, that might even spark a demonstration. These are exactly the cases where we need to protect the freedom of expression. Yet most of the content moderation decisions that happen today are not even based on law. It's fair to assume that around 80 to 90 percent of content moderation decisions are actually about the terms of service of each platform. And those are not laws: fundamental rights protections do not apply to these texts that companies have written themselves and probably change every week. That's so important because it also changes the dynamic of any legislation that comes further down the road. But there is an ample amount of cases where platforms have acted really irresponsibly. TikTok, the Chinese social network for short videos that's particularly popular with the younger generation: The Guardian and Netzpolitik.org recently leaked the content moderation
guidelines for the humans that are moderating the content on TikTok. What they found is that this platform intentionally curtails the reach of people with disabilities, with autism or Down's syndrome, or people that are not thin. All of these people are capped in their reach, so their posts and their videos never actually reach a wider audience. Chloé: And when platforms actually rely on automated tools, on filters, to do the content moderation, it doesn't get better. Even when they try to do the right thing, which is, for example, fighting hate speech against people of color, it ends up with black people being the ones most censored, rather than the violent racist speech that is targeting them. And that's because the technology is unable to understand the nuances of every language. That was the case with Twitter here. Before presenting to you what's coming up at the European level, let's have a quick look at what happened in the past years, because both the EU and its member states have started to lead a true crusade to clean the Internet, very quickly adopting legislation to tackle all sorts of problematic content. Thomas: Yeah, and of course, the copyright directive is first and foremost there. We really wanted to win this fight, not just because of Articles 11 and 13, now 15 and 17, so upload filters and the ancillary copyright for news publishers, but also because we knew that in the upcoming fight around the Digital Services Act, around the intermediary liability debate, it would be a bad start if the copyright directive went down as it did. The other one, which was mentioned in the video, is the regulation on terrorist content online. That's still up for grabs and will probably be adopted next year. The Audio Visual Media Services Directive is a particularly nasty piece of legislation, because it is a law that doesn't say companies should do X, Y, Z.
It is a law that says companies should have terms of service that do X, Y, Z. So it is kind of outsourcing, or privatizing, things like the moral development of young people. And then we also have two pieces of soft law: a code of practice on disinformation and a code of conduct on illegal hate speech. Those are not even legislative acts. It's basically the Commission sitting down with Facebook, Twitter and Google and saying: you really don't want us to regulate you, do you? So just follow these rules and come up with a self-regulatory scheme, and then we'll leave you alone. Chloé: And member states have not been resting on their laurels either; they've been very prolific. The first one to adopt its own anti-hate-speech law was Germany, and it was actually copy-pasted by a lot of its European counterparts. All those laws are currently under debate in their national parliaments and governments: there are copycat laws in France, the UK is talking about it, Ireland and also Croatia. One of the common denominators of those laws is that they shift the responsibility to decide what to block or not online onto the shoulders of companies. And that's very convenient for European governments, because companies do not have to respect human rights law like the EU Charter of Fundamental Rights, contrary to those very same governments, who do have to abide by it. A very, very convenient way for them to shrug off the difficult task of balancing the fundamental rights at stake. So what is the playbook of these laws? Again, they push content takedowns to be based on terms and conditions, so contractual rules rather than state law. That's a very big problem for the rule of law. And some of those laws go even beyond actually illegal content: they also cover harmful content, content that is more or less undesirable. What is undesirable in the eyes of the legislator? They don't give a legal definition, obviously. And what else? They also incentivize companies to act very quickly in making decisions on content, which obviously leads companies to use automated means if they can, so-called upload filters, to do the job very fast. And how do they incentivize companies? With high fines: if you don't comply with the rules, you get fined. Thomas: To summarize that again nicely: platforms are put in a really difficult position with a strong incentive for overblocking. Some piece of content is notified to a platform. They can either just delete it and be done with it, or start a quite complicated, expensive legal assessment: is this within the rules or not? And if they decide wrong, they could face the risk of penalties or even liability for that content. Hence you have a strong incentive for overblocking, and it is not really a legal assessment that they do. But to increase complexity for them further: it is not just about content moderation. This whole debate about platform regulation will also include e-commerce. There is a famous case where counterfeit L'Oréal products were sold on eBay. L'Oréal sued and ultimately lost. That case was in 2010, and the European Court of Justice upheld intermediary liability protection back then. But as you have seen with the previous dossiers over the years, both on a member state level as well as on an EU level, it is getting more and more difficult for us to hold that line. The CJEU, the European Court of Justice, decided in 2017 that yes, Uber is a taxi company and does not benefit from the liability protections. In the US we had SESTA, the Stop Enabling Sex Traffickers Act, a law that was adopted bipartisanly in the middle of the Trump administration in 2018: both Democrats and Republicans could agree on that law, and it again removed liability protections for hosters. Tumblr deleted around 17 million sex-positive and sex-educational blogs because of that, because they could not bear the cost, or were not willing, to actually go through that content on their platform. So they just massively deleted millions of blogs with a lot of content. And that is a sign of the kind of Internet we could end up with if suddenly every platform has to take responsibility before stuff even appears online. And this is not just a Western debate. Also in India the regulators are looking at this: there have been lots of consultations, and they are also proposing new rules on intermediary liability. That whole debate is global in nature. It's important to understand that every rule we make here will have repercussions in the rest of the world. NetzDG, the German law on content moderation, was copied by 17 countries already, the first of which was Russia. When we call for Facebook to respect the decisions of a court, people in Azerbaijan hear: "OK, that same court that blocks 100 opposition websites every month should now also decide about my Facebook posts." So it is quite complicated to get it right here. But I believe that Europe has the best cards to come up with a fundamental-rights-based solution to this intricate problem. Now, about this reform. Chloé: Yeah, what is coming up? Since 2000 it was quite silent around the e-commerce directive, but in the last decade there has been a lot of movement. Thomas already explained to you that there was quite some case law, but there were also several calls at the European level to reopen the text or to complement it with another, more specific, piece of legislation. Nothing really concrete came out of it. But now, with the new Commission in place since the first of December this year, and with its new president, it is quite confirmed that the review of the old e-commerce directive will happen, and it will take the name of the Digital Services Act. And to be honest, what will this future legislation say or contain? We don't know exactly yet, but there are several ideas on the table.
Among them, there is the possibility that the reform looks at the current definition of what an intermediary is: what do we call an intermediary, how do we categorize them, trying to update the old definition to take into account the new Internet companies that emerged in the 2010s, like Airbnb, for example. It will look at another core principle of the e-commerce directive, Article 15, which prohibits member states from putting an obligation on platforms to monitor all their content, to look actively for illegality. And this is a big question mark for us: will this principle be upheld or not? That's a big question, because the copyright directive already kind of started going against that very principle. And then it will also look at some obligations for platforms, for intermediaries, in terms of how they regulate their content, their practices, and how they do content moderation on their services. Thomas: Another thing, which is actually very welcome from our side: this whole list here is based on a leak from the Commission, from the working level. We actually have good people there working on that file. They've been doing that for over a decade and they have a really good understanding of the issue. The question is how the political side will deal with these good ideas that are being discussed. But yeah, rules on online advertisement are good, because we have to talk about the business model. Most of these problems are symptoms of the strong concentration in the market and of the few very dominant platforms that we have on the Internet today. So actually tackling the attention economy and the attention merchants, as Tim Wu puts it, is the right thing to do, also because the full scope of this phenomenon called targeted online advertisement has, I would say, not been completely understood. We need more science, and for that we need more data. Another thing which is actually really hopeful, I think, for the people in this room, is interoperability. Yesterday there was Moxie Marlinspike's talk, where he basically bashed the idea of decentralization.
What he is missing is that this is not about decentralization in the 90s sense. It's about interoperability. It's about making dominant platforms open again, forcing WhatsApp to establish a protocol through which competing messaging services like Signal or Threema can communicate with people on these other networks, so that competition can happen even in an economy that is strongly based on the network effect. Interoperability is not a Swiss army knife that can solve all problems. But if it is applied in a case-by-case way that also solves the privacy and security problems that come with it, I think it would be a really visionary thing for a European Internet that is decentralized and open, and not concentrated between China and the U.S. Another thing, of course, that many people in government want to see is accessibility of data. Many local governments and cities have problems with taxation of Airbnb, for example, which is refusing to cooperate to allow for these types of taxation. Forcing them to hand over this data is, I think, in general something to look at. And lastly, all of these rules will most likely be enforced by a new entity, by a platform regulator on a European level. Again, that's still up for debate. Some people want to see media regulators do more, or telecom regulators, but I would think that we'll see a new regulator for these tasks in the EU in the near future. And that is why the Digital Services Act is often dubbed the constitution of the Internet. That is actually a quote from somebody in the Commission working on that file. And to complete the picture of why this is important, let's look at a more mechanical thing: the lobby side, which Chloé and I are working on on a daily basis. The stakeholders that will be party to that debate are, of course, all of the U.S. Internet giants and Silicon Valley. It will be the whole of the European Internet industry: every telco, hoster, content delivery network provider. So basically, all of the actors of the General Data Protection Regulation will be at the table, with a stark business interest in that file. But it's not over. There is also a big opportunity for a copyright battle revival, with all the classical actors in this debate, like the classical media: think about newspapers, public broadcasters, publishers, a.k.a. the return of Axel Springer. Then you will have the entertainment industry, which represents more or less all the rights holders from music, cinema, etc., and is represented by an army of lobbyists. And then, obviously, the providers of upload filters, like Audible Magic, Facebook, Google, who have really big financial interests here. And it doesn't stop there: brick-and-mortar stores, taxi drivers, hotel owners, so everybody who is affected by the sharing economy, by the gig economy, will of course have a say, and particularly in national debates those are strong lobby organizations. Then it's us, the digital rights community. And I think we have to be essential in this debate, because in a way it is about the soul of the Internet: these questions about interoperability and liability could easily change the landscape of the Internet that we have today. So that's why we should care. Then, of course, human rights organizations: Amnesty International is waking up to these issues. Those actors have a different mindset, but I think we have to see them as allies. And lastly, every type of marginalized group, every anti-racism organization, feminist organization, and, as we heard, people with disabilities. So all of the things where we as a society currently disagree, whether something should be shown or whether somebody should be allowed to say something, mixed in a big bucket, stirred, and then: you global, private, profit-oriented company, please solve it for us. If you aggregate all those stakeholders, I would dub it not the constitution of the Internet, but definitely the mother of all these digital rights debates, just because the interests are so big. And that's why we will definitely not have an easy time with it. Okay, what's next? When should you be ready to act, more or less? This is all super blurry, but rumors hold that the first public consultation will be launched early next year, in 2020, in which all of you can participate. We would be very glad about that. The proposal of the Commission, the first draft, will probably be released at the end of 2020. Then the serious business begins, when the text reaches the two co-legislators, the Council of member states and the Parliament, which, to be fair, are probably going to take one or more years to find their own positions on the text, their own versions of the text. So we expect the negotiations between all the parties in 2022, and the adoption probably some years later. To summarize: the nineties have shown us that we needed to adapt the rules, because the Internet is not paper printed on dead trees. Then for a decade it was mostly silent. For the last decade we had to fight off attempts to open up this law, and now it's going to happen, so we have to deal with that reality. A good lesson to learn is that in European legislative processes, the earlier you come to the table, the more effective you are. The same letter that you could send one day before the final vote will achieve nothing, but if you send it months before, when the dossier's consultation has just launched, you can really bring ideas to the table that will end up in the law. That's what we have done at epicenter.works with platform regulation, which is an attempt to actually solve that problem in the form of a request for comments: it's an RFC-based website and you can comment on it. We will, of course, continuously improve these ideas, and hopefully we'll see some of them later end up in the law and the legislation that will be released. Then lastly, epicenter.works is a donation-funded organization. We have very strict rules on our fundraising and the work we do, which is that of a public watchdog. It's also not welcomed by many people, because we can't be bought; we actually always try to speak truth to power. And that means we are even more dependent on the support of many individuals, and a diverse group of supporting members is actually what makes this work possible. You can become one if you go to support.epicenter.works. Follow our work if you want to be updated on this file, if you want to take part in the many, many battles at the European level that concern all of you. Follow the best newsletter in Europe at edri.org/newsletters, and also the ones from epicenter.works. And follow us on the, unfortunately, Twitter channels that we have. Thank you. *applause* Herald: Thank you so much. Again, the contact details are here, and we have questions from the Internet and questions here in room Clarke. I would like to start with number two, please. Question: Thank you. So, when lobbying against some of the things that we will be lobbying against in the coming years, one thing I hear is that freedom of speech is a human right, but freedom of reach is not. How do you best react to that? Thomas: Sorry, could you repeat the question? Question: Yes. What people say is that freedom of speech is obviously a human right, but the right to reach people using the platforms is not. Thomas: Yeah, I mean, that is actually touching on a complicated thing, because it's a very simplistic view to think that it's just about content being published, or being taken down or deleted. In many cases it is about algorithmic curation: what actually ends up in the news feed. It is also about which content gets monetized, and when a user gets suspended. If content gets deleted, does it also get notified to the authorities? Often that content is also evidence of a crime: if you think about the documentation work being done in Syria, or the documentation of child abuse, for all of these things deletion might actually not be the most viable step to take. We wanted to talk about that, but because of time we didn't go into detail about the policy proposals that are all tackling this. And when it comes to the question of responsibility and algorithms, there will be a separate dossier that we don't know much about yet, but there are rumors of a regulation which might also be mixed together with this one or come in conjunction with it. We don't know yet, because those decisions haven't been taken. But you will surely see these questions also being addressed in the ongoing legislative term of the EU. Herald: Thank you. Okay, the next question from the Internet, please, Signal Angel. Signal Angel: Hello. One of the questions from the Internet is: does the law require Facebook and Google to snoop on data that is not visible to the public? Thomas: To snoop on data in the sense of monitoring it? Monitoring, yes. It's actually a good question. I mean, if we talk about a private WhatsApp group or a private Facebook group, it all depends on the national legislation. I only know it from Austria: if you have more than a certain number of people in the group, you have a public forum, and if, for example, you were to deny the Holocaust in such a closed WhatsApp or Facebook group, that would still be an illegal offense. But then the question is: where is the judge? If that gets notified to the platform, they should take action, and for a Facebook group, Facebook could do that and would probably also do it. If people do it in an end-to-end encrypted WhatsApp channel, they simply cannot, technically, because they are not party to that communication. I hope that answers the question. Herald: Okay, thank you. We have another question at microphone one, please. Question: Hello, I'm from the United States. I run a nonprofit transit Internet service provider, running about six percent of Tor network exit capacity. So I facilitate quite a bit of onion circuits, which are decentralized and end-to-end encrypted, so we can't see what's in them. How does this affect me, and how can I get involved, can I get involved in your consultation? Chloé: Yes. I think where you would actually fall nowadays is under the definition of mere conduit intermediaries, which have fewer responsibilities than hosting providers, because you're just providing the network, not really hosting anything on your own servers, if I understand correctly. The rules that I mentioned, with the knowledge requirement and the responsibility, the liability exemption, don't even apply to you in the same way; you're even more protected. As for the Digital Services Act, what's going to be coming up, I'm not sure, but I don't think there is intent to actually modify the rules that apply to mere conduits. It's more about the hosting providers, the big ones, the forums, the blogs and so on. Question: Okay, thank you. Herald: Microphone two, please. Question: With this question, for example, we see that where the responsibility falls is also a big part of the negotiation. You have given an interpretation of how this law affects the current way that publishing works on the Internet. But don't you think that this change could also incentivize innovation in the way we do publishing on the Internet, and maybe incentivize more decentralized approaches to publishing, merely in order to escape responsibility? Thomas: Of course. I mean, one reading would be that platforms just die off and everybody has to have their own website again. But where would the discussion then happen? In the RSS feeds we all have on our sites? Even something like Mastodon would be directly in the scope of legislation like this. Every type of aggregation, every type of news feed, every type of interaction even, would fall within the scope of that law. And that's why interoperability, I think, is a nice synthesis of this central-versus-decentral debate. Question: Absolutely. Okay, thank you. Herald: I'm very sorry, I see we have more questions, but unfortunately time is up. Please give another huge round of applause for Chloé and Thomas. Thank you so much.