Hello there!
Before you start transcribing the talk, please take another look at our style guide: https://wiki.c3subtitles.de/de:styleguide. If you have any questions, feel free to ask us directly, or reach us at https://webirc.hackint.org/#irc://hackint.org/#subtitles or https://rocket.events.ccc.de/channel/subtitles.
Please don't forget to record your progress in the progress bar on the talk's page.
Thank you for your commitment!

Hey you!
Prior to transcribing, please take a look at our style guide: https://wiki.c3subtitles.de/en:styleguide. If you have any questions, you can either ask us directly or write to us at https://webirc.hackint.org/#irc://hackint.org/#subtitles or https://rocket.events.ccc.de/channel/subtitles.
Please don't forget to mark your progress in the progress bar at the talk's website.
Thank you very much for your commitment!




======================================================================






Specifically, programmers and hardware manufacturers are getting more and more pressure from law enforcement to put back doors into their products. Kurt here is general counsel at the EFF; in 2007 he was named one of the Attorneys of the Year by California Lawyer magazine. He will tell us more about this problem with an overview of today's issues. Thanks.

All right, thank you. Thank you, everybody. Good afternoon, and welcome, welcome to CCC. I'm glad to see so many people here. What we're going to talk about today is the fight for encryption in 2016, and we'll get the slides up here and go. So, yeah, my name is Kurt Opsahl. I'm the deputy executive director and general counsel of the Electronic Frontier Foundation, a nonprofit organization dedicated to defending your rights online. Thank you. It sounds like there are a few people in the audience who are familiar with us.

So let's start with a bit of an overview of how things have gone over the last year in the fight for encryption. There is some good news: first of all, strong encryption, both communications encryption and device encryption, remains legal in most jurisdictions today, with more deployments than ever. These things are rolling out all over the world, and we'll talk about that a little bit more. The bad news is that governments are still at it. They're still trying to find ways to weaken encryption and to get plain-text access to encrypted materials, by pressuring companies as well as by pushing forward some laws, and a couple of bad laws have passed. Probably the worst aspect is some of the ways of getting around encryption: blocking technologies that try to make it so people can't use strong encryption by blocking at the network level, and malware placed on devices to get around encryption by attacking the endpoints. In some cases, governments have resorted to
arresting individuals associated with encrypted communication tools in order to put pressure on those companies.

Before we get into the nuts and bolts, I want to take a moment to give an overview of the encryption debate and how it has been going. As an initial matter, why do we love encryption? I think many people here are already convinced, but just to go over some of the reasons: encryption protects our data and our infrastructure, and it helps ensure both privacy and security. At the same time, you have governments who are concerned that they are, quote, "going dark." That is their euphemism for having less ability to access materials; they say it hinders their ability to conduct law enforcement and national security efforts.

Over the last year, we've had some good shifts in the debate. One I wanted to highlight was the move from a discussion of privacy versus security to a discussion of security versus security, that is to say, recognizing that encryption provides security in and of itself. This is particularly good news, because when a debate is framed as privacy versus security, or privacy versus safety, it is often privacy that loses out. That framing was a rhetorical device used by people who wanted to weaken encryption and get more access to encrypted communications. But people have come forward, talked to their legislators and policymakers, and pushed a shift toward understanding encryption as a benefit to security, which can be contrasted with the government's concerns about reducing security. There has also been a recognition among some policymakers that weakening encryption has severe consequences; it is not a cost-free improvement to law enforcement's abilities. And there has been at least some recognition of a core tenet of my organization: that code is speech, that there are First Amendment free-expression implications that come from regulating people's ability to use cryptography, and also that encryption and anonymity help enable freedom of expression.

Before we go on, another brief segue to put some perspective on things. One of the claims the government makes about "going dark" is that this is an unprecedented inability of government to get into places it can be locked out of. I want to go back a little in time, to the 1700s, when the locksmith Joseph Bramah created an uncrackable lock. He put what we might call an early bug-bounty program on it: 200 guineas to anybody who could pick his lock. He had it hanging up in his shop so people could come along and give it a whirl. It took 67 years for it to happen, and the lock picker needed 51 hours to actually break through it. During those decades, there was something that provided pretty good security and made it very difficult for government, even with a warrant, to get past the lock. Now, you could have implementation flaws; this is kind of the same thing with crypto today. A safe built with this lock might have weak metal or hinges in the wrong place. But nevertheless, like crypto today, it provided strong security, and society survived through those 67 years. So this is not quite as unprecedented as they would like you to believe.

All right. Let's turn to the big thing that happened last year: a very public showdown between Apple and the FBI. The FBI has for many years been seeking access to smartphones. They recognize that smartphones are an incredible window into people's lives, and they want access so they can look and see what people have been doing. They were trying to do this through the courts, with court orders, and there were two key cases that helped frame this debate. One was a case in Brooklyn, New York; the other, brought a little bit later, was in San Bernardino, California.

The first case, the Brooklyn case, was relatively routine. They were trying to get into the phone of an alleged meth dealer, a small-time local dealer. They weren't able to get into the phone, and they wanted the evidence on it; they had plenty of additional evidence to convict the guy, but they wanted a little bit more. So they submitted an application for an order to make Apple help them into the phone, relying on something called the All Writs Act; we'll discuss that more in just a second. It was not unusual for the FBI to go to a court to ask for this kind of access, though it was a little unusual to ask for a third party, an unrelated party, to assist in getting access to the phone. And the court did something very unusual: it asked for more briefing. Often these arguments are made by the government directly to the court without anybody else weighing in, but in this case the court said, well, I don't know if this argument really works, and I would like some additional briefing. So Apple filed a brief, EFF and the ACLU filed briefs, and we attempted to explain to the court why we didn't believe the All Writs Act provided the authority the government claimed.

The All Writs Act is a kind of catch-all law, actually one of the oldest laws in the US, originating in 1789. You have the language up here; it's a little bit convoluted, but it allows all writs that are necessary or appropriate, basically, to let the court do its job. So if the court had the power to do something, it could issue a writ in order to enforce that power. When it was written, obviously nobody was thinking about things like smartphones; it wasn't aimed at anything along these lines. It was just a basic tool, and it's pretty much the fallback position: if you have nothing else, you can always go to the All Writs Act and see if that will fly.

Well, while that was pending, a new case came up in February of 2016: the California iPhone case, out in San Bernardino, California, where there had been a horrific terrorist attack. Two people went to the office holiday party of the San Bernardino County health department, where one of them worked, and opened fire, killing 14 people before fleeing and eventually perishing in a shootout with police. It was a devastating attack and made a lot of news. Several months afterward, the FBI decided they wanted access to an iPhone that actually belonged to the county, San Bernardino County, but had been in use by one of the attackers. He had left it in a car, a black Lexus, so the case was actually styled as a search warrant for a black Lexus. A couple of months after the attacks, they wanted to get into this phone, so they submitted an application to the court, and that very same day the court turned around and issued the order to Apple. The court signed off on the government's proposed order without any modifications and issued it the same day, and it was a fairly lengthy order. This may suggest that not a whole lot of deep thought went into whether this was proper. Under that government-requested order, Apple would have to bypass the auto-erase feature, whereby after a certain number of failed attempts to access the phone it erases itself; allow passcodes to be submitted electronically; and impose no delay between passcode attempts. Basically, they were asking Apple to remove the features that were designed to protect against brute-force
attacks, so that they could brute-force the phone. Apple called this "GovtOS": they were being asked to make a new operating system, to be used just on this phone, for the purpose of enabling government access. The court left one thing open: Apple, if you want to challenge this as unreasonably burdensome, you can do so.

So indeed, Apple did challenge it. They asked the court to reconsider its order, and, somewhat unusually, Tim Cook, the CEO of Apple, wrote a big public letter about it. He considered this something that, first of all, they didn't have, but more importantly, something too dangerous to create. Apple filed its brief, and many amicus briefs were filed in this case, something like 40 or so, mostly from civil liberties organizations and industry groups, mostly on the side of Apple, though a few supported the government's position. One I want to highlight in particular came from the San Bernardino County district attorney, the chief prosecutor of the local area. He said they ought to be able to get access to the phone because it might contain a "lying dormant cyber pathogen," and he wanted to make sure we had access to that. The logic was a little unclear, because if it was really that dangerous, maybe we shouldn't get access to it. But nevertheless, he thought that was a reason to get into the phone.

FBI Director Comey started out by saying this was just about trying to get into one phone; it wasn't about setting a precedent. But later, under questioning before the US Congress, he admitted that it was about precedent, and that they wanted to set this precedent so they could access more phones. And then he asked: if there are warrant-proof spaces, what does that mean, and what is the cost of that? This is a reminder of the 67 years we had the uncrackable lock. We may have had warrant-proof spaces before, and we may have them again. It also fails to recognize that there is something fundamentally different about access to smartphones, because of how much of your life is on the phone. If somebody has access to that, they have more than just a little bit of evidence; they have a window into your soul, and protecting that is more important than ever.

So this became a major controversy; it became international news. A poll went out during this period, and it was about 50-50 on whether Apple should provide access or deny it. That may not sound like much, just people divided, but what is particularly impressive is that the government was making its case around a terrorist attack, with all the pressure associated with a national security case, saying this was vital for national security. And still they weren't able to get even a majority on their side. I think this was a lot less support than the government was expecting when they brought this case. Both civil society and industry came together to support Apple, understanding that this would be a precedent, and not just a precedent about accessing a phone, but a precedent about the government ordering you to make a new version of your software with security weaknesses in it.

So this was coming to a head, with a hearing scheduled in March of 2016. And then we heard from the Brooklyn judge. That briefing had been going on since October of the previous year, and in the ordinary course of things judges will take time to carefully consider; it might take a while. But not very long after the news really hit about the California case, the Brooklyn judge issued his lengthy and fairly detailed opinion, concluding that Apple did not have to unlock the specific device, and that the All Writs Act did not provide the authority the FBI was seeking.

Then, moving toward the hearing, we had sudden news the day before: the FBI said, well, we're exploring a way to get into the phone; we need a little time to check this out; can we get a delay in the hearing? This came out the day before the hearing. A lot of people I know who were going down there had already departed for Southern California; I was actually about to head to the airport myself when the news came in, and I was able to save myself the trip. It was a very surprising last-minute development. A week later, the FBI reported that, yes, they had gotten access to the phone, and the hearing was canceled.

We got a few details about this. It was an exploit that cost well over a million dollars. This was calculated because Director Comey said it was more than his entire salary for the 10 years that he is going to be FBI director, so people did a little math and figured out that it would be over a million dollars. And it was a hack that apparently works on the iPhone 5C and older devices. A key factor there is that the 5C doesn't have the Secure Enclave, and doesn't have the Touch ID feature, which requires the Secure Enclave. So it was apparently an attack that is defeated by the Secure Enclave, though we have very little detail.

The FBI withdrew the case after the exploit worked, and there was no ruling by the judge on whether their power under the All Writs Act extended this far. Shortly thereafter, the government also withdrew the appeal it had filed against the Brooklyn judge's order, saying that they had somehow obtained the passcode; according to news reports, the suspect apparently remembered and provided his code. What this means is that right now we don't have binding precedent on the question of whether the government has this power. There is the one decision out of Brooklyn that remains on the books, but that was a decision from the lowest level of judge, a magistrate judge; it is not binding on any other judge. If they had appealed and lost, and taken it up the chain, it would have become more and more of a binding precedent, but that is not so. We're still waiting for the next shoe to drop, to bring these arguments out again and see if we can get some precedent.

Also, the government didn't disclose to Apple how it got access. Apple was seeking that information, and had suggested that if the case had continued, they would use it as a vehicle to try to obtain that information. This also brought up, in some people's minds, the Vulnerabilities Equities Process. This is a process that came to light through a Freedom of Information Act open-government request, and it is the process the executive branch is supposed to go through when deciding what to do with a vulnerability. If the government has a vulnerability, it weighs the equities of disclosure to the vendor versus exploiting the vulnerability: when should they disclose, and how do they balance the security harm from the vulnerability's continued availability against the advantages of being able to exploit it? This would seem like something that fits perfectly within the Vulnerabilities Equities Process, and they should have used it here. But as it turns out, the FBI didn't buy the vulnerability itself; they bought a black-box exploit, so in their view they didn't have anything to disclose and didn't need to go through the process.

So what did Apple do to respond? These are some of the goals Apple put forward; this comes from a presentation they gave this summer at the Black Hat security conference. They are trying to continue using the Secure Enclave, tightening things up to limit the number of passcode
attempts, to take brute-forcing out of the picture, and to make offline attacks difficult. With the Secure Enclave, there is a hardware true random number generator; Apple tries to make it so that even Apple doesn't know the number it produces, and that secret then gets entangled with the device's unique ID and the passcode. This means Apple has very little information to give that would be necessary to crack the phone. Apple also put forward a bug bounty program, with $200,000 at the top, which is certainly a lot less than what the market apparently pays, but is still a very important step forward. Apple had been one of the last major companies without a bug bounty program, so I'm very glad they finally came around to doing that.

So what is the government going to do now? They don't want to have to rely on buying hacks. That's not to say they are opposed to it; in fact, there are many instances in which governments have either created or purchased exploits. Some governments around the world have bought from places like Hacking Team, and the NSO Group sold an exploit to the UAE that was used to get access to the phone of an opposition activist. So these things continue. There is also Rule 41, a new rule of US criminal procedure that makes it easier for judges to issue orders allowing the government to use NITs, network investigative techniques, which is another euphemism for what is basically malware placed on people's endpoints. So governments are certainly willing to do that, but they would prefer to have the companies just provide easy access. Not a back door, of course, but something like a "secure golden key."

So what could go wrong? Well, these are the brass keys for TSA locks; if you have access to them, you should be able to get into the lock. In fact, if you have this photograph and a 3D printer, you probably could make these keys. And this, in a nutshell, is the problem: if you give access through a special method, you've got to make sure that special method doesn't get into the wrong hands.

Turning to the politics of it: from the beginning, slightly over a year ago, there was an effort to push President Obama to take a stance in favor of strong crypto. There was a petition up at SaveCrypto.org with over one hundred thousand signatures, and his initial response was, well, for now we will not call for legislation. Not a very strong response, but at least not the opposite. Then later, in 2016, Obama said we shouldn't have an "absolutist view" on this. What he meant is that people are saying you either have strong security, or you have a back door and weakened security; you can't have both a back door and strong security; and that's an absolutist view. I think this is very symptomatic of politicians looking at everything as a matter of finding compromises. But if the technology doesn't permit compromises, this is a mathematical question and a technology question, not a political question where we can find a middle ground. That kind of thinking is actually dangerous. If it is absolutist to say that we need strong security, well, you might call me an absolutist.

Then on November 8th, we got a new president coming online. So how is that going to be? Trump is not yet in office, but we're able to look at a few things to get an idea. First of all, on the Apple iPhone controversy, Trump made a few statements. In the beginning he was saying, who do they think they are? We have to open up this phone. As the debate continued, he noted that he used both an iPhone and a Samsung, and said we should boycott Apple if they don't give over the information. And yet it's a question of how serious that really was; he still tweets from an iPhone. This is a picture of him doing a Reddit AMA after his boycott call, and that looks pretty much like a Mac there. So we can't really tell how serious this was. We also have some additional clues from the nominations Trump is putting forward for key positions in his new government. The proposed new attorney general, Jeff Sessions, has long been in favor of law enforcement access to phones; he felt that Tim Cook, the CEO of Apple, didn't really understand how serious this was. And the proposed new CIA chief, Mike Pompeo, wants to remove barriers to surveillance and has also been pretty suspicious of anybody who uses strong encryption: it could be a red flag just if you use it, which is a pretty dangerous line of thought.

There were also some efforts in the legislative world: the Burr-Feinstein bill, from Senators Burr and Feinstein, actually called the Compliance with Court Orders Act. They were trying to key off a rhetorical point from the Apple iPhone controversy: we're just asking people to comply with court orders; that can't be that unreasonable. But it was actually that unreasonable. It would require providers to decrypt things on demand, on pain of severe penalties, and it applied to communications and to storage. It applied to app stores, so that if you had an app store, like Apple's App Store or Google Play, all the apps for sale on that store would have to have weak or backdoored crypto, and you would have to enforce that. It covered more than just end-to-end and full-disk encryption; pretty much as drafted, it would have outlawed computers as we know them. It was a fairly terrible bill, but fortunately it didn't get a whole lot of traction.

The rest of Congress decided to do a little looking into the issue, having committees examine it and issue reports. The House Homeland Security Committee made a big step forward by recognizing that this is more of a security-versus-security debate, and they rejected legislative fixes like the Burr-Feinstein bill we just talked about. The House Judiciary Committee recognized that there would be severe problems with weakening encryption. They still called for cooperation between technologists and law enforcement agencies, and, a little bit dangerously, they suggested that one solution would be compelled decryption by the user: rather than going to the companies and asking for a back door, have laws that insist the users decrypt their material under threat of criminal penalties. That is dangerous for other reasons, but at least in terms of keeping the technology available without back doors, the committee on the whole was headed in the right direction. And that's where it stands at the moment.

I want to turn now to the United Kingdom and the Investigatory Powers Act. For a long time it was the Investigatory Powers Bill; the Act has now passed and been signed off. It is often called the Snoopers' Charter, because it is a broad expansion of surveillance powers. It allows access to communications data by all sorts of agencies, the police, GCHQ, the Ministry of Defence; they would have access to Internet connection records. Internet service providers would have to store metadata about communications: the websites you visit, what time you visit them, all sorts of information, stored for up to 12 months. But then the European Court of Justice said, nope, not going to do that. That's a very important ruling. The European Court of Justice felt this went too far, that general and indiscriminate retention of communications data is illegal, and allowed only targeted interception of traffic that is justified when necessary to combat serious crime. So this was a very important pushback on the Snoopers' Charter. Now, for purposes of our talk today, it does not affect the portions requiring back doors; we'll go over those in just a second. And another important caveat: soon the UK will be leaving the European Union, and maybe pulling out of the jurisdiction of the European Court of Justice, so this ruling may not be as powerful as it might have been. But it also sets the stage for additional challenges, hopefully continuing to push back on both the data retention features and the encryption features in the future.

So what does the Act say about encryption? Well, it says some pretty complicated things that don't really mention encryption by name. This is a quote from the code of practice that accompanied the legislation, and it talks about things like technical capability notices, saying you might have to provide a technical capability. It is interesting that it requires you to notify the government of new products and services in advance of their launch: apparently you need to go get approval from the UK government before you launch anything that might have new encryption technology. But what is this "technical capability" notice? The statute defines it a bit. It is issued by the Secretary of State, better known as the Home Secretary, after the Home Secretary has looked at it thoroughly and considered whether it is practicable, whether it is proportionate, taking into account the technical feasibility and likely cost of complying. These all sound like pretty good things for someone to consider, but I'm not sure the Home Secretary is really in the best position to weigh all of those factors, and they may end up leaning toward allowing back doors, allowing these technical orders to go out. The notices also come with an automatic gag order, so that somebody who receives one is not supposed to talk about it with anybody, which makes it hard to organize and fight back. And a notice can be given to persons outside the United Kingdom; in their view, everybody in the world could get one of these technical capability notices and be required to, well, required to do what? They might have obligations relating to the, oh, it's a back door: they want the removal of electronic protection that the operator has applied. So they've disguised it with a lot of wording, but in the end it's a pretty dangerous provision. It may be challenging both for people doing business in the UK and for those who aren't even doing business in the UK but might receive one of these notices under that authority and have to wonder: am I under their jurisdiction? Do I have any business there? So it's a pretty dangerous thing.

Elsewhere in the EU, things have been moving on a somewhat slower track. The EU justice ministers have been discussing the issue; the Justice and Home Affairs Council discussed it thoroughly, looked at different views, and spoke of the importance of a balance between individual rights and privacy and law enforcement. So it's still under discussion, still under consideration, but hasn't moved forward. And we got a really good report out of ENISA, the EU cybersecurity agency, the Agency for Network and Information Security. They issued a report earlier this month that rejected back doors, seeing the problems as outweighing the benefits. And they recognized that it is very difficult to restrict innovation through legislation: even the best possible platonic ideal of legislation is only good for the technology as envisioned on the day it was passed, and will become more and more outdated over time. So it's a difficult path forward.

Elsewhere around the world,
in April, compliance began with Australia's Defence and Strategic Goods List. This has a provision that prohibits the intangible supply of encryption technologies, and it has a lot of people very worried that the expansive definition will cover not just actual military technologies but might encompass such things as giving a talk at a computer conference.

In India, on the plus side, there had been a terrible encryption provision that would have required companies to retain plain text for a period of time; they dropped that requirement and plan. But they have proposed something which is, well, potentially dangerous: asking the various phone manufacturers to add India's internal biometric authentication system to their phones. This is a system widely used in India for authenticating people receiving government services, and they want to integrate it into the phones. This could open up security holes, having government code on the phone. Apparently, Google, Samsung and Microsoft did meet with India, but Apple declined to go.

In Egypt, they started trying to block access to the Signal messaging app, and this is going to continue to be somewhat interesting. After Egypt blocked that access, Signal released an update that uses something called domain fronting. This disguises Signal traffic to look like it's going to google.com, and that makes it much more difficult to block. I mean, you can still block it, but you'd also have to block all of Google. This really ups the stakes for censorship: they can't as easily target the one system, but would have to remove something used daily by millions of people, and that makes it harder for a government to block technology like this.

Also, some good news from the Netherlands: they came out very strongly in favor of encryption, counterbalancing some of the efforts to push back on encryption. And the United Nations issued a report this year recognizing that encryption and anonymity are necessary for freedom of expression, and that encryption saves lives; without encryption, lives may be endangered.

This year has also been tremendous for the rollout of encryption technologies. WhatsApp has added end-to-end encryption by default to all of its chats and calls; that is over a billion monthly active users who are getting encryption without having to do much of anything. Facebook has added end-to-end encryption to their Messenger product, but not by default, so that is a half step; it needs to go further, because encryption by default is really the gold standard. Likewise, Google's Allo includes encryption in its incognito mode, but again as something you have to purposefully select, so again a half step. And Signal's downloads have gone through the roof; apparently they reported a 400 percent increase in daily downloads since November 8th.

We've also seen tremendous progress in encryption on the Web. The Let's Encrypt project is providing certificates to over twenty-one million websites. It is, by some measures, the largest certificate authority in the world, and it is free, so this is a tremendous success. More than half of the page loads in Firefox and Chrome now use HTTPS; you can see a chart here, I think for Firefox, crossing the 50 percent mark over the course of the year on various operating systems. Android is the laggard, so hopefully Android can pick up steam, but nevertheless it is a good, positive trend. And by a different measure, looking at time, two thirds of people's time is spent on secure websites.

So what do we see looking forward in 2017? Well, we'll probably see more technical assistance laws. One of the things that policymakers have learned from the first crypto wars is that it's dangerous to actually propose a
specific solution. When they came out with the Clipper chip in the 90s, it was quickly attacked, revealed to be vulnerable and then disregarded as a good idea. So they've moved to a different model. Rather than provide a target which could be attacked, it's to say the technology companies need to nerd harder and figure out how to give us the assistance so that we can get access. And they create laws, similar to what the Investigatory Powers Bill has tried to do, requiring technical assistance without any specifics of how that will be accomplished. It's just up to the companies to figure it out. There also will be a lot more public pressure, where there have been pushes for compromise, saying: you really don't want a bill like that, a bill that would require you to weaken encryption, so you should just go ahead and weaken it ahead of time to forestall the bills, which would be worse. Also, putting on pressure whenever there is a big controversy, trying to highlight that encryption may have made it difficult for law enforcement. These pressures will continue to exist. And then some countries where they are very upset about how people have been using the technologies will continue to have blockages, like Brazil, where they have blocked WhatsApp three times over the course of the year and where they've arrested executives, saying you have to give us the information, even though they know that it's technically impossible to give that information. These pressures will continue to exist, and we'll see more attacks on the endpoint. If law enforcement is going to have to work in a world where there is strong encryption, then the way around that is to get to the endpoint. So we'll see more and more use of malware, and it becomes more and more important that people look not just at making sure they're using encrypted tools, but at following good security advice to avoid being attacked and phished, as best they can. Another important prediction: I think that free and open source software is here to stay, and a lot of these laws and policy measures are less effective against open source projects. There often aren't companies to put pressure upon, so if you attempt to legislate a requirement for a back door, it's going to be ineffective. Even if somebody decided they had to put the back door into the open source code, when someone compiles it, they could always comment out that section. So it's pretty ineffective to go after them. The real challenge for some of these software projects is in deployment: getting them out into the hands of billions of people, making them usable, making them part of people's daily lives. And then an important thing for how we should move forward is that policymakers can be reached. We've seen that when some policymakers have taken the time to get experts' views, conduct hearings and investigate the issue, then we start to see framings more like security versus security: that weakening encryption can harm security, and that there are important interests at play here. And this is a positive step forward. It's fighting against a strong lobby; law enforcement agencies are a very powerful lobby, and legislators take them very seriously. But technologists' views can make a difference. So what can you do? Well, if you're a coder, include encryption by default in any products that you have, wherever it needs to be, and also work on usability. Making it accessible for billions is a key point. For websites: encrypt all the things. Start using Certbot, a program that works with Let's Encrypt and makes it easy to set up a cert on a website. Use Let's Encrypt. There's really no excuse anymore to have a website that doesn't have HTTPS. And then for individuals, well, you can use encryption in your daily lives. We saw before that the nominated CIA director was suggesting that the use of strong encryption might be a red flag. Well, if everybody is using encryption all the time, it becomes less of a red flag. Try to incorporate encryption as much as possible to make it less of a red flag that someone is using that technology. And then keep active: pay attention to what's going on, help defend encryption by talking to policymakers, signing petitions, paying attention and being a participant. Thank you very much. Feel free to queue at the microphones over there if there are any questions. We've already got something on microphone one. Yeah, it doesn't work... there, microphone one. Yeah, thanks for your great talk. That was really, really interesting to see how the EFF and your colleagues are battling bad ideas that stifle encryption. I have one question that addresses the argument that if encryption is illegal, only the bad guys use encryption. Do you think this argument, which basically means it makes no sense to pass laws against encryption because those who want to break the law won't respect that law either, has gotten enough traction, for example among lawmakers in Washington? Have you heard of any conclusive counterargument to that line? Well, what I've seen is that you raise a good point. First of all, there's the tautology: if encryption is illegal, then indeed anyone who uses it would be a criminal, because they'd be a criminal by virtue of using encryption. But I think that one of the things that policymakers are really trying to do is get at the most widely deployed encryption. So they may recognize that there will be open source projects, that people will be able to download encryption made outside of their jurisdiction that they won't be able to stop, and that the bad guys will be able to find and use those technologies. But they still want to make it a lot easier to get access to things which are widely deployed, where there are billions of users. And I think one of the things you can infer
from that is that it's not just about targeted decryption going after a known bad guy; they want to be able to have ready access to mass communications. And it ties in with some of the attempts to predict who will be bad by looking at information before anything happens, which raises its own civil liberties concerns. Microphone four. Hey, the assumption that you had is that we are living in a democracy, so this struggle between you guys and the government is going to be a healthy one. But the transition from a democracy to something authoritarian, like in Turkey, seems to be really fast. Do you have any plan B for the case that something like that might happen? I mean, apart from Second Amendment rights and that kind of stuff. Do you have any plans for the case that a group of reactionary politicians slashes the rights of a democratic society? Yeah, I mean, this is one of the reasons why you want to have encryption widely available while you can: so that if things later move into an authoritarian mode, those tools are already widely deployed. For those who are living in authoritarian regimes, encryption very directly can help save their lives by protecting their information from being tracked and observed by the authorities, who might want to put opposition figures in jail for the mere act of opposing the government. The challenges there are dealing with things like this: using encryption might be seen as a red flag, so if you get stopped by the police, they're going to want to get onto your phone, and they may use strong measures to try to get your passcode. Even if the device has the best encryption in the world, if they're going to beat you with a rubber hose until you give up the password, that isn't going to help. So these are very challenging things. But I think the best thing that you can do ahead of time is make it so that everybody is using encryption as much as possible, so that it becomes less suspicious that someone is using it, and have it be tied into widely used products that they would feel bad about blocking. That's why it's nice that there's encryption in things like Facebook Messenger and WhatsApp. And after a country has already gone authoritarian, it is those outside that country who are providing technologies in, and they should try to make sure that those technologies are effective and secure. We've got a question from the Internet, which is: what is your view on so-called warrant-proof devices, I'm not actually sure what's meant by this, search-warrant-proof devices, and whether they will remain legal in the future? Well, this was what Comey was referring to: he didn't want to have a world where there was something that was warrant proof. And I'm in favor of having full disk encryption on phones, where the government, you know, can try under its own power to get in, as has happened, but shouldn't be able to compel the provider to change its code. Calling something warrant proof is a rhetorical device that the government is using to try to set it up as a discussion about the rule of law, or about whether a warrant should be effective, but it misses the larger policy issues. So I guess in some sense, when they talk about something as being warrant proof, that may be a side effect of what happens when you have strong encryption, but it's not really hitting the heart of the policy debate. Hi, on the subject of rhetoric: it seems like in the past year or so, or more like two years, we've heard a lot about strong encryption versus weak encryption. And it seems like it's going to be more and more a tool used by those in power to tell us: well, the bad guys, they're the only people who need strong encryption; you, the common folk, the good people, you only need the normal encryption. Which one would you go for, the bad one? So I think you've touched on that subject, but maybe you could tell us a bit more about it. Yes, I think, I mean, strong encryption is really what we need. And we've actually seen the terrible effects of this, when there was a misguided attempt to have weak and strong encryption in the 90s, when there was export-grade encryption and domestic-grade encryption out of the United States. So Netscape Navigator had a weakened international version with only 56-bit keys. That turned out to be an unwise policy move. They said, well, this will be good enough for the average person. And I think there are a couple of things to think about there. One, as we've seen, it kind of failed. The second is that if you're trying to protect yourself now, you have to protect yourself from a wide variety of threats, and you're going to need strong encryption against those threats. It may be an authoritarian regime, it might be a computer criminal, but the value of strong encryption is there against all these threats. And deliberately weakened encryption, every time it has come out, has turned out to be far more of a disaster than the government predicted. Thanks. Microphone one. Hello. So we've seen that governments try again and again to pass legislation that weakens encryption. What would need to happen so that they can't keep trying to pass such legislation? And are we moving in that direction? Well, I think there's not much that's going to stop them from trying, because they're facing pressure from law enforcement. So I guess we'd have to remove that pressure, but that seems very difficult. I think the key is to try to convince policymakers, if they pass anything at all, to pass something that encourages the development and use of encryption. You know, I'm very heartened by the Dutch government's response, where they
were strongly in favor of encryption, and by the EU cybersecurity side, the network and information security group, also coming out strongly in favor of encryption: getting the policymakers, ahead of time, to look at this as something that is beneficial to have, so that there's less incentive to go and push for further weakening of encryption. I'm sorry to burst your bubble, but the Dutch passed a law last week that allows them to hack into any hackable device and that allows the government to buy backdoor software from companies. All right. Well, I try to keep this as up to date as possible, but thank you for that information. And, well, I think that is in line with going after the endpoints: even if you do have strong encryption along the way, attacks on the endpoint are a common government solution to try to get around that difficulty. We've got one more question from the Internet, which goes in a similar vein: what could we do to stop politicians from, not just wrong actions, but symbolic politics? Like in the vein of the Berlin attacks: again, they were asking for more video surveillance, which clearly does not prevent attacks like this. What can we do on a political level to stop our politicians from trying to enact laws that are basically orthogonal to the problem? Yeah, this is a very common thing. Whenever there is an incident around the world, especially something like a terrorist attack, legislators feel a very strong desire to do something about it, and that something may not be directly related to the problem, but they will be able to go back to their constituents, the people who vote for them, and say: well, I did something. Part of the answer is educating the voters so they are less fooled by this behavior, and getting people active, calling their representatives and telling them that they want strong encryption and that they don't want these kinds of measures. The other thing that can sometimes be effective is that legislators don't like to look stupid. So if they are doing something which is a technologically bad response to a given problem, and you can show how it is ineffective, that can sometimes help them understand that it was a bad move. This argument is often portrayed as between citizens and governments, and I'd like to propose another argument and ask what you think of it, which is that foreign nation states are actually a bigger threat than crime, and therefore states need encryption more than anyone else, and therefore they should get on the side of being pro-encryption, because they need it too. Absolutely. I think that sometimes that is an argument that can work with legislatures, where they're not so worried about the citizens directly, but they're interested in the balance of power between nation states. And this has come up in some of the back door discussions: if you provide a back door to one government, let's say one you think is a great democratic government that will only use this power wisely, what do you do when other governments ask for it? Do you give them the same access? And some legislatures will understand that there actually is a very important national security component to having widely available strong encryption. An example that comes to mind: there were a lot of emails released from the Democratic National Committee in the United States, and maybe now, when they think back on it, they think: maybe we should have encrypted that information, maybe we should have put up stronger resistance. It's very difficult to fight against a nation-state attacker, but at least you can make it a difficult job for them. Hi, thanks for your talk. I was quite curious to hear that there is actually a process in the United States that governs the use of vulnerabilities. Are there similar measures throughout the rest of the world? I think it would be amazing if we could enact a sea change there. So I should be clear: the Vulnerabilities Equities Process is not a legislative act. It is something that came from the executive; it was not commanded by the legislature, but rather done on the executive branch's own authority, in part to mollify some of the critics who have said that they should be reporting more vulnerabilities. So it is something that one could put into a legislative process, to require governments to go through that balancing and make sure they do it. But I'm not aware of any legislation that has yet proposed that. Thank you. All right, thank you, everybody. Pleasure to be here.
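
[Transcriber's note: a rough sketch of the domain fronting technique mentioned earlier in the talk, for readers unfamiliar with it. The front domain here matches the talk's Google example; `fronted.example.com` and the request path are purely hypothetical placeholders, not Signal's actual infrastructure. The idea: the TLS connection (and its SNI field) names a popular front domain, which is all a network censor can observe, while the HTTP Host header travelling inside the encrypted tunnel names the real destination.]

```python
# Domain fronting sketch. A censor on the network sees only the TLS
# handshake to the front domain; the Host header below is encrypted,
# so blocking the hidden service requires blocking the front wholesale.
# "fronted.example.com" is a hypothetical back-end host for illustration.

front_domain = "www.google.com"       # visible to the censor (SNI, IP address)
real_service = "fronted.example.com"  # hidden inside the TLS tunnel

# The HTTP request that would travel inside the encrypted connection:
inner_request = (
    "GET / HTTP/1.1\r\n"
    f"Host: {real_service}\r\n"
    "Connection: close\r\n"
    "\r\n"
)

print(front_domain)                    # all the censor can target
print(inner_request.splitlines()[1])   # Host: fronted.example.com
```

This is why, as the talk notes, Egypt's only remaining option after Signal's update was to block all of Google, a much higher cost for the censor.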