Caveat 3.4.20
Ep 18 | 3.4.20

Get that thing off my car.

Transcript

Dave Bittner: Hello. And welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin, from the University of Maryland's Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hi, Dave. 

Dave Bittner: On this week's show, Ben explains why Apple may pay half a billion dollars to settle a class action suit. I've got an update on a GPS tracking device story. And later in the show - my interview with Riana Pfefferkorn. She is associate director of surveillance and cybersecurity at the Center for Internet and Society at Stanford Law. We'll be discussing her recent article "The EARN IT Act: How to Ban End-to-End Encryption Without Actually Banning It." While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. We'll be right back after a word from our sponsors. 

Dave Bittner: And now a few words from our sponsors at KnowBe4. You know compliance isn't the same thing as security, right? How many times have we all heard that? It's true, too. Having checked the legal and regulatory boxes won't necessarily keep the bad actors out. They are out-of-the-checkbox kinds of thinkers. But what about compliance itself? You've heard of legal exposure and regulatory risk. And trust us, friend - they're not pretty. So again, what about compliance? We'll hear more on this from KnowBe4 later in the show. It's not a trick question, either. 

Dave Bittner: And we are back. Ben, why don't you start things off for us this week? 

Ben Yelin: Sure. So my article comes from Yahoo Finance. It is about a case that was actually settled in a district court in California. Apple Inc. has agreed to pay $500 million to settle litigation accusing it of quietly slowing down older iPhones as it launched new models. This is part of a practice that's referred to as throttling, where, allegedly, the company will slow down performance to induce the consumer to buy a more recent version of the product. 

Dave Bittner: Yes. 

Ben Yelin: I'm a very vulnerable user of these products. If I notice that my phone is not working as well as it usually does, I'm the kind of sucker who'd be like, you know what? Maybe it's time to buy a new one. 

Dave Bittner: (Laughter). 

Ben Yelin: And apparently, I'm not... 

Dave Bittner: Well... 

Ben Yelin: ...The only person with that predicament. 

Dave Bittner: Yeah. So Apple would say that what they were doing was protecting the users, because as these devices age and the batteries don't perform as well as they used to, throttling the speed keeps users from overloading the battery - you know, they get more consistent performance through throttling. 
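
To make Apple's stated rationale concrete, here is a minimal, purely illustrative Python sketch of what battery-based throttling could look like. It is not Apple's actual power-management code, and the battery_health input is a hypothetical stand-in for whatever the operating system really measures.

```python
# Illustrative sketch only - not Apple's actual power-management logic.
# battery_health is a hypothetical metric: 1.0 = new battery, lower = degraded.

def throttled_clock_speed(max_clock_ghz: float, battery_health: float) -> float:
    """Cap peak CPU clock at what a degraded battery can sustain.

    The trade-off described above: an aging battery can't deliver peak
    current, so limiting performance avoids sudden shutdowns and gives
    slower but more consistent behavior.
    """
    factor = max(0.5, min(1.0, battery_health))  # never below half speed in this toy model
    return max_clock_ghz * factor

# Example: a phone with an 80%-health battery runs at 80% of its peak clock.
print(throttled_clock_speed(2.34, 0.80))  # about 1.87 GHz
```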

Ben Yelin: Right. And that's what's - that's actually what Apple is still arguing even though it's settled this case. The case has been settled for $500 million. It's part of a class action lawsuit. This will enable consumers of certain editions of the iPhone - the ones they named were iPhone 6, 6 Plus, 6s, 6s Plus, 7, 7 Plus, et cetera, et cetera... 

Dave Bittner: (Laughter). 

Ben Yelin: ...If you had one of those iPhones, you will be entitled, if you join this class action lawsuit, to about $25 in damages. 

Dave Bittner: Ooh. 

Ben Yelin: Now, the lawyers who are part of the case... 

Dave Bittner: (Laughter). 

Ben Yelin: They're going to get a cool $300-some odd million. 

Dave Bittner: Oh. 

Ben Yelin: But, you know, that's just... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Generally how it goes. 

Dave Bittner: Yeah. 

Ben Yelin: Or at least - maybe not that much, but at least 30% of what the total settlement ends up as. 

Dave Bittner: Wow. 

Ben Yelin: So Apple, despite settling here, has not actually admitted any liability. What they're doing is trying to avoid future damage to their public reputation and any conflict with their potential consumers. So they maintain that any issues with the iPhone related to this alleged throttling stem from things like temperature changes, high usage, other issues... 

Dave Bittner: Right. 

Ben Yelin: ...That come about by somebody constantly using their iPhone. 

Dave Bittner: Yeah. 

Ben Yelin: ...Which we all do. 

Dave Bittner: (Laughter). 

Ben Yelin: So the upshot of this is if you are somebody who has had one of these devices, Apple is not apologizing to you. They're not admitting that they have committed any legal wrong. They are simply trying to make this all go away. And the best way to do that is to settle the case. And it will be settled as long as this district court judge in California accepts this agreement between the attorneys for the class and Apple, and there's no reason to think that the judge will not accept this agreement. 

Dave Bittner: Now, I remember - this goes back several years because I remember I had a 6s Plus, I believe, and I took it in to get a battery replacement. Apple had a program running where if you had one of these phones that was allegedly part of this, you could go get a replacement battery for a very low cost. So it seems like that maybe was their first round of trying to head this off at the pass. I guess it didn't work. 

Ben Yelin: Right. So there was a big outcry because you were not the only one facing this predicament. And once Apple realized what was going on after this first outcry, in order to stave off legal liability and the potential damage to reputation, they lowered the price of replacement batteries. So this article actually mentions the price was lowered from $79 to $29. 

Dave Bittner: Yep. 

Ben Yelin: And that's obviously an enormous percentage and, I think, largely due to this initial outcry. The issue sort of went away for a few years, but when you have enterprising lawyers and enough product users who are willing to join this class action lawsuit, then, you know, even if the company doesn't think it's done anything wrong, you do have this avenue to seek some sort of legal recourse, which is why companies like Apple loathe class action lawsuits... 

Dave Bittner: (Laughter). 

Ben Yelin: ...And want to make it much more difficult for courts to be able to certify a class and for these cases to proceed. 

Dave Bittner: Well, help me understand here - help us all understand - with something like this, who - what's the upshot of this? Who really benefits from this? Do we expect that something like this could actually change Apple's behavior going forward? 

Ben Yelin: So to be fair, I think it could because just by virtue of them being in this lawsuit, even absent admitting liability, they still have to fork over $500 million. So they don't want this to be a problem that a court is even willing to consider. So in that sense, it might change its behavior. And that's one of the reasons we have tort law in general. I mean, going back to our English common law system, we want to give some sort of disincentive to bad behavior, and that's why we allow people to sue manufacturers of products. 

Dave Bittner: Right. 

Ben Yelin: So yes, I think it could have that sort of indirect impact of making them - making it so that these issues don't exist in the future. 

Dave Bittner: Right. 

Ben Yelin: That said, the immediate impact for most users is going to be extremely minimal. You would have had... 

Dave Bittner: (Laughter) I don't think $25 buys you a lightning cable at the Apple Store. 

Ben Yelin: Yeah. I - actually, this very morning, I saw somebody complaining at my local convenience store that a charger costs $22. 

Dave Bittner: Yeah. 

Ben Yelin: So yeah, I mean, it's not going to be much for the user besides, you know, maybe a slight acknowledgment of the monetary injury a user faced from being forced to buy a new product before they should have had to. 

Dave Bittner: Now, the lawyers who are chasing after this, the ones who are really going to make out and make some money here, are there law firms that specialize in chasing these sorts of things down? Is this a play by some law firms? 

Ben Yelin: It certainly is. You know, lawyers pursue cases like this and shop for vulnerable plaintiffs and compelling plaintiffs because these cases are taken on contingency. So no matter what the settlement is, you know, the lawyers are going to get a large cut of the loot. Now, that actually does give quite a benefit to low-income consumers or users of these products. That's true for all types of cases. When we're talking about criminal law, indigent defendants - defendants who don't have a lot of money - are really in a terrible position because often they're assigned a public defender who probably has hundreds of cases. You know, they can't really put in the time and energy to help that criminal defendant. But when we're talking about cases like this, you know, the lawyers are only going to collect if their client wins, meaning they'll take any client, even a client who has no assets, as long as they believe that that client has a good case. So if the lawyer does the research and realizes, you know what, I think we can force Apple into a settlement, they'd be happy to take on a plaintiff even if that plaintiff wasn't rich himself or herself, which I think actually does us all a big benefit. Yes, the lawyers, you know, in these cases do make out like bandits. But, you know, I don't want to hurt the profession's reputation too much because I do think there is some derived benefit for the rest of us. 

Dave Bittner: And I guess it - we don't see all of the cases that - where they make nothing. 

Ben Yelin: Yeah. I mean, lawyers lose cases. You know, you take up hundreds of contingency cases. If you're a good lawyer, you're going to win more than you lose. And so, you know, you end up making up the costs for the ones that you don't win... 

Dave Bittner: Right. 

Ben Yelin: ...By taking cases on contingency. That's usually how it works, as it applies to civil cases. 

Dave Bittner: All right. Well, I guess sign up to get your $25, right? 

Ben Yelin: Yeah, exactly. It's better than nothing. 

Dave Bittner: (Laughter). 

Ben Yelin: And, you know, I think Apple probably had the right instinct of, instead of tying this up in litigation in, you know, 50 state courts and federal courts, just give them some money. 

Dave Bittner: Right. 

Ben Yelin: I'm sure they can recoup that $500 million before the time this... 

Dave Bittner: Going to say - yeah. Right. 

Ben Yelin: ...Podcast is done recording. 

Dave Bittner: Five hundred million dollars to Apple is probably just in the couch cushions in the executive boardroom, right? 

Ben Yelin: Exactly. Yeah. 

Dave Bittner: (Laughter) All right. Interesting story. This week, I have a follow-up of a story that you and I spoke about not too long ago. If you recall, we were talking about a case where someone who had been accused of a crime, accused of selling meth, had found a GPS tracking device on his car. And it was unmarked. He removed the device from his car. And the law enforcement people used the removal of that device from his car to accuse him of theft of the GPS device, and they used that to get a warrant to search his house and, I believe, also a barn on his property. And when they did that search, they found the GPS device, and they found a bunch of meth. And that's what they went after him for. 

Ben Yelin: Drugs and drug paraphernalia, yeah. 

Dave Bittner: Yeah. Yeah. So this case made its way to the Indiana Supreme Court. And what happened next? 

Ben Yelin: So the Supreme Court of the state of Indiana came down with a decision in favor of the defendant in this case, Derek Heuring. They said that the search warrant in this case that allowed the police to recover all these contraband items and, ultimately, the GPS was illegal. And because the original search warrant - which was based on probable cause that the person had illegally removed the GPS from his vehicle - was based on faulty information, the fruit of the poisonous tree doctrine applies, meaning everything that the police discover after that point is fruit of the poisonous tree; it is inadmissible in court because the original search was not valid, if that makes sense. So the court basically has two rationales. The first is that there was not enough information to support a probable cause determination - namely, the state wasn't able to prove with any sort of particularity or really any evidence that the person knew what had been affixed to his car and took that GPS off in order to evade detection. They didn't have any evidence to that degree. And that makes instinctive sense to us... 

Dave Bittner: Yeah. 

Ben Yelin: ...Because I think we talked about, when we first did a segment on this story, if you or I found a little black box on our car, I'm taking that thing off. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: If I don't know what it is - and I'm not thinking, like, you know, I don't want them to catch my meth dealing. 

Dave Bittner: (Laughter). 

Ben Yelin: I'm just sort of like, I don't want this strange object on my car. 

Dave Bittner: Yeah. 

Ben Yelin: I don't want anybody tracking me, if it is a tracking device. And... 

Dave Bittner: And this device didn't say, you know, property of the local sheriff's department or something like that; it was unlabeled. 

Ben Yelin: Exactly. So, you know, that would be another reason that somebody like the criminal defendant in this case would have no reason not to remove that device. I mean, it'd be crazy not to if you found something strange on your car. And that's really the basis of their decision. There's this thing called the good faith exception to the exclusionary rule. So let's say, like in this case, a warrant is defective. Sometimes a court will allow the evidence to be admissible in court if it finds that the officers overall acted in good faith - meaning, you know, maybe there was a tiny part of the warrant that was based on false information, but they mostly followed protocol and procedures. Maybe they were relying on a law that had since been overturned, you know, something like that - the good faith exception would apply. 

Dave Bittner: OK. 

Ben Yelin: What the court says here is that the good faith exception does not apply. And the reason for this is that it is reckless, and this is - I'm quoting the chief justice who wrote this decision in the Indiana case - they found it "reckless for an officer to search a suspect's home and his father's barn based on nothing more than a hunch that a crime had been committed. We are confident that applying the exclusionary rule here will deter similar conduct in the future." So again, they had no actual tangible evidence that this person was trying to evade detection other than a hunch that this person removed the device because they were trying to evade law enforcement. So... 

Dave Bittner: Right. Right. But as I think we said before, the device could have fallen off the car. It could have hit a bump and it fell off the car and went into a sewer. 

Ben Yelin: And for all law enforcement knew at the time that they filed this affidavit, that's exactly what could have happened. Now, as we frequently see in many of these cases, the end result here is that somebody who's clearly guilty of a lot of drug crimes... 

Dave Bittner: (Laughter) Right. 

Ben Yelin: ...Is going to go free. And that's sort of the sacrifice that we make when we have things like the Fourth Amendment and the equivalent to the Fourth Amendment in the Indiana State Constitution. Having this exclusionary rule means that a certain number of people who obviously committed crimes, they're going to get really lucky, the police are going to have made some mistake, and all of that evidence is going to get thrown out. Even if that person - you know, to me and you, of course this person should probably be imprisoned for possession of drug paraphernalia because they had drug paraphernalia; they had meth. 

Dave Bittner: Right. 

Ben Yelin: But, you know, we want to keep a system in which there aren't intrusive searches and seizures, general warrants based on nothing but the hunch of a law enforcement official. And I think that's what the Indiana court is trying to protect against in this case. 

Dave Bittner: And is - in terms of the process of this, is this it? One - this ruling from the Supreme Court, is that the end of it? 

Ben Yelin: It really is. So you know, in criminal cases, double jeopardy applies. So this person had been convicted. Now, the state can't appeal an acquittal, but a defendant can appeal a conviction. But once any court rules in favor of the defendant that the evidence must be excluded, the defendant is usually going to be in a very favorable position, because if this were to go back down to the lower court, which it very well might, all of that evidence gleaned from finding that GPS device, including all the meth and meth paraphernalia found in the house and in the barn, is going to be inadmissible at subsequent trials. And without that evidence, they probably have no chance of securing a conviction, and it probably would not even be worth it to try. 

Dave Bittner: Mmm hmm. All right. So this defendant has an opportunity to straighten up and fly right (laughter). 

Ben Yelin: Yeah. He joins many Fourth Amendment defendants who, because of the high principles we hold in this country, are certainly getting away with what otherwise would have been crimes. But that's the cost of our constitutional system, and I think it's a cost that a lot of us are willing to bear. 

Dave Bittner: All right. So those are our stories for this week. Coming up next - my conversation with Riana Pfefferkorn. She is associate director of surveillance and cybersecurity at the Center for Internet and Society at Stanford Law. She recently wrote an article titled "The EARN IT Act: How to Ban End-to-End Encryption Without Actually Banning It." 

Dave Bittner: But first, a word from our sponsors - and now back to that question we asked earlier about compliance. You know, compliance isn't security, but complying does bring a security all its own. Consider this. We've all heard of GDPR, whether we're in Europe or not. We all know HIPAA, especially if we're involved in health care. Federal contractors know about FedRAMP. And what are they up to in California with the Consumer Privacy Act? You may not be interested in Sacramento, but Sacramento is interested in you. It's a lot to keep track of, no matter how small or how large your organization is. And if you run afoul of the wrong requirement, well, it's not pretty. Regulatory risk can be like being gobbled to pieces by wolves or nibbled to death by ducks. Neither is a good way to go. KnowBe4's KCM platform has a compliance module that addresses, in a nicely automated way, the many requirements every organization has to address. And KCM enables you to do it at half the cost in half the time. So don't throw yourselves to the wolves, and don't be nibbled to death by ducks. Check out KnowBe4's KCM platform. Go to kb4.com/kcm. Check it out. That's kb4.com/kcm. And we thank KnowBe4 for sponsoring our show. 

Dave Bittner: And we are back. Ben, I recently had the pleasure of speaking with Riana Pfefferkorn. She is associate director of surveillance and cybersecurity at the Center for Internet and Society at Stanford Law. She recently wrote an article, "The EARN IT Act: How to Ban End-to-End Encryption Without Actually Banning It." 

Riana Pfefferkorn: So the idea here seems to be that in light of a series of reports coming out of The New York Times and in light of a push more recently by the Department of Justice as well as some child protection organizations to highlight the ways that they believe that big online platforms are not doing enough to combat the scourge of child sexual abuse material on their services, there is now a bill that would try and hit the big providers where it hurts, which is their immunity from civil and state criminal claims over what their users do and say on their services under a law called Section 230 of the Communications Decency Act. And it's not clear to me, necessarily, that this is really the right tool for the job. I view Section 230 as being really inapposite to trying to tackle the problem of - I'll call it CSAM - C-S-A-M, for short - on these services. 

Dave Bittner: And CSAM is child sexual abuse material - the child exploitation type of material. 

Riana Pfefferkorn: Right. And so one of the things that is still a little bit unclear in this new EARN IT Act bill is exactly what the bill would cover. It seems like it's intended to cover both imagery as well as other types of behaviors, such as grooming or enticement, where predators contact children on these services and try to get them to meet up or to send explicit imagery to the person who's contacted them. So it might cover a range of particular behaviors on these services, not just images. 

Dave Bittner: Now, one of the things that you point out in your blog here is that there's no lack of the platforms reporting CSAM instances. They report these millions of times. 

Riana Pfefferkorn: That's right. The New York Times had reported last fall on what seemed like really staggering numbers from the major platforms of millions and millions of images being shared that were intercepted and then reported out to the National Center for Missing and Exploited Children, which has the statutory role as the middleman between providers and law enforcement to receive reports of CSAM on these services. And there's some debate over, what conclusion do we draw from these large numbers? Is it that this is a huge amount of the traffic going on on these services? I mean, you know, there's millions of these reports, but it's a tiny fraction of what something like Facebook has in terms of the overall traffic of what's going on on their service. Or is it a sign that there is some sort of mechanism in place at these providers to look for and report out what they find? 

Riana Pfefferkorn: And we do know that there are providers that do some amount of automated scanning and monitoring. For example, if you are trying to send an attachment to an email on some services, it will be automatically scanned to see if it matches a database of hashes of known CSAM images. If there is a match, then there will be some additional review to determine, OK, is this, in fact, the piece of material that we think it is? And if so, then that could be reported on to NCMEC. 
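
As a rough illustration of the scanning flow Pfefferkorn describes, here is a simplified sketch. Real deployments use perceptual hashing (for example, PhotoDNA), which tolerates resizing and re-encoding, rather than the exact SHA-256 matching shown here; the database entry and the downstream review function are placeholders.

```python
# Simplified sketch of hash-based attachment scanning - exact SHA-256
# matching stands in for the perceptual hashing real systems use.
import hashlib

# Placeholder database of hex digests of known illegal images.
KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def queue_for_review_and_report(data: bytes) -> None:
    """Hypothetical stand-in for human review and the report to NCMEC."""
    print("flagged for human review")

def handle_attachment(data: bytes) -> None:
    digest = hashlib.sha256(data).hexdigest()
    if digest in KNOWN_HASHES:
        # An automated hash hit triggers additional review, as described
        # above - it isn't treated as conclusive on its own.
        queue_for_review_and_report(data)
```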

Dave Bittner: Well, let's dig in here and try to do some reading between the lines, which is very much what you do in your blog post here. I mean, in your estimation, what is actually going on here? What does Graham and Blumenthal's bill really attempt to achieve, and why does it come up short? 

Riana Pfefferkorn: This bill doesn't actually impose any new reporting or preservation duties on online providers. There are already requirements under federal law that say what providers such as Facebook or Google or Dropbox have to do in terms of reporting and preserving child sex abuse material that they discover or learn about on their services. However, this bill doesn't actually add anything to that list of the requirements on providers. Instead, it would create a new commission composed of people from law enforcement, people from tech company platforms, people from child protection organizations and some experts in computer science and software engineering to come up with a set of best practices for how to address the problem of CSAM and potentially also other related types of activities on these services. 

Riana Pfefferkorn: And it's foreseeable under the current climate, where we have a Department of Justice and an attorney general that are very hostile to encryption, that this is kind of a backdoor, underhanded way of coming up with best practices that would tell providers, you would be risking your immunity under Section 230 if you did not adopt best practices that basically require walking away from privacy and security protective measures that those platforms have implemented, such as end-to-end encryption. It's worth noting that the attorney general would be one of the people on this commission and would have, under the original discussion draft of the bill that was released by Bloomberg News, absolute power to unilaterally amend the best practices as recommended by the commission before finalizing them. 

Riana Pfefferkorn: And so I know I'm not the only person who shares in this suspicion that by hinging continued qualification for Section 230 immunity - the EARN IT of the bill's short title - on compliance with these best practices, it's basically a way of trying to influence how providers use encryption and the encryption designs that they offer to their users on their services as a means of taking this stick of Section 230 immunity and turning it into something that would undermine the years and years of work and progress that we have seen from organizations like Facebook and Apple and so forth to enhance their users' privacy and security through encryption. 

Dave Bittner: And what do you suspect that would happen in terms of the use of encryption? Should something like this go into effect, what sort of shifts would we see? 

Riana Pfefferkorn: You know, it's not really clear to me what the actual impact of this bill might be were it to go into effect as it is now. The bill does say that there are two potential ways to retain the immunity under Section 230. One of them is compliance with these best practices and certifying that your organization complies with those to-be-written best practices. The other is to come up with other, quote, "reasonable measures" for addressing the same goals as the best practices do - how to combat child sex abuse material on these services. 

Riana Pfefferkorn: And so we might see organizations coming up with their own set of measures, even if they are not complying with those best practices, and taking their chances. But that's something that probably only the largest platforms with the deepest pockets - your Google, your Facebook - would be able to afford to do because the problem with this safe harbor being contingent upon the reasonableness of your practices is that it would pretty much leave that open to litigation. And you'd have to have a court decide whether or not that particular company would continue qualifying for Section 230 immunity in light of whether or not the court decides that the practices that company adopted in lieu of or as an alternative to the set of best practices recommended by the commission - whether those were reasonable for fighting child sex abuse material on their services. 

Riana Pfefferkorn: So we might end up with a weird situation where it might be deemed by a court to be reasonable or potentially unreasonable to use end-to-end encryption, for example, for chat apps such as Facebook Messenger because that reduces the ability - not necessarily completely removes the ability, I think, as the DOJ has been saying - but reduces the ability for the provider to detect whether there is abusive material or abusive communications being sent. 

Dave Bittner: Yeah. It strikes me also that this child sex abuse issue is often brought out sort of as the big stick. It is the boogeyman. It is the thing that we all agree is terrible and we should - you know, we should fight against. And I think all reasonable people feel that way. But it's interesting to me that so often, it seems that it's brought out as the argument in these sorts of cases. 

Riana Pfefferkorn: I agree with that. It's really this kind of nuclear option when it comes to policy debates in pretty much any context - not just online, internet-focused contexts, but pretty much anywhere. Think of the children has this way of kind of shutting down further reasoned debate, because the moment you press back on one of these proposals that is nominally child-safety centric, it opens you up to accusations that you are not in favor of keeping children safe online or offline, which - I think almost nobody actually feels that way, but also, nobody wants to have to deal with the accusation. So it's a way, I think, of really derailing what otherwise is a pretty difficult and sticky policy debate. 

Riana Pfefferkorn: The encryption debate hasn't really made a lot of progress in the handful of years that I've been working on it, much less in the quarter century that it's really been going on overall in the United States. And yet, here, I think when it comes to what seems like a goal of further eroding the immunity under Section 230, an erosion that began with the FOSTA law in 2018 that addressed sex trafficking, it seems like this is at least something where this kind of material is illegal, not only in the United States, but everywhere in the world. It is the one category of thing that we all can agree upon needs to be eradicated to the greatest extent possible. And that's a difference from a lot of the other debates over Section 230, where it turns out, basically, everybody wants some sort of speech online to be censored. They just always want the other guy's speech to be censored... 

Dave Bittner: Right, right, right. 

Riana Pfefferkorn: ...Rather than their own. And so what that looks like isn't necessarily going to be the same from person to person. If you're Josh Hawley from Missouri, for example, one of the senators who has proposed various pieces of Section 230 legislation, it looks like trying to combat this myth about how tech companies are elitist West Coast institutions that are hell-bent on censoring the speech only of conservatives, and therefore, there needs to be a law requiring them to be, you know, fair and balanced. And that just gets you into this morass over, no, don't censor my speech; censor the other speech, which is the speech I don't like. And that's something that child sex abuse material neatly sidesteps because everybody agrees that this is an abomination. 

Dave Bittner: What about Section 230 itself? I mean, in your estimation, is it time to take another look at it? Are there reasonable amendments that could or should be made to it? 

Riana Pfefferkorn: You know, I am a pretty strong proponent of keeping Section 230's protections as is. I may not be the most unbiased person out there, given that I used to use that regularly to, you know, swat down cases that have been brought against my clients when I was in private practice. But so far, it seems like this is a way of inadvertently silencing more speech than was intended without necessarily actually curing the problem of abusive or horrible online speech. 

Riana Pfefferkorn: I mentioned earlier, the FOSTA law from 2018, which was aimed at sex trafficking and taking away 230 immunity with regard to sex trafficking claims. We've seen that that has resulted not only in actual harm to the sex workers who supposedly were supposed to be protected by the passage of this law; it's also resulted in platforms, afraid of being subjected to massive liability, taking down vast swathes of speech that didn't actually have anything to do with sex trafficking. And so one of my Stanford colleagues, Daphne Keller, who is a recognized expert in this area of law, of platforms' liability for the content on their services, is one of the lawyers representing several plaintiffs in a case currently in federal court in D.C. that challenges the constitutionality of the FOSTA law. 

Riana Pfefferkorn: And we've seen other very good reporting coming out both from sex workers rights organizations as well as from people in the legal community documenting the ways in which FOSTA has backfired and is hurting the people that it was supposed to protect. So at least, if we use that as an example, it doesn't really look like the project of amending Section 230 is getting off on the right foot at all. 

Dave Bittner: It's interesting to me that, you know, I've heard many people commenting about this - basically, about the encryption wars as a whole - that, you know, encryption is readily available. It's not hard to do. There's nothing really exotic about it anymore. So the notion that you could effectively ban it or keep people who want to use it from using it is sort of absurd. And so it leaves me scratching my head as to what, ultimately, the folks here who are trying to ban it are trying to achieve. It seems almost nonsensical to me. 
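
Dave's point about availability is easy to demonstrate: with an off-the-shelf library such as PyNaCl, the authenticated public-key encryption that end-to-end messengers build on takes only a few lines. A minimal sketch - the party names and message are illustrative, and real messengers add key verification, forward secrecy and group handling on top:

```python
# Minimal sketch showing how little code modern public-key encryption
# takes with an off-the-shelf library (pip install pynacl). This is the
# primitive end-to-end messengers build on, not a full protocol.
from nacl.public import PrivateKey, Box

# Each party generates a keypair; only public keys are exchanged.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob with her private key and his public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

# Bob decrypts with his private key and Alice's public key.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"meet at noon"
```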

Riana Pfefferkorn: It seems to me like one of the unspoken assumptions in the encryption debate is that were effective encryption, meaning encryption that law enforcement doesn't have a way of accessing - were that to be banned such that U.S.-based providers or providers that have a big market in the United States and want to comply with U.S. law were to comply with that, say, by stripping end-to-end encryption from iMessage, say from taking away the effective encryption that is now used to encrypt your Android or your iPhone device, we would go back to a world where criminals, sophisticated people would still have access to the same kinds of software that they have always had access to even before encryption came into common use in consumer products, devices, services and apps. It's just that the bulk of average people like you and me would no longer be able to just, by default, get the advantage and the protection of encryption that we now enjoy. 

Riana Pfefferkorn: And so we would end up with this weird situation where criminals would have better security than average people because average people buy a device, or they install a piece of software, and they don't change the defaults. And they don't necessarily go out of their way to install and configure the sort of clunky and user-unfriendly encryption software that prevailed in the era before we had iPhones and before we had WhatsApp and Signal. And I find that to be a really backwards kind of world to be aiming for, where criminals would still have access to the same software that they were already using, especially in child sex abuse cases, well before the introduction of mobile phones or apps. It's just that if average people are no longer using that kind of software in a world where it is no longer legal, then average people are going to get caught. Average criminals are going to get caught. And the unspoken assumption, I think, is that it might be that the Department of Justice and other law enforcement agencies that complain about the effect of encryption on their investigations might be OK with that. 

Riana Pfefferkorn: In prior iterations of this debate - from Rod Rosenstein, for example, while he was still with the Justice Department, he would admit that, you know, this is something where the really sophisticated folks are always going to have access to this technology, and we're always going to have to work harder to catch those people. But it will get, you know, the average garden variety, the low-hanging fruit. And I really wish that that's something that - if that really is the goal, is just saying, we want to make the bulk of just our ordinary investigations of average-intelligence, average-sophistication criminals more expeditious and easier, that they would just be more forthright about that rather than using the - really, the worst of the worst criminals, such as child sex abuse offenders, such as, you know, drug traffickers, whatever the Four Horsemen of the infocalypse always are... 

Dave Bittner: (Laughter). 

Riana Pfefferkorn: ...To try and push a policy that would not affect those worst-of-the-worst people because they are consistently the ones who are the most savvy about how to use technology to cloak their activities. And those technologies are out there, and those would not go away simply because of a law in any particular country restricting their availability. 

Dave Bittner: All right. So Ben, I'm going to guess you have some thoughts about our interview there. 

Ben Yelin: First of all, the blog post you referenced - I highly encourage anybody to read it. It's incredibly well-written... 

Dave Bittner: Yeah, yeah. 

Ben Yelin: ...Entertaining, informative. We will put it in the show notes... 

Dave Bittner: Yeah. 

Ben Yelin: ...Because... 

Dave Bittner: It's really great. 

Ben Yelin: Yeah. I can't recommend it enough. 

Dave Bittner: (Laughter). 

Ben Yelin: I was cracking up just reading the footnotes. 

Dave Bittner: Yeah. It's hard to make - in my opinion, it's hard to make a discussion of legal stuff... 

Ben Yelin: Well, thanks. 

Dave Bittner: ...Laugh-out-loud funny, present company excepted, of course. 

Ben Yelin: Thank you. Thank you. 

Dave Bittner: But Riana really does it. She's a - it's a pleasure to read her writing. 

Ben Yelin: It really is. And she made a lot of points that I absolutely agree with. I think the bait and switch that gets to her and that gets to me is that these lawmakers - namely, Senator Graham and Senator Blumenthal - are taking advantage of the public outrage at Section 230 as it relates to public communications - so the fact that, you know, Facebook can't be sued for content that shows up on our Facebook feeds, or Twitter can't be sued for content that shows up on our Twitter feeds. 

Ben Yelin: There is a legitimate public outrage. And there are certainly normatively good claims that Section 230 should be amended for that purpose. But that's not what this bill does. This bill is ending private end-to-end encryption. And so you're using outrage at something public to make legislative changes that will really hinder private communications and robust end-to-end encryption. So I thought that was just a very important point. I think that gets lost in the shuffle of these types of debates. You know, I certainly understand why Senators Graham and Blumenthal have made this proposal. It's a priority of both law enforcement and, you know, as she mentions in this blog post, it's also a priority of advocacy groups as well who want to make it as hard as possible for this type of illicit material to travel across the internet. 

Dave Bittner: Right. 

Ben Yelin: But it's just important to be aware of what those adverse consequences can be when we're talking about giving the government potential access to our private communications. 

Dave Bittner: Yeah. Yeah. All right. Well, our thanks to Riana Pfefferkorn for joining us. As we said, really interesting article here - we'll have a link to it in the show notes, and we do recommend that you check that out. We want to thank all of you for listening. That is our show. 

Dave Bittner: And of course, we want to thank this week's sponsor, KnowBe4. If you go to kb4.com/kcm, you can check out their innovative GRC platform. That's kb4.com/kcm. Request a demo, and see how you can get audits done at half the cost in half the time. 

Dave Bittner: Our thanks to the University of Maryland Center for Health and Homeland Security for their participation. You can learn more at mdchhs.com. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers our Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.