Hacking Humans 9.27.18
Ep 18 | 9.27.18

Kidnappers, robots and deepfakes.

Transcript

Robert Anderson: [00:00:00] The speed at which cyber and artificial intelligence is moving at - it is moving at absolutely light speed. And with that, unfortunately, comes a lot of pluses but also a lot of things that will actually change your perceptions of the future.

Dave Bittner: [00:00:15] Hello, everyone, and welcome back to another episode of the CyberWire's "Hacking Humans" podcast. This is the show where each week we look behind the social engineering scams, phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I am Dave Bittner from the CyberWire, and joining me is Joe Carrigan. He's from the Johns Hopkins University Information Security Institute. Hello, Joe.

Joe Carrigan: [00:00:38] Hi, Dave.

Dave Bittner: [00:00:38] We've got some fun stories to share. And later in the show, I speak with Robert Anderson. He's from The Chertoff Group. And we'll be exploring the implications and technology of deepfakes.

Dave Bittner: [00:00:50] But before we get to all of that, a quick word from our sponsors at KnowBe4. Step right up and take a chance. Yes, you there, give it a try and win one for your little friend there. Which were the most plausible subject lines in phishing emails? Don't be shy. Were they, A, my late husband wished to share his oil fortune with you, or, B, please read - important message from HR, or, C, a delivery attempt was made, or, D, take me to your leader? Stay with us, and we'll have the answer later. And it will come to you courtesy of our sponsors at KnowBe4, the security awareness experts who enable your employees to make smarter security decisions.

Dave Bittner: [00:01:38] And we are back. Joe, you're going to start things off this week. What do you got for us?

Joe Carrigan: [00:01:42] Well, Dave, to keep things in a light and cheery mood, I've got another kidnapping scam.

Dave Bittner: [00:01:46] (Laughter) I can always count on you...

Joe Carrigan: [00:01:49] Right.

Dave Bittner: [00:01:49] ...To keep it chipper (laughter).

Joe Carrigan: [00:01:50] So there was a public safety advisory, actually, that came out as a broadcast email to JHU faculty, students and staff.

Dave Bittner: [00:01:56] Oh, where you work...

Joe Carrigan: [00:01:57] Exactly.

Dave Bittner: [00:01:58] ...At Hopkins. OK.

Joe Carrigan: [00:01:58] It's a warning about a scam for foreign students. And it's another kidnapping scam, but it's a little more clever than the ones that we covered before where there's, you know, just a phone call and somebody screaming in the background...

Dave Bittner: [00:02:08] Right, right.

Joe Carrigan: [00:02:08] ...You know, we have your daughter. So it's...

Dave Bittner: [00:02:10] As horrific as that is on its own.

Joe Carrigan: [00:02:12] Right. This is not nearly as horrific but, I mean, it's still terrible.

Dave Bittner: [00:02:16] Yeah.

Joe Carrigan: [00:02:16] But it's a little more creative. And it's kind of a complex scam, too. So here's how this scam works. Let's say you're a student from a foreign country. Let's use Wakanda, for example. The wife and I just watched "Black Panther," so...

Dave Bittner: [00:02:27] Very good.

Joe Carrigan: [00:02:28] ...I've got that on my mind.

Dave Bittner: [00:02:29] Love it.

Joe Carrigan: [00:02:30] Finally got around to seeing that movie. And you get a phone call from somebody pretending to be from the Wakanda Embassy or a consulate. And they're saying that you've been involved in a crime back in Wakanda.

Dave Bittner: [00:02:38] Oh.

Joe Carrigan: [00:02:39] Right. And they demand your cooperation because they're conducting an investigation.

Dave Bittner: [00:02:44] OK.

Joe Carrigan: [00:02:45] And they tell you, go into hiding and await instructions. In addition, they say, don't contact your family, don't contact any friends, and don't contact law enforcement because you're in deep trouble, mister.

Dave Bittner: [00:02:58] And you're in a foreign country. You're...

Joe Carrigan: [00:03:00] You're in a foreign country.

Dave Bittner: [00:03:01] A foreign land.

Joe Carrigan: [00:03:02] Right.

Dave Bittner: [00:03:03] OK.

Joe Carrigan: [00:03:03] And back home, you're in trouble. You got something going on, and you better stay out of sight.

Dave Bittner: [00:03:07] Right, lay low.

Joe Carrigan: [00:03:08] Right. So if the Wakandan student follows these instructions...

Dave Bittner: [00:03:11] Yeah.

Joe Carrigan: [00:03:11] ...It effectively ensures that they will not contact their family or their friends. And that's when they make the phone call to the student's parents back in Wakanda.

Dave Bittner: [00:03:20] Oh.

Joe Carrigan: [00:03:21] Right. So they call the parents, and they say, hey, we've kidnapped your student; give us some money. So what do you do if you're a parent?

Dave Bittner: [00:03:27] Right, and your student is overseas...

Joe Carrigan: [00:03:29] Right.

Dave Bittner: [00:03:29] ...Or far away.

Joe Carrigan: [00:03:30] You try to contact him, but he doesn't respond. He or she doesn't respond because they've been told lay low because you're in trouble.

Dave Bittner: [00:03:38] Right. Don't answer your phone. Oh, my.

Joe Carrigan: [00:03:41] Right. So we've had a couple of students targeted with this at JHU.

Dave Bittner: [00:03:45] Wow.

Joe Carrigan: [00:03:46] And the advice from the university - and this is something I learned yesterday - is that the FBI has actually set up a task force called the Virtual Kidnapping Scam Task Force.

Dave Bittner: [00:03:56] Wow.

Joe Carrigan: [00:03:57] And they have an 800 number here. It's 1-800-225-5324.

Dave Bittner: [00:04:03] OK.

Joe Carrigan: [00:04:03] That's the number you call for virtual kidnapping scams. So this is such a problem that the FBI has set up a task force and an 800 number to call and report them. Also if you're not comfortable contacting the FBI, you can contact your institution's international students' office or your institution's security force because they're targeting university students. And the way they're doing this is they're going after open-source information. So it's open-source intelligence collection. And they're going to places like Facebook or WeChat, which is a big social media company in China.

Dave Bittner: [00:04:31] Right.

Joe Carrigan: [00:04:32] And they build a profile. They know where you go to school. They know what your major is. They know your friends that are maybe at home in Wakanda or maybe here in the U.S. They know all the stuff about you, and then, of course, they know who your parents are. So it's through the open-source intelligence collection process, which is the very beginning of just about all these social engineering attacks...

Dave Bittner: [00:04:52] Right.

Joe Carrigan: [00:04:52] ...They have built a dossier on somebody that they're going to target. It makes the calls much more plausible to the point of believability.

Dave Bittner: [00:04:59] Wow. How do we get the word out to the students? What can you do?

Joe Carrigan: [00:05:02] Well, of course, there's this podcast, Dave. And everybody should listen to this podcast, right?

Dave Bittner: [00:05:06] (Laughter) Right, of course, how silly of me. Tell your friends, right?

Joe Carrigan: [00:05:09] Right. Tell your friends about this podcast.

Dave Bittner: [00:05:10] The best thing - if you love your family...

Joe Carrigan: [00:05:13] (Laughter).

Dave Bittner: [00:05:13] ...You will tell them.

Joe Carrigan: [00:05:14] That's right.

Dave Bittner: [00:05:16] (Laughter) OK.

Joe Carrigan: [00:05:16] Now we sound like the kidnapping scammers.

Dave Bittner: [00:05:18] Yeah, right. All right. Go on.

Joe Carrigan: [00:05:20] First off, yes, tell your family about it because nothing will prevent you from becoming the victim of a scam better than knowing the anatomy of a scam. When you see this kind of thing happening, the first thing in your head is, oh, this is a scam, right?

Dave Bittner: [00:05:34] Like we say, inoculation.

Joe Carrigan: [00:05:35] Inoculation, exactly.

Dave Bittner: [00:05:36] OK.

Joe Carrigan: [00:05:37] And that's kind of the mission of this podcast, as we say every week. Give your parents contact information of your friends, a roommate or other people they can contact if they're trying to reach you so your parents would be able to validate that you're safe through one of your friends here. And finally, they say that your family members can contact the International Affairs Office of the university, whichever university you're attending. It's an interesting scam, I think. I would like to know what the rate of success is with this, but, unfortunately, it's very hard to get those metrics, isn't it?

Dave Bittner: [00:06:03] Sure.

Joe Carrigan: [00:06:03] It's got almost, like, an elegance to it. I hate to speak highly of people like this - right? - but it's got this elegance to it that's admirable.

Dave Bittner: [00:06:11] Well, I wonder, too, if how much - if there's any component to this where if my family is in a situation where I can send my child overseas to study...

Joe Carrigan: [00:06:21] Right.

Dave Bittner: [00:06:21] ...Does that make it more likely that I have attained a certain level of affluence?

Joe Carrigan: [00:06:25] Yes.

Dave Bittner: [00:06:26] And so I have the disposable income to spend on this sort of ransom...

Joe Carrigan: [00:06:30] Yes.

Dave Bittner: [00:06:31] ...That I can get to quickly.

Joe Carrigan: [00:06:32] Yes.

Dave Bittner: [00:06:33] Yeah.

Joe Carrigan: [00:06:33] Chances are that that's the case because if you can afford to send your kid to a United States school from a foreign country, then chances are that you do have disposable income in that country.

Dave Bittner: [00:06:42] Right. Of course, everyone in Wakanda is loaded because they have lots of natural resources, so...

Joe Carrigan: [00:06:45] That's right. They have vibranium...

Dave Bittner: [00:06:47] That's right.

Joe Carrigan: [00:06:48] ...And tons of it.

Dave Bittner: [00:06:49] Yeah. All right. Well, that's a good story, Joe. Beware. Spread the word. My story this week, Joe, I have to ask you - do you have any robots in your house or at work?

Joe Carrigan: [00:07:00] Any robots - I have a robot kit that I used for my master's research project.

Dave Bittner: [00:07:07] OK.

Joe Carrigan: [00:07:07] But it's not assembled right now.

Dave Bittner: [00:07:09] OK. So you have a disassembled - a nonfunctioning robot...

Joe Carrigan: [00:07:12] Correct.

Dave Bittner: [00:07:13] ...Nearby.

Joe Carrigan: [00:07:13] I've built robots.

Dave Bittner: [00:07:14] OK. Well, in my house, I have a robot vacuum cleaner.

Joe Carrigan: [00:07:17] OK.

Dave Bittner: [00:07:18] And my kids have named him DJ Roomba (ph).

Joe Carrigan: [00:07:21] (Laughter).

Dave Bittner: [00:07:22] And...

Joe Carrigan: [00:07:22] 'Cause he looks like a record.

Dave Bittner: [00:07:23] He does. And he goes around and he does his work and he never complains. And one thing I realize - he's been in our house probably about a year or so, and I realized that I actually have great affection for this little device.

Joe Carrigan: [00:07:35] Really?

Dave Bittner: [00:07:36] I do. And I'm not sure why. He goes around and he does his work and he doesn't cause any trouble. But what I've noticed is, like, every now and then, he'll get stuck. There's one place in the kitchen where he can sort of get himself jammed underneath a kitchen counter.

Joe Carrigan: [00:07:50] And you walk up to it like a stuck kitten and you go, aw.

Dave Bittner: [00:07:53] Yeah, exactly what I do. I go, oh, no, DJ Roomba. What? Oh, let me help you out. And I pull him out and I - you know, I take him back to his little charging station or whatever, you know. So somehow I have assigned a personality to the Roomba...

Joe Carrigan: [00:08:09] Yeah.

Dave Bittner: [00:08:09] ...And to the point where I'm actually - I think I'm actually going to go out and buy some googly eyes and stick them to the top of him so that he...

Joe Carrigan: [00:08:16] (Laughter).

Dave Bittner: [00:08:17] (Laughter) Take it all the way, right?

Joe Carrigan: [00:08:18] 'Cause he can look like Mr. Trash Wheel.

Dave Bittner: [00:08:20] Yeah.

Joe Carrigan: [00:08:20] He does a similar job.

Dave Bittner: [00:08:21] The reason I bring this up is there was an interview on techtarget.com. This was from Michael Heller. And he interviewed a woman named Brittany "Straithe" Postnikoff, and she studies robot social engineering. And he interviewed her at the 2018 Black Hat conference. And she's been studying our relationship to robots and how they could be used for social engineering. And it's really a fascinating article. On the one hand, we tend to assign robots personalities like this. We have affection for them. We want them to be more human than perhaps they are...

Joe Carrigan: [00:08:58] Right.

Dave Bittner: [00:08:59] ...This notion of an automated device in our homes. You know, we think about, like, R2-D2, C-3PO and...

Joe Carrigan: [00:09:05] Right.

Dave Bittner: [00:09:06] Although I guess you could think about the Terminator or Cylons as well. But...

Joe Carrigan: [00:09:09] The Cylons were just misunderstood.

Dave Bittner: [00:09:11] Yeah. They want to be our friends, and we want them to be our friends.

Joe Carrigan: [00:09:13] Right.

Dave Bittner: [00:09:14] And this article talks about how people often see robots as authority figures. If a robot tells you to do something, chances are you're going to do it, or you're going to at least consider it because you think, well, the robot's a computer, it has - it must know what it's doing or what - you know, if it's given me an instruction, it's for a reason. And so we give a certain amount of deference to the robot. But the one thing that really caught my attention in this interview was the researcher - she was saying that she uses these robots for social engineering attacks basically for pen testing. And she said one of her favorite attacks is to put snacks on top of a Roomba-like robot as a way to get into a locked space. So what they do is they put snacks on top of the robot. She researches who might be in the space. She writes that person's name on a nameplate, puts it on the robot and the snacks on top of the robot and then by remote control has the robot go up to the door that she wants to get into, and she has it bump up against the door a few times.

Joe Carrigan: [00:10:17] Like knocking.

Dave Bittner: [00:10:18] Right. And the people come to the door, and they open the door. And who is there?

Joe Carrigan: [00:10:21] A robot.

Dave Bittner: [00:10:22] It's an adorable robot (laughter).

Joe Carrigan: [00:10:24] And it brought snacks.

Dave Bittner: [00:10:26] And he brought snacks. So what are you going to do? Are you going to slam the door on the cute robot who comes bearing snacks? No, you're going to let that robot in because that robot is adorable.

Joe Carrigan: [00:10:35] And he's equipped with cameras and listening devices.

Dave Bittner: [00:10:37] Exactly, exactly. So this is called piggybacking robots. There's another researcher who's actually done some work on this. So I just think this is both delightful and brilliant, right?

Joe Carrigan: [00:10:51] Yeah, I agree with you. But I'm troubled by one thing - that people will take orders from robots. I think there's room for some human supremacy here when it comes to dealing with robots (laughter).

Dave Bittner: [00:10:59] Yeah. Yeah, I think so. Yeah.

Joe Carrigan: [00:11:03] They take orders from us, not the other way around.

Dave Bittner: [00:11:05] I see. All right. Well, maybe in your house. But yeah, so it's a fun article. Do check it out. And there are some interesting things in there. I hadn't really thought about this notion of, you know, what do you do when there's a robot at your door? Well, of course you invite it in because it's adorable.

Joe Carrigan: [00:11:20] Right.

Dave Bittner: [00:11:20] I guess as long as it's not - doesn't look like the Terminator (laughter) in which case you close the door and run away and go out the back door. So check it out. We'll have a link to the article in the show notes. All right. Joe, it's time for our Catch of the Day.

(SOUNDBITE OF REELING IN FISHING LINE)

Dave Bittner: [00:11:39] And our Catch of the Day this week was sent in by a listener. This was sent in from Dr. Scott. He is a devout listener to the CyberWire and "Hacking Humans." And he got this in his email today, and he thought he would send it on to us. This is an email that pretends to be from American Express and has the American Express logo. Everything looks legitimate. And here's what the message says.

Dave Bittner: [00:12:04] (Reading) We're reaching out about a partial upgrade on our online service platform, and we feel the need to re-evaluate card members' profile. Your profile failed to pass our integrity check at the moment of evaluation. However, just to be safe, we declined access to your account and request that you confirm with us what we have on file for you.

Joe Carrigan: [00:12:24] Uh-oh.

Dave Bittner: [00:12:24] (Reading) Attached along this message is a web fillable form. Complete request by downloading and filling out the form. And then, of course, there's a link where you can click through.

Joe Carrigan: [00:12:36] And there seems to be some broken HTML in this thing as well.

Dave Bittner: [00:12:39] Yes. Yes. And one of the tip-offs here that Dr. Scott pointed out was the return email address is americanexpress@ - well, before I get to the - what would you say - what's the abbreviation for American Express?

Joe Carrigan: [00:12:53] A-E - Amex.

Dave Bittner: [00:12:54] Amex, right. Amex. I think most of us would say, oh, I have my Visa, I have my MasterCard, and I have my Amex, right?

Joe Carrigan: [00:12:58] Right.

Dave Bittner: [00:12:59] So the return address on this is americanexpress@amep - with a P instead of an X - .com.

Joe Carrigan: [00:13:09] That's interesting. I think American Express should look into acquiring control of the amep domain.
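
A minimal sketch of how the lookalike check Dave describes could be automated - compare the sender's domain against an allowlist of domains you actually do business with, using edit distance (the allowlist, the threshold of 2 and the function names here are illustrative assumptions, not any vendor's actual filter):

    # Flag sender domains that are near, but not in, a known-good allowlist.
    KNOWN_DOMAINS = {"amex.com", "americanexpress.com"}  # illustrative allowlist

    def edit_distance(a, b):
        # Classic dynamic-programming Levenshtein distance.
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            cur = [i]
            for j, cb in enumerate(b, 1):
                cur.append(min(prev[j] + 1,                  # deletion
                               cur[j - 1] + 1,               # insertion
                               prev[j - 1] + (ca != cb)))    # substitution
            prev = cur
        return prev[-1]

    def flag_lookalike(sender):
        # Warn when the sender's domain resembles, but is not, a known one.
        domain = sender.rsplit("@", 1)[-1].lower()
        if domain in KNOWN_DOMAINS:
            return None
        for known in KNOWN_DOMAINS:
            if edit_distance(domain, known) <= 2:
                return f"'{domain}' looks like '{known}' but isn't - likely spoofed"
        return None

    print(flag_lookalike("americanexpress@amep.com"))  # catches amep.com vs. amex.com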

Dave Bittner: [00:13:16] (Laughter) Right. Right. So pretty straightforward here. There's some funny capitalizations in the message.

Joe Carrigan: [00:13:21] Yeah. The English is all broken and bad. It's not exactly a good translation.

Dave Bittner: [00:13:23] Yeah, it's a little strange, this whole need to pass our integrity check at the moment of evaluation (laughter).

Joe Carrigan: [00:13:30] Yeah, that's very awkwardly worded.

Dave Bittner: [00:13:31] Right. Right. Dr. Scott pointed out that, well, it is a tender moment of - that moment of evaluation. It's a tender moment for everyone.

Joe Carrigan: [00:13:40] (Laughter).

Dave Bittner: [00:13:40] And that is our Catch of the Day. Coming up next, we've got my interview with Robert Anderson from The Chertoff Group.

Dave Bittner: [00:13:46] But first, a quick word from our sponsors at KnowBe4. And what about the biggest, tastiest piece of phish bait out there? If you said A, my late husband wished to share his oil fortune with you, you've just swallowed a Nigerian prince scam, but most people don't. If you chose door B, please read - important message from HR. Well, you're getting warmer, but that was only No. 10 on the list. But pat yourself on the back if you picked C, a delivery attempt was made. That one, according to the experts at KnowBe4, was the No. 1 come-on for spam email in the first quarter of 2018. What's that? You picked D, take me to your leader? No, sorry. That's what space aliens say. But it's unlikely you'll need that one unless you are doing "The Day the Earth Stood Still" at a local dinner theater. If you want to stay on top of phishing's twists and turns, the New-School Security Awareness Training from our sponsors at KnowBe4 can help. That's knowbe4.com/phishtest.

Dave Bittner: [00:14:57] Joe, we are back. I recently spoke to Robert Anderson. He is a former national security executive. He is a former executive assistant director with the FBI. And this guy has had quite a career. He's a military combat veteran. He's a former state trooper and a member of a hostage rescue team. In 2012, in fact, he was awarded the Meritorious Presidential Rank Award by President Obama. That is an award given to government senior executives. It is the highest award for leadership. So this guy - quite a resume here. These days, he works at The Chertoff Group. And that's a security organization that was founded by the former U.S. secretary of Homeland Security, Michael Chertoff. And the focus of our conversation was on deepfakes. So here's my conversation with Robert Anderson.

Robert Anderson: [00:15:46] Deepfake videos - kind of the layman's term for these things - are videos that, when you look at them, look like a certain individual you may or may not know, but they actually do not involve the real person at all. They use a surrogate or a source person and then superimpose the face and facial features of the person they're trying to replicate in the video.

Dave Bittner: [00:16:10] Now, I think for a number of years now, we've been accustomed to the notion that, you know, folks who do Hollywood special effects, they've been able to do this sort of thing. We've seen, you know, TV commercials with celebrities who may have passed away and things like that. But what we're talking about here is a much easier way to come at this.

Robert Anderson: [00:16:27] Absolutely a much easier way. And for the most part, you don't even need the real subject, just somebody to mimic that individual. And then through the advancements of machine learning and artificial intelligence, they can transpose a very lifelike image - anymore, it's getting very hard to tell the real videos from the fake ones - and, using that individual's face, deliver a message either to the private sector or the government or the military that looks believable.

Dave Bittner: [00:16:59] Can you take us through some of the potential scenarios here? How would people use this?

Robert Anderson: [00:17:04] How it actually began several years ago is, like most things, as sort of a novelty that people would use in different apps you could get from your cellphone's app store that would superimpose the face of a dog or another animal. And then later on, people became much more malicious and actually began transposing people's faces into pornography and other things. But now, it's developed to the point where criminal organizations and nation-states can potentially use this type of technology to simulate an individual - possibly someone of rank in the military or a senior person in a company - to the point where people would believe this is the actual person giving orders, either to transfer money or potentially launch some type of military operation. So very troubling.

Dave Bittner: [00:17:53] Is the capability there to do this sort of thing real time or is it still a processor-intensive kind of thing?

Robert Anderson: [00:17:59] It's definitely a process, but I will tell you the technology, especially in artificial intelligence and machine learning, has evolved so quickly. Several years ago, these types of videos were very easy to spot. The facial movements, the eye twitching and a lot of other things stood out, and you could tell that something wasn't right with the video. But like with anything else in sort of the modern-day cyber realm, the enhancements have become so significant that they can make these videos in much shorter periods of time. And it is getting increasingly harder and harder for either the private sector or the government to determine if these are real or fake videos.

Dave Bittner: [00:18:39] I'm thinking when it comes to election security, which is certainly top of mind for a lot of people right now. A lot of times, you'll see advertisements from opposing candidates and they'll have video of their opponent saying something that is unappealing or something like that to put words in other people's mouths.

Robert Anderson: [00:18:58] That's absolutely true. They absolutely can do that. And like a lot of what's going on in cyber and artificial intelligence now, a lot of times, the laws are kind of behind where individuals can use the technology to commit either fraud or crimes like we've just discussed. And so the race really is on by not only the government but the private sector in developing algorithms that will help stop deepfake videos or portrait deepfake videos - which are kind of the gold standard for these videos nowadays - and enable us to catch them much quicker, whether they're on social media - as we've seen, that's been a platform for a while now for people to launch kind of malicious attacks - or whether it's on television or just an internal communication between companies.

Dave Bittner: [00:19:47] Now, is this the kind of thing where there could be a policy solution? Is this the kind of thing where government could crack down on these tools?

Robert Anderson: [00:19:55] I think, for once, the government and the private sector are pretty well aligned on this. In a lot of states, these have already become illegal. The hard part is trying to prove it quickly enough and being able to attach it to an individual to press charges. I think everybody sees the maliciousness on the low end and then obviously the potential for criminal or counterterrorism or even counterespionage behavior on the far end. And so for that, a lot of times in the last several months, I've seen and been involved in conversations with both the government and private sector on, what would you do, or how would we best plan to try to catch and capture these things before they cause too much harm?

Dave Bittner: [00:20:39] And I can even see lots of different scenarios for this. For example, we get a lot of stories about phishing emails where people will pretend to be a loved one or something like that asking for money. You know, please send me money. I'm stuck out of town. Can you wire me some money? I could see something like this taking that to the next level.

Robert Anderson: [00:20:57] Absolutely right. I mean, if you just look at the way that normal businesses function nowadays - I mean, you know, here just even today, since I've been in the office, I've been on three video conferences and several phone calls. And so think of just normal business activity during the day with tens of thousands of companies around the United States. And you can begin to see how big of a threat this actually is if somebody can fake an individual, especially one that has responsibility or the power to influence conduct by the company. It can become pretty frightening.

Dave Bittner: [00:21:32] But what about the bigger picture of this? I mean, obviously, there are the specific instances of this sort of technology being able to be used to fool people or commit fraud or things like that. But I'm thinking of just the general sort of societal erosion of what we trust. If you cannot trust your own eyes - the video of someone giving a speech or something like that - in this era of, you know, so-called fake news, this seems to be leading us farther down that path.

Robert Anderson: [00:22:01] It absolutely does. And I think the one thing that it really should impress upon everybody is the speed at which cyber and artificial intelligence is moving at. It is moving at absolutely light speed. And with that, unfortunately, comes a lot of pluses but also a lot of things that will actually change your perceptions in the future.

Dave Bittner: [00:22:22] Do you suppose we might see some way to verify that a piece of video footage or audio is legitimate, some sort of chain of custody, if you will?

Robert Anderson: [00:22:32] I can see that absolutely happening in the future. And whether it's like, you know, nowadays when you click on your different internet search engines, it'll tell you if the site you're clicking on is a valid site and it's safe for you to click on it. I can see the same thing very rapidly moving to videotape technology, especially because of the threat of deepfake videos.

Dave Bittner: [00:22:53] So what's your advice to folks out there who are having to deal with this sort of thing? I mean, is this something that should be on everyone's radar? How should we be protecting ourselves as consumers going about our day-to-day lives?

Robert Anderson: [00:23:05] I think that's a great question. I think the biggest thing is, if you see something or hear something that sounds not quite right, or it doesn't sound like something you've seen that individual say or do before, I think you should stop and question it. And if you can in any way make inquiries to see if that individual actually made that phone call or made that video conference - sure, you know, check on social media sites and see if other people are questioning the same thing. It's no different than several years ago, when a lot of cyber actors would get CFOs of companies to transfer money by posing as the CEO. A simple phone call could alleviate that for the organization, even though the email was very convincing. I think the same type of kind of immediate action plan should be formed around deepfake videos.

Dave Bittner: [00:23:54] Joe, what do you think?

Joe Carrigan: [00:23:55] It's an interesting problem. Actually, last week, Carole on "Smashing Security" was talking about this same problem - specifically, one of the advancements that they've made in deepfakes: blinking.

Dave Bittner: [00:24:04] Right.

Joe Carrigan: [00:24:04] They used to be easy to spot by the absence of blinking. And now they've gotten around that. The technology has some legitimate applications. It will definitely make life a lot easier for Hollywood special effects artists.

Dave Bittner: [00:24:14] Right.

Joe Carrigan: [00:24:14] And Carnegie Mellon was also saying that it could help with autonomous vehicles - so self-driving cars, which are the way of the future, I guess.

Dave Bittner: [00:24:22] The technology being able to recognize facial features and so forth.

Joe Carrigan: [00:24:26] Yeah, I think. Or - I'm not exactly sure. I'm not an expert in artificial intelligence or machine learning. So I don't know how that would help. And I'm certainly not an expert on autonomous cars yet, but that might be coming up. Who knows? These things are being made illegal. I like that Mr. Anderson points that out. And I heard on "Grumpy Old Geeks" that there were a number of porn sites that have just banned them.

Dave Bittner: [00:24:48] Right.

Joe Carrigan: [00:24:48] So if you produce one of these and upload it to the site, they won't have it. So if a porn site is saying they're going to ban something, maybe that should be illegal, you know?

Dave Bittner: [00:24:59] (Laughter) Right. You're right. Yeah. Yeah, yeah.

Joe Carrigan: [00:25:00] You know, if these guys think something - these guys make their living living on the very edge of the law.

Dave Bittner: [00:25:04] Right.

Joe Carrigan: [00:25:04] Right? And they're very, very good at doing that. And they've been doing it for long enough that they've gotten good at it.

Dave Bittner: [00:25:09] But I suppose if - it's one of those things where kind of - not unlike encryption technology, where once the code is out there and...

Joe Carrigan: [00:25:18] Right.

Dave Bittner: [00:25:18] ...Someone open-sources versions of it or whatever and you can't put that genie back in the bottle.

Joe Carrigan: [00:25:24] No. It's...

Dave Bittner: [00:25:24] Or it's certainly not easy to do.

Joe Carrigan: [00:25:25] It's out there. There are some AI algorithms that will notice it, that will find - you know, there are artifacts that will show up that are recognizable by other machine learning algorithms. You guys were talking about, like, a chain of provenance...

Dave Bittner: [00:25:37] Right.

Joe Carrigan: [00:25:37] ...And how do you verify videos. Well, you can do that with a simple, you know, digital signature. So let's say you have a news organization that has a video. And they're going to sign the video with their private keys. And then you as a citizen watching the video can verify the signature with their public keys. And you don't need to build any new infrastructure. Maybe you need an app, a video-playing app that validates the signature with the public keys and then displays the source. And then you can decide whether or not you trust the source.
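
The sign-and-verify scheme Joe describes is standard public-key cryptography. A minimal sketch in Python using the third-party cryptography package with Ed25519 keys (the library choice, the hash-then-sign step and the function names are assumptions for illustration; any standard signature scheme would work the same way):

    # pip install cryptography
    import hashlib
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # The news organization generates this once and publishes the public key.
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    def sign_video(path):
        # Publisher side: hash the video file, then sign the digest.
        digest = hashlib.sha256(open(path, "rb").read()).digest()
        return private_key.sign(digest)

    def verify_video(path, signature):
        # Viewer side (e.g., inside the playback app Joe imagines):
        # recompute the digest and check it against the published key.
        digest = hashlib.sha256(open(path, "rb").read()).digest()
        try:
            public_key.verify(signature, digest)
            return True   # the file is exactly what the organization signed
        except InvalidSignature:
            return False  # altered footage, or a different source entirely

Note that the signature only proves the file is unmodified since the named organization signed it; whether you trust that organization, as Dave raises next, is a separate question.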

Dave Bittner: [00:26:01] Right.

Joe Carrigan: [00:26:01] Right?

Dave Bittner: [00:26:02] Right. Seems complicated (laughter).

Joe Carrigan: [00:26:04] Yeah, but we're already doing that with email.

Dave Bittner: [00:26:05] Yeah, I guess.

Joe Carrigan: [00:26:06] So it already exists.

Dave Bittner: [00:26:07] But you have that extra layer of the people who are presenting the newscast, you know? Today, you know, the president of XYZ Corporation said such and such. So do you trust that news organization with the chain of provenance of the video that they're showing you?

Joe Carrigan: [00:26:22] Correct. Yeah.

Dave Bittner: [00:26:22] Yeah.

Joe Carrigan: [00:26:22] That's the question that you have to decide. Do you trust any news organization to be honest with you and not to be manipulative?

Dave Bittner: [00:26:27] Right. Right. And I know it's messy.

Joe Carrigan: [00:26:29] Yeah.

Dave Bittner: [00:26:30] Yeah.

Joe Carrigan: [00:26:30] It is messy. It's a big mess.

(LAUGHTER)

Dave Bittner: [00:26:34] All right. Well, our thanks to Robert Anderson for joining us. And thanks to you for listening.

Dave Bittner: [00:26:38] And of course, thank you to our sponsors at KnowBe4. They are the social engineering experts and the pioneers of new-school security awareness training. Be sure to take advantage of their free phishing test, which you can learn about at knowbe4.com/phishtest. Think of KnowBe4 for your security training.

Dave Bittner: [00:26:57] Thanks to the Johns Hopkins University Information Security Institute for their participation. You can learn more about them at isi.jhu.edu. The "Hacking Humans" podcast is produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technology. Our coordinating producer is Jennifer Eiben. Our editor is John Petrik. Technical editor is Chris Russell. Executive editor is Peter Kilpe. I'm Dave Bittner.

Joe Carrigan: [00:27:23] And I'm Joe Carrigan.

Dave Bittner: [00:27:24] Thanks for listening.