Hacking Humans 8.3.23
Ep 253 | 8.3.23

Privacy matters when it comes to ChatGPT.


Raj Ananthanpillai: How about flipping the script and saying, "Okay. What if we get it once in some safe place, right, and then reuse it over and over again without ever giving out any personal information?"

Dave Bittner: Hello, everyone. And welcome to the CyberWire's "Hacking Humans" podcast where each week we look behind the social engineering scams, phishing schemes, and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner, and joining me is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe.

Joe Carrigan: Hi, Dave.

Dave Bittner: Got some good stories to share this week. And later in the show, my conversation with Raj Ananthanpillai. He is CEO of Trua, and we're talking about privacy concerns in ChatGPT. All right, Joe. Before we dig into our stories this week, we have a couple of items of follow-up here. What do we got?

Joe Carrigan: We do indeed. Clayton wrote in to comment about your story either last week -- I think it was a couple weeks ago.

Dave Bittner: Yeah.

Joe Carrigan: About the bomb threats being used to shut down retail stores for ransom. Clayton writes, "There was a brief discussion about whether this was an escalation of ransomware, but I would argue it is an escalation of swatting. Call it monetizing swatting, if you will. The monetization of swatting. Great show, as always." I'll agree with that. I think that's a pretty good assessment.

Dave Bittner: Yeah. I could go with that. Yeah. Sure. All right. Well, thank you, Clayton, for sending that in. We got another note from a listener named David who said, "'Hacking Humans' long-time listener. Listening to your podcast about wire fraud prevention and wondering why you never suggest sending a $50 test wire before sending the $700,000 purchase price. 100% robust against third-party fraud." Well, David, the reason that I've never suggested this is quite simple, and that is because I've never thought of it.

Joe Carrigan: Right.

Dave Bittner: [Laughs] so thank you for the suggestion. I mean seems solid to me.

Joe Carrigan: Yeah. Yeah. I've done this, actually.

Dave Bittner: Oh yeah?

Joe Carrigan: With -- yeah. With other -- not with wire transactions, but with any crypto transaction I always send a very small amount first to see if it gets there.

Dave Bittner: Right.

Joe Carrigan: And then send the rest of it to the same address.

Dave Bittner: Yeah.

Joe Carrigan: So yeah. But this could easily be applied to -- to real estate transactions.

Dave Bittner: Yeah.

Joe Carrigan: No problem.

Dave Bittner: Seems like a solid plan to me.

Joe Carrigan: That way you're only out 50 bucks. You're not out 700,000.

Dave Bittner: You know what I'll bet? I'll bet -- so I would be willing to bet that a reason why this is not a routine practice is that it probably costs like 5 bucks.

Joe Carrigan: $12 I think.

Dave Bittner: Is that right?

Joe Carrigan: Yeah. Okay. So $12 to wire money.

Dave Bittner: Right. So in the 700,000 or whatever, you know, multiple hundred thousand dollar transaction of buying a home, people are probably really concerned about not having to pay an extra 12 bucks.

Joe Carrigan: I don't want to have to pay $24.

Dave Bittner: Which, you know, hey look. I understand. You know, you want to save every possible bit, but as David points out here, I mean that's -- to me, that seems like money well spent.

Joe Carrigan: Penny wise, but pound foolish.

Dave Bittner: There you go. There you go.

Joe Carrigan: Yes. I think it is well spent, money well spent, to send a test transaction.
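The test-transfer practice Dave and Joe describe can be sketched as a small guard function. The `send_funds` and `confirm_receipt` callables below are hypothetical stand-ins for whatever wire or crypto API is actually in use; this illustrates the pattern, not a real payment integration.

```python
def transfer_with_test(total, send_funds, confirm_receipt, test_amount=50):
    """Send a small probe first; release the balance only after the
    recipient confirms the probe arrived at the intended destination."""
    send_funds(test_amount)
    if not confirm_receipt(test_amount):
        raise RuntimeError("test transfer not confirmed -- do not send the rest")
    send_funds(total - test_amount)
    return total

# Toy stand-ins for a real payment API: a list plays the recipient's ledger.
ledger = []
transfer_with_test(700_000, ledger.append, lambda amt: amt in ledger)
print(ledger)  # [50, 699950]
```

If the probe never shows up at the far end, the function stops before the large amount moves, which is the whole point of the $50 test wire.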

Dave Bittner: Yeah. All right. Well, thank you, David, for sending that in. It's a good idea, and Clayton as well. We would love to hear from you if there's something you'd like us to consider for our show. You can email us. It's hackinghumans@n2k.com. All right. Let's dig in with our stories this week. I have one. This comes from The Record. This is from the folks over at Recorded Future. They're a news organization. And it's a story by Jonathan Greig. And it's titled "Called a Bogus Airline Customer Support Number? Google is Hustling to Fix That." This article tells the story of a gentleman named Shmuli Evers who discovered that his flight from JFK airport was canceled. So he Googled customer support for Delta, and he called Delta customer support, and he was just about to hand over his credit card information when he realized that he was speaking to a scammer who had replaced the number for Delta's JFK office on Google with another number. Now what makes this particularly interesting is you and I have talked about folks basically buying ads on Google to get their fake ads for services to the top of the list for SEO.

Joe Carrigan: Yep. And I've called those numbers before.

Dave Bittner: Yeah. But that's not what happened. That's not what was going on here. What was going on here evidently is that someone managed to replace the phone numbers in Google Maps with the fake phone numbers.

Joe Carrigan: How did they do that?

Dave Bittner: Well.

Joe Carrigan: They probably clicked on that thing, "Do you own this business?"

Dave Bittner: This remains a mystery, Joe. So we'll dig into that. So, as you know, when you Google a company name or something like that, a lot of times the first thing that pops up is Google's own little summary of that business.

Joe Carrigan: Yes.

Dave Bittner: And that includes a component from Maps and other -- you know, it will have a link to their website and a link to online reviews and a summary of what that business is. And so it seems as though in this case it was pulling phone numbers from Google Maps. This gentleman, after running into this situation, checked support phone numbers for a bunch of different airlines at JFK, and he discovered that Delta, American Airlines, Southwest, Air France, Qantas, ITA Airways, and Turkish Airlines all had bogus phone numbers on Google Maps.

Joe Carrigan: How about that?

Dave Bittner: Yeah. Yeah.

Joe Carrigan: I wonder how many people have been scammed.

Dave Bittner: That's a good question. The folks at The Record contacted Google and they didn't --

Joe Carrigan: Let me see. Hold on. Let me use my powers of -- of clairvoyance here.

Dave Bittner: Yeah.

Joe Carrigan: They -- they didn't respond. Or they didn't -- did they respond?

Dave Bittner: They did.

Joe Carrigan: Okay.

Dave Bittner: But I bet you you couldn't guess what the response was.

Joe Carrigan: Security is our number one concern.

Dave Bittner: Close. Yeah.

Joe Carrigan: We're endeavoring to make sure this doesn't happen again.

Dave Bittner: You're -- yes.

Joe Carrigan: And we're trying to fix it right now.

Dave Bittner: You are in the right neighborhood. That is -- you've paraphrased what they said with great precision and foresight. So yeah. Google said, "We have stopped more than 20 million attempts to create fake business profiles."

Joe Carrigan: With 20 million attempts 1 or 2 are bound to get through.

Dave Bittner: Right. Well, you can't expect us to be perfect.

Joe Carrigan: Right.

Dave Bittner: So this gentleman, Evers, posted about this on Twitter which I think is how The Record found out about this case. And lots of people said that they had been scammed after using fake numbers that they found on Google Maps or Google search.

Joe Carrigan: Do they have any idea of the amount of the losses these people have faced?

Dave Bittner: No. No. This article doesn't go there, but people have theorized that -- this is some of the folks who responded in comments on Twitter, people that theorized that Google was simply approving changes to pages without checking with the businesses themselves.

Joe Carrigan: Yeah. Yeah. I'm sure. Because that would involve a human doing something. Right? And somebody probably found a weakness in Google's process, exploited the weakness, and redirected all these companies' phone calls to them.

Dave Bittner: Right. Right. Now since then the numbers -- the correct numbers have been replaced on Google Maps.

Joe Carrigan: Good.

Dave Bittner: But you've got to wonder, I mean how -- how is Google going to go about checking every number on Google Maps?

Joe Carrigan: Yeah. Here's a good question. What about the mom and pop business? You know? Somebody that doesn't have the pull of American Airlines or Delta or whoever. Right? You know, it's -- it's like Broadway Deli or something like that. Broadway Deli went out of business, but you know what do these guys do when Google has hosed them to a point where -- I mean because imagine. Imagine you have a small business and you get some malicious guy that you mess up their order or something, and now they want to take it out on you. They want to make you pay. So they go and they redirect all the calls on Google that everybody searches to some disconnected phone number or something like that. And now nobody can reach you, and your business plummets. And, I mean, but there's nothing you can do because you're not a large company that has lawyers. Right? So you can't -- you can't threaten Google with -- with misinformation or whatever. I don't know how they go about doing this.

Dave Bittner: Well, and I would suppose these days in this era of voice over IP where it's so easy to spin up phone numbers, if I was the scammer doing this and I wanted to come after, you know, Joe's Amazing Deli, right, I could spin up a unique phone number, put that on Google Maps. People start calling Joe's Amazing Deli for Joe's Amazing Deli sandwiches. I start taking orders. But what I'm getting at is here by spinning up that phone number, when someone calls in I know to answer the call and say, "Joe's Amazing Deli."

Joe Carrigan: Right.

Dave Bittner: Right? And it's very little --

Joe Carrigan: You start taking orders and charging credit cards.

Dave Bittner: Right. Exactly. So it's very little effort for me to be able to do that and run some kind of call center where I could just be taking calls from all kinds of businesses.

Joe Carrigan: Yes.

Dave Bittner: Yeah. I wonder if there's any liability here for Google. I mean Google provided the wrong information.

Joe Carrigan: I'll bet that if you look in their end user license agreement, there's no liability for them.

Dave Bittner: Yeah. I'm sure you're right.

Joe Carrigan: But I don't know. I think you might be able to make a court case because these businesses might be able to assert some kind of standing because Google has misrepresented them in a large public forum which Google is. They might have some -- some liability there. And there's no EULA that's going to cover them from that.

Dave Bittner: Yeah. The other thing I wonder is who do you trust. I mean if you can't trust Google Maps with giving you the correct phone number -- because I've used that for many things. You know, I've looked up my --

Joe Carrigan: That's the only way I do it on the -- on my phone.

Dave Bittner: Right. Right. Yeah. So yeah. I mean I guess the answer is you go to the website that you know is the company and then you look for their customer service number. In this case, though, they were looking for specific numbers at specific airports.

Joe Carrigan: Right.

Dave Bittner: Which would be, I imagine, harder to dig in, dig down and find on the company's big national website.

Joe Carrigan: I'm sure there's no way you can find Delta's JFK number without Googling it.

Dave Bittner: Yeah. Yeah. So beware, folks. This is a new twist on an old scam. I would also say if you're a small business owner, go check how your business is listed on Google Maps, and make sure that the information they have is accurate.

Joe Carrigan: Yeah. Make sure that -- make sure that you have ownership of that.

Dave Bittner: Right. Right. All right. That's my story this week. Joe, what do you have for us?

Joe Carrigan: Dave, my story comes from TechCrunch, and it was written by Zack Whittaker, and the title is "Spyhide Stalkerware is Spying on Tens of Thousands of Phones." So we've talked before about stalkerware on this show, and it's essentially spyware that somebody else installs on your phone.

Dave Bittner: Right.

Joe Carrigan: The products are manufactured. You know, written and sold, under the auspices of being child safety applications.

Dave Bittner: Right.

Joe Carrigan: But they are not. They are almost exclusively used by abusive partners for the purposes of monitoring and controlling all the communication of the partner they're abusing.

Dave Bittner: That's right.

Joe Carrigan: That is -- that is the intended audience of these apps. I -- you know, I don't think there's any legitimate reason for this. Well, this story is about a phone surveillance app called Spyhide, which is one of these apps -- they actually call it right here "spouseware" -- that's planted on a victim's phone, often without their knowledge, by somebody who knows the passcode.

Dave Bittner: Right.

Joe Carrigan: And the app is designed to lay low and stay hidden on the phone. And it does this by pretending to be the Google settings app or by pretending to be a ring tone app. So if you go looking for it, you won't find something called Spyhide. You'll find -- you'll find a ring tone app or something that looks like Google settings, but is just another -- it is this app, but it looks like another Google settings app.

Dave Bittner: Okay.

Joe Carrigan: It has to be sideloaded. It can't -- they don't -- they don't have this in the Google Play store. So it's not available. But there's a security researcher out of Switzerland who has posted a blog post about how they have penetrated the network here by exploiting a few vulnerabilities, and they've found all the data for 60,000 compromised Android devices. This is the way they're described. There are actually 65,000 devices with these things installed. All the way back to 2016 up to the date of exfiltration in July.

Dave Bittner: Wow.

Joe Carrigan: So one of the things that TechCrunch did -- this person provided the data to TechCrunch, and TechCrunch says that they did this offline. They didn't go to an online service to do this. They put the data into a mapping application, and there is a great picture in here of how this app has tracked people across the United States. And it looks very similar to one of those pictures of the United States at night.

Dave Bittner: Right. Right. Maybe with like the highway system superimposed over it which makes sense.

Joe Carrigan: The highway system is superimposed. I mean I can see most of Interstate 70 right here going across. There are some spots where Interstate 70 is not covered, but I can see it going like from somewhere in Ohio down to Kansas City. But this data's from all over the world. And actually the U.S. data is a small fraction of it. But the person who did this -- I don't know how to say it. This app and these vulnerabilities -- if they were in the U.S., they may have violated some laws with what they've done.

Dave Bittner: Okay.

Joe Carrigan: But they're in Switzerland which I think has a very liberal policy on these things.

Dave Bittner: I see.

Joe Carrigan: So I'm glad that this person's done this and posted this and that Tech Crunch can post an article about it, but the software, they think it was made in Iran and is being hosted out of Germany. So I think, you know, we all -- we've said this 100 times before that these -- these kind of apps are just reprehensible and the people that make them and write them are terrible people. And I still say that. I don't -- I don't have any change in my feeling on that.

Dave Bittner: Yeah.

Joe Carrigan: But there is a list of things you can do. They have a general guide, a link in the article, that's a general guide to how you can protect yourself from these kinds of apps. And the first thing they say is make sure that your Google Play Protect is on. Now I checked my Google Play Protect settings today.

Dave Bittner: Okay.

Joe Carrigan: When I was reading this article. And I didn't see a way to turn it off which was good.

Dave Bittner: Okay.

Joe Carrigan: But it is on on my phone. And you can scan apps, and it will let you know if it finds anything. A lot of these apps make use of accessibility services. So accessibility services exist on your phone for people who are disabled, people who can't see things, people who can't hear things, people who can't see the colors properly.

Dave Bittner: Yeah.

Joe Carrigan: So these things are there for them to provide the information, but these accessibility services can be misused by spyware, by spy apps, to collect all the same data, and to feed it up to the cloud. These guys are just going out and sucking up all this data. They really don't care about the -- the privacy of the people that they're infringing upon. What they care about is getting the money from the people that want to monitor these other people.

Dave Bittner: Yeah, and this article points out that these apps are buggy. I mean they're not -- they're generally not well written apps.

Joe Carrigan: That's a good point, Dave.

Dave Bittner: Not something you want to have on your phone anyway.

Joe Carrigan: I forgot to mention that. They are buggy. They are notoriously buggy.

Dave Bittner: Yeah.

Joe Carrigan: And because they're just written quickly and their -- their goal is to make as much money as possible as quickly as possible before they probably get shut down.

Dave Bittner: So let me ask you this. On the Android side of things which, you know, I'm not that familiar with, can you restrict sideloading?

Joe Carrigan: You can. In fact, you have to enable it.

Dave Bittner: Okay.

Joe Carrigan: Right?

Dave Bittner: So it comes out of the box with sideloading restricted.

Joe Carrigan: Correct.

Dave Bittner: So presumably the person who's putting this on your phone --

Joe Carrigan: Yeah. Somebody has physical access to your device which is bad. There is another caution here about disabling the spyware. Because -- and it says so right in the article -- remember that switching off the spyware will likely alert the person who planted it. Which may be a problem for the person who's being spied upon.

Dave Bittner: Right.

Joe Carrigan: And at the bottom of this article there is a telephone number for the National Domestic Violence Hotline.

Dave Bittner: Yeah.

Joe Carrigan: So if you're being spied upon this way, and you have the ability to terminate that relationship and you just find out about it, I don't know. That would be a deal breaker for me, Dave.

Dave Bittner: Yeah.

Joe Carrigan: And I don't want to pretend to know everybody's situation, but if you're -- I would terminate any relationship where I found that to be the case.

Dave Bittner: Yeah. Yeah. And, as you say, I mean everyone's situation is different and there are lots of people who are in situations where it is not easy to leave.

Joe Carrigan: Yeah. You can't just walk away from it.

Dave Bittner: Yeah. I wonder if you went looking for something like this on your device and you're worried that if the person who puts it on there may be displeased at finding you taking it off, I wonder if it's in your best interest to accidentally on purpose have the device be lost or destroyed or, you know, air quotes, "stolen."

Joe Carrigan: Or reset.

Dave Bittner: Yeah. Yeah. Reset. Something like -- yeah. I mean I suppose if you reset it, you could say, "Oh, I don't know what happened. I hit the wrong button."

Joe Carrigan: I had some problems with the phone. I called tech support. They told me to reset the phone. That's what I did.

Dave Bittner: Yeah. Right. But then how do you keep the person -- presumably if the person had access to the device then they're assuming that they're going to have access to the device again because, "Why don't you trust me?" You know? That sort of thing. What are you trying to hide? You know?

Joe Carrigan: I had that discussion on the phone the other day with somebody. My new cable provider. He said, "Oh, you've got to connect that device to our router." I'm like, "No. I'm connecting it to my router." And they're like, "Why not? Why not ours?" I said, "Because I don't trust you." That was the sentence that came out of my mouth. The guy on the phone was like he had never had anybody say that to him.

Dave Bittner: Right. Right. Maybe they sell or maybe they'll send you in the mail like a logoed tin foil hat. They should have those for when they run into folks like you. Yeah.

Joe Carrigan: I should just keep a tin foil hat by the door so whenever anybody knocks on it, I go, "Oh. It's a solicitor. Let me get my tin foil hat."

Dave Bittner: Yeah. The other thing I've heard that's very effective for getting people to keep their conversations with you short, this also works for keeping people from sitting next to you on an airplane.

Joe Carrigan: Boy, I really want to hear this.

Dave Bittner: It's just take a nice -- take a little piece of string and just hang it out of your mouth.

Joe Carrigan: What?

Dave Bittner: Just take a piece of string, like a piece of twine, just hang it out of your mouth.

Joe Carrigan: And that keeps people away?

Dave Bittner: Yeah. Because you open the door and who's -- you've got a piece of string hanging out of your mouth. Somebody's coming down the aisle in the airplane. They look at you, and you're just looking at them and you have a piece of string hanging out of your mouth. They don't want to get anywhere close to that.

Joe Carrigan: What? Why does that -- that kind of makes me want to sit next to somebody. There's a story here, and I want to hear it.

Dave Bittner: Okay.

Joe Carrigan: But I'm not like most people, I guess.

Dave Bittner: Yeah. Yeah. It's way -- I don't know. Just reminds me of the old "Far Side" cartoon about how nature tells you stay away.

Joe Carrigan: Nature's way of saying, "Don't touch."

Dave Bittner: Right. Yes. Exactly. That's -- that's what it is. All right. Well, yeah. It's interesting. I mean it's all -- it's always a bit chilling when we wander into this area of bad relationships and people in bad situations.

Joe Carrigan: Yeah. I don't like talking about these things, but we have to talk about them.

Dave Bittner: Yeah. Yeah. All right. Well, we'll have a link to the story in the show notes, again written by Zack Whittaker. Great journalist and author. I've had the pleasure of interviewing him a few times. Generally if he writes it, it's worth reading. So do check that out. Like I said, we'll have a link for that in the show notes. All right, Joe. It is time to move on to our catch of the day.

[ Soundbite of reeling in fishing line ]

Joe Carrigan: Dave, our catch of the day comes from [inaudible] who writes, "Hi, Dave and Joe. Big fan here. You guys as well as Dave and Ben are classic duos." You're the member of two classic duos.

Dave Bittner: Yeah. That's right. That's right. I don't -- I can't pick my favorite though, Ben -- or Ben.

Joe Carrigan: [Laughs] I think you just did, Dave.

Dave Bittner: There you go. Oh well. Cat's out of the bag.

Joe Carrigan: So [inaudible] goes on to write, "This -- this is a spam email I got a few months back. I get lots of spam emails as most people I'm sure do, but this one is funny in particular. It definitely had me spooked a little bit when I saw my old password in the first line, but I quickly realized there was no real issue."

Dave Bittner: Okay.

Joe Carrigan: So he says, "I hope -- I hope this gets some laughs. Take care." So, Dave.

Dave Bittner: All right.

Joe Carrigan: I think you can guess what kind of email this is going to be.

Dave Bittner: Yeah. Yeah. Yeah. It says, "I know your old password is your password on day of hack. Let's get directly to the point. Not one person has paid me to check about you. You do not know me, and you're probably thinking why are you getting this email. In fact, I actually placed malware on the adult vids website, and you know what? You visited this site to experience fun. You know what I mean. When you were viewing videos, your browser started out operating as a RDP having a key logger which provided me with accessibility to your display and webcam."

Joe Carrigan: He doesn't have a webcam, though.

Dave Bittner: Immediately after that, my malware obtained every one of your contacts from your Messenger, Facebook, as well as email account.

Joe Carrigan: He also doesn't have a Facebook account.

Dave Bittner: After that, I created a double screen video. First part shows the video you were viewing. You have nice taste OMG. And second part displays the recording of your cam and its view. Best solution would be to pay me $2,747. We're going to refer to it as a donation. In this situation I most certainly will without delay remove your video. My Bitcoin address. You could go on your life like this never happened and you will not ever hear back again from me. You'll make the payment via Bitcoin. If you're planning on going to the law, surely this email cannot be traced back to me because it's hacked too. I've taken care of my actions. I'm not looking to ask you for a lot. I simply want to be paid. If I do not receive the Bitcoin, I definitely will send out your video recording to all of your contacts including friends and family, coworkers, and so on. Nevertheless if I do get paid, I will destroy the recording immediately. If you need proof, reply with yeah. Then I will send out your video recording to your eight friends. It's a nonnegotiable offer and thus please don't waste mine time and yours by replying to this message.

Joe Carrigan: Don't ever reply to these messages. In keeping with the anonymous person who wrote in last week, I'm going to say that you should not reply to these messages in particular. These are all pretty much scams. It's just a basic sextortion scam: all they do is search an old breach for an email-and-password pair, then send the email to that address and include the password. It's all automated. So I would like to check the Bitcoin address, but unfortunately [inaudible] sent it to us as a picture, not a text file. So I can't check it -- because, I mean, I could go ahead and enter it. I'm not.

Dave Bittner: But you're lazy.

Joe Carrigan: But I'm lazy. That's right. Very, very lazy.
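As an aside, a first sanity check on an address like this doesn't have to touch any online service: legacy Bitcoin addresses carry a built-in Base58Check checksum that can be verified offline. A minimal sketch in Python -- it validates only the checksum of 25-byte legacy (P2PKH/P2SH) addresses, not newer Bech32 ones:

```python
import hashlib

# Bitcoin's Base58 alphabet deliberately omits 0, O, I, and l.
B58 = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def legacy_address_ok(addr: str) -> bool:
    """Check the 4-byte double-SHA256 checksum of a legacy Bitcoin address."""
    n = 0
    for ch in addr:
        if ch not in B58:
            return False              # character outside the Base58 alphabet
        n = n * 58 + B58.index(ch)
    try:
        raw = n.to_bytes(25, "big")   # version byte + hash160 + checksum
    except OverflowError:
        return False                  # decodes to more than 25 bytes
    payload, checksum = raw[:21], raw[21:]
    return hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4] == checksum

# The well-known genesis-block address passes; a one-character typo fails.
print(legacy_address_ok("1A1zP1eP5QGefi2DMPTfTL5SLmv7DivfNa"))  # True
print(legacy_address_ok("1A1zP1eP5QGefi2DMPTfTL5SLmv7DivfNb"))  # False
```

A passing checksum only means the address is well-formed, not that it belongs to anyone in particular, but it is a quick way to spot a garbled or fabricated address without typing it into a block explorer.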

Dave Bittner: Okay. Yeah. I think your point stands about the emotional response this draws when you see one of your old passwords in an email.

Joe Carrigan: Yeah. That's --

Dave Bittner: You go, "Whoa. This is the real thing."

Joe Carrigan: That [inaudible] said right off the bat he said, "That scared me a little bit." Made him -- I'm sure it made his heart skip a beat.

Dave Bittner: Sure. Yeah. But yeah. Don't fall for these things. They're just -- just scams.

Joe Carrigan: They are just scams.

Dave Bittner: Yeah. Best to ignore them.

Joe Carrigan: It is. Odd that they want $2,747.

Dave Bittner: Yeah.

Joe Carrigan: Why not just $2,500?

Dave Bittner: I'll bet you that having an oddly specific number like that is -- gets less scrutiny from spam filters.

Joe Carrigan: Maybe.

Dave Bittner: I would -- I mean that's a wild guess, but it seems plausible. All right. Well, thank you for sending this into us. Again we would love to hear from you. You can send us a potential catch of the day to hackinghumans@n2k.com.

Joe Carrigan: Send them in, folks.

Dave Bittner: Joe, I recently had the pleasure of speaking with Raj Ananthanpillai. He is CEO of a company called Trua. And we're talking about privacy concerns with ChatGPT and the other large language models. Here's my conversation with Raj.

Raj Ananthanpillai: Right off the bat, anybody using ChatGPT or any of those generative AI tools, never -- start with putting nothing, meaning nothing personal. Start there. Okay? Not even your name. So try and use it. And then as you get better and better at it, then you can say, "Okay," you can put a name in there, because these systems don't require you to identify yourself, but unfortunately people are putting in all kinds of personal information, their family information, and so on and so forth. Remember this is a one-way street. You can never go back to these generative tools and say, "Take out my name. Hey, delete my this or that." If you look at their terms of service, it's multi-page fine print, you know, you need a magnifying glass to even read it and understand it. That's how they are getting away with it. And it is a very, very dangerous tool for personal information because they take your input to fine-tune their algorithms over time. That's exactly what happens.

Dave Bittner: So you point out in some of your research that, you know, there are things that I think people will put into this that they don't consider to be personal information. For example, someone might upload a resume and say, "Help me reformat this resume." But there's all kinds of information on a resume.

Raj Ananthanpillai: Absolutely, because remember everything is scannable. Right? So they scan. I don't know how they upload it. They scan it. And then your name, the employer -- those are all what I call identifying information. At some point, right, somebody's going to use it just like KBAs, knowledge-based authentication -- remember way back when? Even some of the institutions used that. Hey, what kind of car did you drive in 1992? What color was your car? Or maybe you lived during this year. Right? Those are all historical pieces of information that were always being used for authentication purposes. And now imagine your employment history as part of your resume becoming part of that. That could be another dangerous avenue for somebody to hack and start using it, you and your profile, for fraudulent purposes.

Dave Bittner: So can we imagine that someone could go into, for example, a ChatGPT interface and say, "Tell me everything you know about Raj's work history"?

Raj Ananthanpillai: Yes. You could if it already has that. Right? I've already put in there. Even otherwise they're going to scrape it anyway. They go find different things already that's in the public domain. Right? They're going to find something. Even today you and I can Google, for example, about anybody any place. Right? Whether it's true or not, that's a different story, but they have enough information to go out and start interacting with you as if they are [inaudible]. Right? And that's all it takes for somebody to easily succumb to some of those fraudsters. Think about spoofing, that thing about phishing, right. They're being used as if it's coming from you. It's the same concept at a level that's probably a hundred times more than what you're experiencing to that.

Dave Bittner: You know, I think it's safe to say that there is true utility with some of these tools, but as you point out, I mean it -- it's so alluring to put information in there because the -- the answers you get back, I mean you hear people talking as if these things are almost therapists.

Raj Ananthanpillai: That is true. That is the thing about this ChatGPT. What happened was somehow this became a, what I call, mainstream-slash-consumer excitement. Usually these are technologies that are used by big corporations to automate stuff, to do big things. Right? Somehow it got into the mainstream consumer, you know, and everybody's just toying with it, in my opinion. That is the big thing. If you think about some of the major technologies, right, it took a long time before consumers started using them. Right? But they were not necessarily geared towards consumers.

Dave Bittner: Are there ways that people can use this sort of technology in a safer way? Can you -- is it possible to run something like this on a local instance?

Raj Ananthanpillai: Yeah. As long as the provider of the tool assures you that your personal information is going to be erased right after that, because you don't want to leave any personal information behind the scenes. Right? But they need that information to generate what we are trying to tell the tool to generate. All right? Otherwise it's going to be a garden-variety, vanilla response. So you're looking for specific output from this particular tool. So the key is to always minimize the amount of personal information. Again, what you're looking for from these tools is not necessarily anything personal. If they don't have anything personal about you, it's going to take a while before they start gathering some of your personal information. But you're going to say, "How do I reduce my anxiety?" for example. Right? "And my name is this. I work in this industry." Right? Then go. You put your name. You put your industry category. They're going to keep that for their future analysis purposes. But if you say, "I am, you know, Joe or Jane Smith, and I work in a fictitious industry," right, you don't care -- you're looking for output from it. So that is how I would approach it for a while, until this dust settles.
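Raj's suggestion -- swap your real name and industry for stand-ins before prompting -- can be sketched in a few lines. This is a minimal illustration; the mapping and function name here are hypothetical, not any real tool's API.

```python
# A minimal pseudonymization sketch: substitute stand-ins for real personal
# details before a prompt is sent to a hosted model. The mapping is hypothetical.
REPLACEMENTS = {
    "Raj Ananthanpillai": "Jane Smith",
    "Trua": "a fictitious company",
}

def pseudonymize(prompt: str) -> str:
    """Replace known personal details with placeholders before sending a prompt."""
    for real, stand_in in REPLACEMENTS.items():
        prompt = prompt.replace(real, stand_in)
    return prompt
```

The model still produces useful output -- "How do I reduce my anxiety?" works the same either way -- but the provider's logs never see the real name or employer.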

Dave Bittner: What about business information? You know, I've heard of folks taking things like annual reports and uploading them to have them reformatted or reworded.

Raj Ananthanpillai: Big -- big-time no-no, because it has some proprietary information, confidential information. That is the big no-no, in my opinion, because you are literally letting somebody else hack into it.

Dave Bittner: What are your recommendations for folks who are charged with securing their own organizations? I mean, it strikes me that if you cut off access to ChatGPT altogether, that might not be the most practical path, but at the same time you want people to be aware of the risks.

Raj Ananthanpillai: Yes. You -- you can, you know, filter and control some of those things. For example, there are a lot of [inaudible] out there, right, and first you need to develop them. Like if you're trying to go to a porn site in a corporate environment, they block those things. You do not do that, right, in many forward-looking organizations. Or if you're trying to go to a gambling site, you cannot go there from the computer. So similarly, we have provisions where, if an email has a Social Security number or any personal information as it's going out, it will grab that email right off. So similar concepts have to be derived and developed for interacting with ChatGPT, where there's company information or proprietary technology or whatever it is. You can potentially filter those things out.
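The outbound filter Raj describes -- catching a Social Security number or an email address before a prompt leaves the network -- might look roughly like this. A minimal sketch only: real data-loss-prevention products use far more robust detection, and the pattern set and function names here are assumptions.

```python
import re

# Illustrative PII patterns only -- production DLP tools go well beyond regexes.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def screen_prompt(text: str) -> list:
    """Return the names of PII patterns found in an outbound prompt."""
    return [name for name, pattern in PII_PATTERNS.items() if pattern.search(text)]

def is_safe_to_send(text: str) -> bool:
    """Block the prompt if any PII pattern matches."""
    return not screen_prompt(text)
```

A gateway would run `is_safe_to_send` on every outbound prompt and hold anything that trips a pattern, the same way email DLP grabs a message carrying a Social Security number.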

Dave Bittner: Can you share with us a little bit about the work that you and your colleagues are doing there at Trua? I mean trust and safety is at the center of what you all do there.

Raj Ananthanpillai: That is true. What we do is, we have what we call a reusable verified digital identity and [inaudible], meaning today, when you have to verify your identity or verify your background with anyone, right, you have to keep giving your personal information over and over again. Nobody has ever solved that problem, because everybody says, "Okay. I've got to put cybersecurity boundaries around it. I have to protect this PII." Everybody's collecting it; it's like a hot potato. The [inaudible] is coming down to, you know, smack all these big corporations that are collecting personal information. How about flipping the script and saying, "Okay. What if we get it once, in some safe place, right, and then reuse it over and over again without ever giving out any personal information?" That is what we do. You get it once. You can use it multiple times. And then if you have to verify my identity, or if you have to verify my background, you don't need to ask me, nor should I provide you with any of my PII like Social Security number, date of birth, address, and whatnot. Just the name, and then I have an interaction with you to say, "Okay. Here's my identification." You scan the code. It's tokenized. And then, boom. It's verified.
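The "verify once, reuse many times" flow Raj outlines could be sketched with a keyed token over an opaque record ID, so a relying party can check a credential without ever handling the raw PII. To be clear, this is not Trua's actual design -- it is a generic illustration, and every name in it is made up.

```python
import hashlib
import hmac
import secrets

# Hypothetical token flow: the verification service holds the PII once,
# and hands out tokens bound to an opaque record ID. Relying parties see
# only the ID and the token, never the underlying personal data.
SERVER_KEY = secrets.token_bytes(32)  # held only by the verification service

def issue_token(record_id: str) -> str:
    """After verifying a person once, bind a token to their opaque record ID."""
    return hmac.new(SERVER_KEY, record_id.encode(), hashlib.sha256).hexdigest()

def verify_token(record_id: str, token: str) -> bool:
    """Check a presented token in constant time; no PII changes hands."""
    return hmac.compare_digest(issue_token(record_id), token)
```

The scanned code in Raj's description plays the role of `record_id` plus token here: the relying party scans it, the service confirms it, and the Social Security number, date of birth, and address stay in one place.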

Dave Bittner: Joe, what do you think?

Joe Carrigan: You know, Dave, when ChatGPT came up and started becoming a big thing back in February and everybody was starting up accounts and all that stuff, this problem occurred to me. And I should have said something about it.

Dave Bittner: If only you'd said something, Joe.

Joe Carrigan: If only I'd said something. This text -- whatever it is you're putting into a large language model -- that goes in there and stays there.

Dave Bittner: Right.

Joe Carrigan: Right? And it is their property forever. And if you need to be convinced of that, just go ahead and try to read the EULA, the end user license agreement.

Dave Bittner: Right.

Joe Carrigan: Or maybe you can actually get through reading it. I haven't gone through these things because I really, really hate them, and I also have only played with the anonymous ones that you can play with. I haven't signed up for an account.

Dave Bittner: Right.

Joe Carrigan: And you very quickly see why, why this is the case. And Raj touches on that. He says these EULAs are like 10 pages of microscopic text that probably say, "We get to keep everything and it's all our property from now on."

Dave Bittner: I wonder what would happen if you put ChatGPT's EULA into ChatGPT and said, "Summarize this for me."

Joe Carrigan: You'd get a black hole, Dave. A black hole forms on your desktop.

Dave Bittner: Right. Exactly. You'd -- you'd get a tear in the space-time continuum.

Joe Carrigan: You end the universe as we know it.

Dave Bittner: Yeah. Yeah. We had a good run, Joe. Humanity had a good run.

Joe Carrigan: You know what? You should -- you have a ChatGPT account, right?

Dave Bittner: I do.

Joe Carrigan: You should do that -- say, "Summarize this for me" -- and see what it says.

Dave Bittner: Yeah. Okay.

Joe Carrigan: I mean I'm sure it will try to summarize it for you.

Dave Bittner: Yeah. Oh yeah. And it's very good at summarizing things, actually. That's one of the things I find ChatGPT is particularly good at. In other words, if you're not relying on it to generate information or facts on its own, but you're asking it to re-synthesize information that you provided in the prompt, it's actually exceptional at that.

Joe Carrigan: Really?

Dave Bittner: Yeah.

Joe Carrigan: Yeah. That's the -- the Ronald Reagan thing. He used to say to his -- to his entire cabinet, "I don't want a 20-page report. Give me a one-page summary, because I'm the commander in chief. You're the guy who's supposed to know everything in the report. I'm just supposed to make the executive decisions."

Dave Bittner: Right.

Joe Carrigan: And summaries are a great way to get that. So if -- if this is -- if this is the way -- a good way to get those, maybe I will sign up for an account. But keep in mind everything you put in there is going to be saved and stored forever.

Dave Bittner: That's right.

Joe Carrigan: And Raj makes an excellent point. This is a one-way street. There's no getting it back.

Dave Bittner: Right.

Joe Carrigan: It's like putting stuff on Facebook. You can say, "Go ahead, Facebook, and delete this." Facebook goes, "Sure thing, buddy." Right? And all they do is --

Dave Bittner: Like they always say on Grumpy Old Geeks, set visibility to zero.

Joe Carrigan: Right. Exactly. It's not deleted. Nothing is ever deleted. They keep all that stuff. It may be available for other users to see. Right? Like if you say, "Hey, here's my resume." And then I go in and go, "Show me Dave Bittner's resume," it goes, "Here's what I have."

Dave Bittner: Right.

Joe Carrigan: I don't know if that's -- if that's a real risk. Maybe it is. Maybe it isn't. I'm sure in some of these models it will be. But either way, it's now 100% available to the company. That becomes part of their training data. Right?

Dave Bittner: Yeah.

Joe Carrigan: Don't upload proprietary or confidential information to any of these models, ever. I mean, that is just a surefire way to lose your intellectual property.

Dave Bittner: Yeah.

Joe Carrigan: You and Raj touched on this during the conversation briefly, but you can actually run your own large language model at home. I think we talked about this last week, and you don't need a super-powerful computer. You know, a Chromebook isn't going to do it, but a gaming PC will.

Dave Bittner: Yeah.

Joe Carrigan: It will run one of these large models, particularly if you have a pretty good graphics processor. That's really the barrier to entry to these things: having a good GPU.

Dave Bittner: It's a matter of how fast you need your answer back too.

Joe Carrigan: Right. And yeah. If you don't need your answer back right away, and you can wait, you know, 15 seconds, 20 seconds, or a minute for a response, it's fine.

Dave Bittner: Yeah. All right. Well, again our thanks to Raj Ananthanpillai from Trua for joining us. We do appreciate him taking the time.

That is our show. We want to thank all of you for listening. Our thanks to the Johns Hopkins University Information Security Institute for their participation. You can learn more at isi.jhu.edu. N2K strategic workforce intelligence optimizes the value of your biggest investment, your people. We make you smarter about your team while making your team smarter. Learn more at n2k.com. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.

Joe Carrigan: And I'm Joe Carrigan.

Dave Bittner: Thanks for listening.