
Privacy Please
Tune into "Privacy Please," where hosts Cam and Gabe engage with privacy and security professionals around the planet. They bring expert insights to the table and break down complicated tech into terms everyone can understand.
S6, E239 - Russian Hackers, Leaked Military Secrets, and Your DNA
Privacy threats continue to escalate as human error undermines even the most secure systems, from military officials accidentally exposing classified information to Russian hackers targeting encrypted messaging apps.
• Signal security breach occurred when defense officials accidentally added a reporter to their encrypted group chat discussing sensitive military operations
• Russian-linked attackers targeting Signal users through QR code vulnerabilities, tricking users into linking their secure accounts to attacker-controlled instances
• QR codes present broader security concerns as users can't verify where they lead before scanning them
• Attackers can place malicious QR codes over legitimate ones in public spaces like restaurants and airports
• 23andMe's bankruptcy raises critical questions about the fate of genetic data from 15 million users
• When companies holding sensitive personal information go bankrupt, data ownership and protection become uncertain
• Human error remains the primary vulnerability in most privacy and security systems
• Always consider the long-term implications when sharing personal information with any service
Remember to think beyond the present when sharing your data: consider what might happen to that information 10, 20, or even 30 years from now.
All righty then. Ladies and gentlemen, welcome back to another episode of Privacy Please. Cameron Ivey here, alongside Gabe Gumbs. We've got to catch up a little bit, Gabe, because last week things were just busy. I know people didn't get to hear your voice, but I talked about three different privacy threats across multiple different applications. Just to recap, I talked a little bit about 23andMe going bankrupt, then a little bit about the Honda fine under the CCPA related to complicated opt-out processes, and then also about the Signal app. Let's dive into the Signal part, because I wanted to talk to you about the security side of things. We've talked about Signal quite a few times on the show. It's an encrypted messaging app; use it at your own will, I guess. But what were your first thoughts when you heard about this?
Speaker 2:I mean, there were a couple of layers to this, right? So the first kerfuffle, and I got to use that word today.
Speaker 1:It's a good one.
Speaker 2:Yeah, it's a good one. There were some senior folks in the administration discussing some sensitive topics around some national security issues and some attacks, conventional warfare attacks, and someone accidentally added a reporter into that thread. Then after that, there was an announcement that there were attackers, specifically Russian-linked attackers, that were targeting Signal for exploitation. Now, that exploit was actually originally covered by the Google Threat Intelligence Group back in February of this year, so it actually wasn't new news at all that threat actors were targeting Signal. And here's the thing. Let's break down what Signal is a little bit more and where the threat lies.
Speaker 1:Wait, before you dive into that. I don't know why this is funny to me, but it just made me think of Russian spies and how it's 2025 and we're still dealing with Russian spies.
Speaker 2:Spies everywhere. It's just spy versus spy, everywhere, I guess.
Speaker 1:Well, let's dig into it, Okay.
Speaker 2:Spies aren't going anywhere. Signal, as you mentioned, is a secure, encrypted messaging and communication platform. You can text, you can talk, you can group chat, you can do all that good stuff. It does have a couple of flaws, though. The first flaw is that it's not protected from human error, as nothing is.
Speaker 2:And so accidentally adding someone to a secure group chat, well, that's just accidentally adding someone to a secure group chat. The real question is why were these folks not using a sensitive compartmented information facility, aka a SCIF, for this type of communication? Why were they using Signal? Presumably because Signal is quite trusted as a secure communication application. But one would assume that for that kind of top secret information, Signal would not have been an approved channel. So there's that problem.
Speaker 2:The second problem is the attack itself. The attack vector isn't against Signal directly, it's against how you use Signal. You can link your Signal account to different devices so that you can chat on your laptop and also chat from your phone. The attack targets individuals and gets them to use a QR code to link their Signal account to an attacker-controlled instance. So you see, they didn't hack Signal itself.
Speaker 2:They basically still just exploited the human weakness in all of these technologies, which is why SCIFs tend to be completely isolated, even from the regular internet, such that you can't cross the streams. So there are two layers of problems happening here, and they all come down to human error. At the end of the day, the weakest link in any technology and in any attack vector is the human problem. And so, knowing that senior officials were leveraging a platform that is being targeted, and look, we have to assume that that platform has always been targeted, because it is a secure platform. Every spy, I'm certain, has been targeting it for exploitation.
Speaker 1:It's the challenge of trying to get into something that is secure. That's like candy for the people that do this for a living, for a spy.
Speaker 2:I can't get into it? Yeah, right, that's where I want to go. I want to get right to that information.
Speaker 1:For sure.
Speaker 2:Yeah, for sure. So, from my perspective, is Signal still secure? I think, for anything below nation-state communications, sure, absolutely. I'm still not necessarily worried about it. Scanning QR codes, though, look, we've talked about that in the past too. I'm just not convinced QR codes are worth it. I don't like them as a solution to the get-a-quicker-way-to-a-link problem. I get it, and it is quicker, don't get me wrong, but you can't get me to scan one. Really, you can't.
Speaker 1:So wait, wait. I know that there was something with the QR codes with Signal, but how does that relate to scanning a QR code? Are you saying, in general, if you have Signal and you use your camera on your phone to scan any kind of QR code, that makes you vulnerable to these types of hackers?

Speaker 2:I'm saying when you look at a QR code, you actually have no idea where that thing is taking you.
Speaker 1:True.
Speaker 2:I can write below the QR code www.google.com. That doesn't mean the QR code is taking you to www.google.com, right?
Speaker 1:And it's used a lot. It's used if you want to connect your phone to your Netflix account on your TV, things like that.
Speaker 2:So COVID introduced QR codes at a lot of restaurants, because we weren't using physical menus for a while. So it's still a bit persistent in the wild, so to speak. If I walked around town and just slapped up a QR code that takes you to a site I control that delivers a payload to your device, there's no shortage of people that are just going to scan it, just because. Now, if I get more intentional with that, maybe I go to a restaurant and I put that QR code over the QR code that the restaurant had there. Or airports have them now too, tons of airports have them. You pull up to the restaurant inside the airport and there's a QR code in front of you, and you scan that to order and pay. So I create a site that looks exactly like theirs, and I create a QR code that takes you to my site that looks like theirs.
Speaker 1:I go to the airport and I just put those over every place, man.
Speaker 2:That's like the new gas station credit card skimmer thing. That's it, yeah, right.
Speaker 1:That's a way broader scope, too, for people to exploit. That's dangerous. It's genius, though. It's smart, right?
Speaker 2:You just shouldn't go around scanning random things.
Speaker 1:No offense to people, but there's a lot of dumb people out there. Yeah, well, there's a lot of trust, and it's easy, the convenience of it.
Speaker 2:You know we've collectively and socially asked people to trust it also.
Speaker 1:Like you know again.
Speaker 2:COVID really drove a lot of trust in QR codes. People are like, oh, another QR code, I'll scan that thing. So if I did that at every place where there's a QR code, and I'm just going to use that airport reference again, and you use Signal and you scan that QR code and I get it to link your Signal account to my controlled Signal instance, boom, I'm in. But you know, the thing is, best we can tell, in the case of that sensitive information getting out from those government officials, someone literally just fat-fingered adding another contact to the group.

Speaker 1:That's what it sounds like. It was human error, which, I mean, I don't know which is worse: admitting that it was an accident, or that they were hacked.
Speaker 2:Man, I almost prefer to know that it was just human error, right? And when I say prefer, it's because you're never going to train humans out of this, but it certainly points to the necessity to leverage a technology stack that tries to eliminate as much human error as possible. Hence the reason SCIFs existed in the first place, right? So that even if someone messes up, there's a technology backstop that just doesn't allow that to happen.
Speaker 2:On the other side of that coin, it's tough for me to call it a vulnerability. Yeah, we can argue it is a vulnerability, but again, tricking someone into clicking a link is just another form of phishing, really, and phishing is wildly successful. It is still the primary attack vector for most ransomware: link clicking, just clicking on links. The old saying: you don't hack in, you log in.
Speaker 1:Yeah, get them to log in for you. That's why it'll never go away, because it's just adaptable.
Speaker 2:Yeah.
Speaker 1:Yeah. So, just for reference, if anybody isn't too familiar with this situation: the encrypted Signal app is what Defense Secretary Pete Hegseth and other leading national security officials within the administration used this past month to discuss bombing Houthi sites. The Atlantic's editor-in-chief, Jeffrey Goldberg, was inadvertently added to the group, as Gabe mentioned, and was privy to the highly sensitive discussions. They're calling it a spillage. Gabe, are you familiar with that term? It can be a career-ender for a military officer. Spillage, yeah.
Speaker 2:I mean, I'm familiar with that term in different contexts. I presume that in this context it means, yeah, information spilled. Was it really spilled, though, or did someone just, like, add an extra straw to the glass that wasn't supposed to be in there?

Speaker 1:Well, hopefully the full truth will reveal itself at some point. So how do you look at this? Human error is the culprit here, but is there any advice that you have for anyone that uses Signal for sensitive information? Anything to leave the listeners with on this entire situation, a conclusion of how you feel about it?
Speaker 2:Depending again on what you're using Signal for. I personally wouldn't link Signal to multiple devices in the first place; I individually do not practice that. So if something else showed up as a linked device, that should be a flag. But back to those QR codes: I'd be super hesitant to scan any QR code.
Speaker 2:You know, when you scan a QR code, like in your camera, you can see the little link that it shows first. I don't know if you can actually filter that through something to check it first. I don't know if you can, like, send the QR code to VirusTotal first or something. But if that's a thing, we'll look into it and maybe report back. Maybe that becomes a better way to solve that problem. The phone makers, it seems like that's a natural place for them to insert some help for the rest of us. When you pick up your iPhone, maybe when it looks at a QR code it should first run that through VirusTotal and tell you what's happening.
Speaker 2:That's not 100% assurance that you're not going somewhere naughty but certainly it would be helpful.
Speaker 2:I would avoid QR codes best I can. It's tough, because I'd be lying if I said I have never created a QR code for others to scan. I have, and I know others will trust it. But it's a true statement that someone could hijack those QR codes as well. So I always practice putting the actual link next to or below the QR code if I create one, so someone who doesn't want to scan it can just manually enter it.
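The check Gabe is describing, comparing where a QR code actually decodes to against the domain printed beside it, can be sketched in a few lines of Python. This is a minimal illustration, not a real scanner: the function name, the example domains, and the https-only rule are all assumptions, and reading the payload out of the image itself would need a separate QR-decoding library.

```python
from urllib.parse import urlsplit

def qr_url_looks_legit(decoded_url: str, printed_domain: str) -> bool:
    """Hypothetical check: does the URL a QR code decodes to actually
    belong to the domain printed next to it? The printed label proves
    nothing by itself; only the decoded payload matters."""
    parts = urlsplit(decoded_url)
    # Reject anything that is not plain https.
    if parts.scheme != "https":
        return False
    host = (parts.hostname or "").lower()
    expected = printed_domain.lower().lstrip(".")
    # Accept the exact host or a subdomain of it, nothing else.
    return host == expected or host.endswith("." + expected)

# The label under the code might say "google.com", but the payload
# can go somewhere else entirely (a made-up attacker domain here):
qr_url_looks_legit("https://g00gle-menu.example/order", "google.com")  # False
qr_url_looks_legit("https://www.google.com/maps", "google.com")        # True
```

A camera app or phone maker could run this kind of comparison automatically before offering to open the link, which is roughly the backstop Gabe wishes existed.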
Speaker 1:Yeah.
Speaker 2:Yeah.
Speaker 1:So businesses, make sure you have that option, just in case, for people that actually care about not scanning things and keeping their stuff safe. Okay, well, let's move on to 23andMe then. Talk about spillage. I mean, all of these companies, I feel like, were started just so they could get everybody's DNA anyways.
Speaker 2:I'm inclined to agree with you. I'm very much inclined to agree with you. And the question then becomes, in their filing for bankruptcy, who owns those assets? Do they just get destroyed? Probably not. Someone probably buys them out of bankruptcy, takes them over, and then what happens to that data? Is the use of that data transferable to the new entity for a new purpose? This kind of goes back to some of the language that GDPR was originally formulated around, right? The original purpose of collection and processing. If that original purpose, and the entity that was granted the rights to collect it in the first place, changes ownership, what happens?
Speaker 1:It's a good question. My thought when I first heard about it, you read the headline, it's like: 23andMe's recent bankruptcy announcement set off a wave of concern about the fate of genetic data for its 15 million users. Who's actually concerned about this, though? Because I guarantee you people are like, oh crap, I forgot about 23andMe. I actually used that. Now I care about my privacy, because they're going bankrupt. I mean, you probably shouldn't have given them your DNA in the first place if you're worried about your privacy.
Speaker 2:It's not untrue, but it's hard to expect the average consumer to have really thought about it that way. There's also the inverse of that, which is that a lot of people were using services like those to reconnect with family members. Right, yeah, good point. And to learn whether or not they maybe had genetic markers for cancers and things like that.
Speaker 1:Which is why it was a perfect product to create, because people are so curious.
Speaker 2:Yeah.
Speaker 1:And I wonder if it was created for malicious intent in the first place. I'm going to argue that it wasn't.
Speaker 2:Yeah, I hope not, but I would also argue that it should have been known that malicious use was always possible. Always possible. And it depends on where you want to draw the line of malicious. If that data ends up in the hands of, say, a healthcare organization who simply uses it to deny coverage, because, oh look, you have a DNA marker that says you might have an issue.
Speaker 1:Like we've talked about in the past, I don't think a lot of people create these kinds of companies for evil. I think it's smart, because again, they're thinking, well, if we can get their DNA and help them find family members, people are going to be really interested in that. This could be a moneymaker. Clearly, going bankrupt, it kind of died out. I don't know the real reason, but usually it's that you're not making enough to keep it going.
Speaker 2:That's usually the reason. Sales aren't going well, yeah.
Speaker 1:Well, any other thoughts on this? I mean, there's so much going on in the realm today.
Speaker 2:I think the key takeaway from our two topics today is a friendly reminder that security and privacy really do still have that weakest spot: humans being involved in the decision-making process. So we should always be mindful of our activities. It shouldn't be that we have to think, if I sign up for this service, what does that mean 20 years from now? But I think we now live, and have lived for a decent amount of time, in a world where that should be a question we ask ourselves. When I sign up for a service and give information to company A, what becomes of that information in 10, 20, 30, 40 years? We should be thinking about those things as individuals.
Speaker 1:Yeah, because you're not only trying to protect yourself, you're trying to protect your family name and your children. It's just all digital now. Think of it as a treasure map to your family's riches. Sorry, I've been listening to a story about the Batavia. You know that story?
Speaker 2:I'm not that familiar.
Speaker 1:It's a good one, if you haven't heard it. It's from the 1600s, the wreck of the Batavia, where the boat starts to sink with all these innocent families aboard, because these workers, pirates or whatever, would take their families along when they moved to wherever they were going to work. They crashed on a coral reef, the boat started sinking, so they went to nearby islands, and it just turns into a massacre. You'll have to look it up. I didn't know about it either, but it's a pretty fascinating story.
Speaker 2:It makes you think that long ago, people were horrible. I mean, the 1600s sound, generally speaking, like a rough time to have been alive. Whenever I ask the question, would you rather live in the future or the past, it's always the future. Always, yes. Totally the future.
Speaker 1:Could you imagine living in the Black Plague times? No, I could not.
Speaker 2:I absolutely could not, I absolutely do not want to, and there may be a Black Plague in the future. But the thing about the past is I already know how much shenanigans have occurred, and I'm not interested in encountering that. Yes, there will be shenanigans in the future too, but I'll take my chances with those.
Speaker 1:I'll take robots over the Black Plague, thanks. Right, right. Well, anyways, good stuff.
Speaker 1:All right, Gabe, thanks for digging into that with your thoughts, if anybody listening wants more on these topics. And on the stuff going on with the Honda fine and the CCPA, I mean, that one's pretty self-explanatory. They were just using outdated technology for opt-out mechanisms, making it too difficult for consumers. If we learn anything here in privacy and security, it's this: make opt-outs and opt-ins simple for consumers. Don't make it so difficult that people don't realize they're giving up their information. It shouldn't be a quiz. It shouldn't be a puzzle or a maze. It should be a simple swipe right or swipe left, just like those dating apps. There it is. Anyways, thanks for listening. We'll see you guys next week.