Nasir and Matt welcome security guru Daniel Libby to discuss the issues involved with the secrecy of whistleblower apps, and answer the question, "We had an issue with some customer info that was compromised. Should we tell our customers, and if so, how?"
Full Podcast Transcript
NASIR: Welcome to Legally Sound Smart Business.
This is Nasir Pasha.
MATT: And this is Matt Staub.
NASIR: And welcome to our podcast where we cover business in the news and add our legal twist and also answer some of your business legal questions at firstname.lastname@example.org. That was a pretty good intro.
NASIR: That was nice and clean.
MATT: Yeah, not too bad.
NASIR: If I may say so myself.
MATT: Yeah, it was good until you brought up the fact that it was nice and clean and good.
NASIR: I know. I can’t not comment on the intro. It sticks with me.
MATT: That’s fine.
NASIR: So, what do we have up today?
MATT: Well, this is a pretty interesting story. I was unaware of this before coming across this story, but there are a few apps out there – the ones they mention are Whisper and Secret, so you can probably figure out what these are – that are a way to communicate anonymously. I've checked them out and I still don't fully understand the purpose of them, but it's basically a way to say what you have to say and do it with anonymity. The only problem is, you know, these apps have their own privacy policies in place that basically allow them to take the information that's communicated over these apps and give it to the necessary parties. It says, you know, law enforcement, a subpoena for a civil lawsuit, or simply any accusation of wrongdoing on the service.
NASIR: That’s general.
MATT: Yeah, we touched on this last week with the Snapchat thing with information not disappearing. It’s another app that defeats its own purpose.
NASIR: Except Snapchat was violating their policy, right?
Daniel, welcome to the program.
DANIEL: Good morning, gentlemen!
I appreciate it very much. Thank you.
NASIR: I’m curious whether you’ve even heard about these apps or not. I just wonder – the fact that you may have employees that are using this to whistleblow or to share company secrets is a little scary. I don’t know.
DANIEL: It really is and I think it’s a group of folks that are building some apps that basically take advantage of a current trend as so many do. I was surprised by the amount of venture capital that was put into an app like this because there are several others that do basically the same thing.
I would like to comment just real quick on your issue of Snapchat. The funny thing in the computer forensics world is we knew that Snapchat didn’t delete those photos and everything else right from the outset. It took the rest of the country maybe 18 or 24 months to figure it out, but we knew it right from the outset – that it didn’t do what it purported to do.
MATT: That’s good.
NASIR: Well, I think you can speak very well to even things that are deleted aren’t exactly deleted, right?
DANIEL: That’s exactly right – and it depends on the operating system. Apple does a better job – and I don’t know how much time I have, but a really quick way that I explain it to a jury is: you walk into a library, and a file system on a Windows computer is basically a library. It gives you an address for where the book is on a shelf. You can go there, check out the book, not a problem.
If you don’t want someone else to have the book, all you do is remove that reference from the card catalog. Now, the book is still on the shelf but no one else knows it’s there. That’s how Windows handles deletes – it removes that reference to where the data is and opens up a spot on the hard drive for another book to be written. You’re right, nothing is truly ever deleted. You get into private browsing features that truly aren’t privacy protected. We can exploit those forensically. There are a lot of things that are touted to provide additional security to users that really aren’t, if you read the fine print – just like the apps you were discussing in your intro. If you read the fine print…
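Daniel’s library analogy can be sketched in a few lines of code. This is a purely illustrative toy model – the class and field names are hypothetical, and real file systems (NTFS, APFS, and so on) are far more complex – but it shows why a “deleted” file is often still recoverable:

```python
# Toy model of the "card catalog" analogy: deleting a file removes
# only the catalog entry, not the data sitting in storage.

class ToyFileSystem:
    def __init__(self):
        self.catalog = {}  # filename -> shelf index (the "card catalog")
        self.shelf = []    # raw storage blocks (the "bookshelves")

    def write(self, name, data):
        # Store the data on the shelf and record its address in the catalog.
        self.shelf.append(data)
        self.catalog[name] = len(self.shelf) - 1

    def read(self, name):
        return self.shelf[self.catalog[name]]

    def delete(self, name):
        # A typical OS "delete" only drops the catalog reference;
        # the underlying data stays put until something overwrites it.
        del self.catalog[name]


fs = ToyFileSystem()
fs.write("secret.txt", "confidential customer data")
fs.delete("secret.txt")

print("secret.txt" in fs.catalog)  # False: the file *looks* gone
print(fs.shelf[0])                 # the data is still there, forensically recoverable
```

A forensic examiner, in effect, reads the shelf directly instead of going through the catalog – which is why “deleted” data can still be produced in litigation.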
NASIR: Which is kind of funny – it kind of defeats the whole purpose.
Well, here are some lessons from this, I think, from a small business perspective.
The reason it’s hard for a company like this to protect its users’ information is that, if they get subpoenaed – say, over a customer or an employee who violates an NDA, a nondisclosure agreement – and someone wants to find out, “Okay, who is this person that divulged this information?” they’re going to have to comply with the subpoena or somehow fight it. But the fact that they’re not putting up any kind of resources to fight that is kind of strange in my mind.
DANIEL: Yeah, especially when they tout themselves as secret and privacy-protecting, and then you read the fine print and find out it’s absolutely not true. Yeah, the poor business owner is being literally attacked on so many levels. Employees no longer have the allegiance to an employer that they once had, and employers no longer necessarily have allegiance to employees. In the last twenty, twenty-five years, a lot of that trust has eroded, and with technology and the ease with which stealing of data, transferring of data, intellectual property theft – all those kinds of things – can happen, you can truly hurt your employer in a really big way really quick. Just ask – and I’m reluctant to use the individual’s name because I worked for the organization for twenty-five years, but I will use the name – Snowden. See how quickly he changed things – right, wrong, or indifferent. I won’t give you my opinion on what he did, but I don’t personally think it was for the better. If you’ll indulge me just for a second, we’re seeing new cryptographic protection from Al Qaeda right now. Al Qaeda is reacting, too, in the communications between their elements, to information that was leaked by Snowden.
NASIR: What’s interesting too is that, for these whistleblowers – you mentioned Snowden and so forth – there are actually a lot of laws that protect whistleblowers, both state and federal. I don’t know about Snowden’s case, but if you’re working for the federal government and you’re basically whistleblowing some kind of mishandling of funds, et cetera, there are protections for those individuals. And beyond that, if, for example, you have an employee who is complaining about some kind of labor law violation, retaliating against them – whether you’re terminating them or punishing them somehow – is also against the law.
DANIEL: And there are avenues for addressing those issues. You know, I just attended a labor law conference last week where this was brought up – how to address employee grievances and making sure employees have a method that the company supports to air their grievances, their concerns, and potential violations of law and policy. I think a lot of folks use the whistleblower route as a vindictive venue, but sometimes it’s absolutely necessary – there is an organization that is absolutely doing something wrong. You know, one of the biggest examples is in law enforcement. You don’t necessarily want to be the one ratting out all your fellow coworkers but, if there’s something that’s wrong, there needs to be a venue where that can be communicated anonymously.
NASIR: Hey, you can use Whisper.
MATT: All right, we’re going to get into the question of the day. Daniel, hopefully you’ll want to stick around here. We’ll want to get your perspective, too.
“We had an issue with some customer info that was compromised. Should we tell our customers? If so, how?”
This comes from a surf shop in San Diego.
I think, from the legal perspective, this seems like a pretty obvious answer.
I would say, yes, you probably want to tell your customers about this.
NASIR: Yeah. Actually, there’s a requirement. California passed a law, I think in 2003. And even for publicly traded companies, there’s now a requirement for certain information – I think the SEC released that. So, any time you have data that you’re supposed to be holding privately and it gets leaked somehow, you have to make that disclosure. I know it kind of sucks, but that’s the nature of it. You’ve been entrusted with this information. If you lose it, then you’re going to have to do something about it.
Daniel, what do you think? Do you have any experience with these kind of data breaches?
DANIEL: Unfortunately, a lot more than a lot of folks would like to admit. That’s true. The reason for the law – and it’s a good law – is that, in the past, it was an embarrassment and a potential hit on your corporate bottom line if you had to acknowledge a breach. Let’s take a surf shop at the low end. Usually, the common denominator in something like that is either an individual taking your clients’ private information – and that may include financial information and things like that, credit card numbers and all of that – or it was a breach of one of your servers or something like that that processes your credit cards. Now, you have a requirement, and the credit card companies are going to come down on you. It’s very, very expensive to do a thorough investigation, which is what the credit card companies require – or they will terminate your contract.
NASIR: Very good. Well, obviously, Daniel pretty much knows everything about technology and forensics but why don’t you tell us a little bit about what your company does because I know you don’t do everything. You do have some niches there.
DANIEL: Well, thank you and I appreciate the opportunity.
We’re kind of a unique firm in that, being a staunch constitutionalist, I don’t make value judgments. What’s nice about digital evidence from the forensic perspective is it’s either there or it’s not. If it’s not, why? If it’s there, what does it mean? It’s not like a soft science. When we go into court to testify, rarely do we disagree with the opposing expert. We may differ on how something got there or something of that nature, but the bottom line is, you know, we bring facts to a court or to a hearing or whatever. We do the litigation side of the world. We examine everything from cellphones to servers, video systems, audio systems.
The big thing for us – and I just invested very heavily in this – is, you know, mobile devices and things like that. We’re the only firm in the country – and I’m the only examiner in the country – that is trusted and respected to do criminal work on both sides of the aisle. I do both prosecutorial and defense work – normally, an examiner does one or the other but not both – and we do civil. We do a fair amount of employment-law-related violations. We do incident response – not from the perspective of “I have been hacked” but from the perspective of “I had an employee and I think they’ve gone bad on me, I think they’ve taken all of my intellectual property – what do I do?”
NASIR: Yeah, that’s very common.
DANIEL: I just worked a case, 225 hours and 16 days in order to get a TRO, a temporary restraining order, so that the individual who illegally took that information from his company could not then use it. It was very, very specific information and, had it gone anywhere else, the industry that he came from would have known that.
NASIR: That’s crazy.
DANIEL: It’s everything – cellphones to servers, video systems, audio systems. We work in the background, usually. A lot of folks don’t know who we are. We don’t advertise – as a general rule. Everything is word of mouth. There’s no signs on the door of our laboratory. That’s who we are in a nutshell.
NASIR: Yeah, I appreciate you coming on the show and we’ll definitely put your website and information on our show notes, of course. That’s Daniel Libby from Digital Forensics Incorporated. I think that’s our show, right, Matt?
MATT: Yeah, that’s it. Thanks, Daniel!
DANIEL: Thank you, gentlemen. I appreciate it.
NASIR: Very good.
MATT: All right, keep it sound and keep it smart.