I Have a Lot to Say About Signal’s Cellebrite Hack

This blog post is based on a talk I gave on May 12, 2021 at the Stanford Computer Science Department’s weekly lunch talk series on computer security topics. Full disclosure: I’ve done some consulting work for Signal, albeit not on anything like this issue. (I kinda doubt they’ll hire me again if they read this, though.)

You may have seen a story in the news recently about vulnerabilities discovered in the digital forensics tool made by Israeli firm Cellebrite. Cellebrite's software extracts data from mobile devices and generates a report about the extraction. It's popular with law enforcement agencies as a tool for gathering digital evidence from smartphones in their custody. 

In April, the team behind the popular end-to-end encrypted (E2EE) chat app Signal published a blog post detailing how they had obtained a Cellebrite device, analyzed the software, and found vulnerabilities that would allow for arbitrary code execution by a device that's being scanned with a Cellebrite tool. 

As coverage of the blog post pointed out, the vulnerability calls into question whether Cellebrite’s tools are reliable enough to be used in criminal prosecutions after all. While Cellebrite has since taken steps to mitigate the vulnerability, there’s already been a motion for a new trial filed in at least one criminal case on the basis of Signal’s blog post.

Is that motion likely to succeed? What will be the likely ramifications of Signal's discovery in court cases? I think the impact on existing cases will be negligible, but that Signal has made an important point that may help push the mobile device forensics industry towards greater accountability for their often sloppy product security. Nevertheless, I have a raised eyebrow for Signal here too.

Let’s dive in.

PART I.A: ABOUT CELLEBRITE 

What is Cellebrite? 

Cellebrite is an Israeli company that, per Signal’s blog post, “makes software to automate physically extracting and indexing data from mobile devices.” A common use here in the U.S. is by law enforcement in criminal investigations, typically with a warrant under the Fourth Amendment that allows them to search someone’s phone and seize data from it.

Cellebrite’s products are part of the industry of “mobile device forensics” tools. “The mobile forensics process aims to recover digital evidence or relevant data from a mobile device in a way that will preserve the evidence in a forensically sound condition,” using accepted methods, so that it can later be presented in court. 

Who are their customers?

Between Cellebrite and the other vendors of mobile device forensics tools, over two thousand law enforcement agencies across the country have such tools — including agencies in 49 of the 50 biggest cities in the U.S. Plus, ICE has contracts with Cellebrite worth tens of millions of dollars.

But Cellebrite has lots of customers besides U.S. law enforcement agencies. And some of them aren’t so nice. As Signal’s blog post notes, “Their customer list has included authoritarian regimes in Belarus, Russia, Venezuela, and China; death squads in Bangladesh; military juntas in Myanmar; and those seeking to abuse and oppress in Turkey, UAE, and elsewhere.” 

The vendors of these kinds of tools love to get up on their high horse and talk about how they’re the “good guys,” they help keep the world safe from criminals and terrorists. Yes, sure, fine. But a lot of vendors in this industry, the industry of selling surveillance technologies to governments, sell not only to the U.S. and other countries that respect the rule of law, but also to repressive governments that persecute their own people, where the definition of “criminal” might just mean being gay or criticizing the government. The willingness of companies like Cellebrite to sell to unsavory governments is why there have been calls from human rights leaders and groups for a global moratorium on selling these sorts of surveillance tools to governments. 

What do Cellebrite’s products do?

Cellebrite has a few different products, but as relevant here, there’s a two-part system in play: the first part, called UFED (which stands for Universal Forensic Extraction Device), extracts the data from a mobile device and backs it up to a Windows PC, and the second part, called Physical Analyzer, parses and indexes the data so it’s searchable. So, take the raw data out, then turn it into something useful for the user, all in a forensically sound manner. 
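To make that two-stage division of labor concrete, here’s a toy sketch of an extract-then-index pipeline. To be clear, this is my own illustration, not Cellebrite’s code; the directory layout and index format are entirely hypothetical.

    # A toy sketch (my illustration, NOT Cellebrite's code) of the two-stage
    # workflow described above: stage 1 copies raw files off a device
    # unchanged; stage 2 parses the backup into searchable records.
    # All paths and the index format here are hypothetical.
    import hashlib
    import json
    import shutil
    from pathlib import Path

    def extract(device_mount: Path, backup_dir: Path) -> None:
        """Stage 1 (the UFED role): copy raw files off the device untouched."""
        for src in device_mount.rglob("*"):
            if src.is_file():
                dest = backup_dir / src.relative_to(device_mount)
                dest.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(src, dest)  # copy2 also preserves timestamps

    def index(backup_dir: Path, index_path: Path) -> None:
        """Stage 2 (the Physical Analyzer role): build a searchable index of the backup."""
        entries = [
            {
                "path": str(f.relative_to(backup_dir)),
                "size": f.stat().st_size,
                "sha256": hashlib.sha256(f.read_bytes()).hexdigest(),
            }
            for f in sorted(backup_dir.rglob("*"))
            if f.is_file()
        ]
        index_path.write_text(json.dumps(entries, indent=2))

The second stage is where the real complexity lives: a tool like Physical Analyzer has to parse dozens of app-specific databases and file formats, and as we’ll see, parsing attacker-controllable files is exactly where Signal found the weakness.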

As Signal’s blog post explains, this two-part system requires physical access to the phone; these aren’t tools for remotely accessing someone’s phone. And the kind of extraction (a “logical extraction”) at issue here requires the device to be unlocked and open. (A logical extraction is quicker and easier, but also more limited, than the deeper but more challenging type of extraction, a “physical extraction,” which can work on locked devices, though not with 100% reliability. Plus, logical extractions won’t recover deleted or hidden files, unlike physical extractions.) As the blog post says, think of it this way: “if someone is physically holding your unlocked device in their hands, they could open whatever apps they would like and take screenshots of everything in them to save and go over later. Cellebrite essentially automates that process for someone holding your device in their hands.”

Plus, unlike some cop taking screenshots, a logical data extraction preserves the recovered data “in its original state with forensically-sound integrity admissible in a court of law.” Why show that the data were extracted and preserved without altering anything? Because that’s what is necessary to satisfy the rules for admitting evidence in court. U.S. courts have rules in place to ensure that the evidence that is presented is reliable — you don’t want to convict or acquit somebody on the basis of, say, a file whose contents or metadata got corrupted. Cellebrite holds itself out as meeting the standards that U.S. courts require for digital forensics.
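To see what “forensically-sound integrity” means mechanically, here’s a minimal sketch of the standard checksum discipline: record a cryptographic hash of every file at acquisition time, then re-hash and compare whenever the evidence is used. The manifest format here is my own invention for illustration.

    # A minimal sketch, under my own assumptions, of hash-based integrity
    # verification: the manifest (a hypothetical JSON file mapping relative
    # path -> SHA-256 hex digest) was recorded at acquisition time.
    import hashlib
    import json
    from pathlib import Path

    def sha256_of(path: Path) -> str:
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
                h.update(chunk)
        return h.hexdigest()

    def verify(backup_dir: Path, manifest_path: Path) -> list[str]:
        """Return relative paths whose current hash no longer matches the manifest."""
        manifest = json.loads(manifest_path.read_text())
        return [
            rel for rel, recorded in manifest.items()
            if sha256_of(backup_dir / rel) != recorded
        ]

Keep this check in mind for what follows: Signal’s point is that arbitrary code running on the analysis machine can rewrite a file and the recorded hash together, so a check like this one passes even after tampering.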

But what Signal showed is that Cellebrite tools actually have really shoddy security that could, unless the problem is fixed, allow alteration of data in the reports the software generates when it analyzes phones. Demonstrating flaws in the Cellebrite system calls into question the integrity and reliability of the data extracted and of the reports generated about the extraction. 

That undermines the entire reason for these tools’ existence: compiling digital evidence that is sound enough to be admitted and relied upon in court cases.

PART I.B: THE HACK

What was the hack?

As background: Late last year, Cellebrite announced that one of their tools (the Physical Analyzer tool) could be used to extract Signal data from unlocked Android phones. Signal wasn’t pleased.

Apparently in retaliation, Signal struck back. As last month’s blog post details, Signal creator Moxie Marlinspike and his team obtained a Cellebrite kit (they’re coy about how they got it), analyzed the software, and found vulnerabilities that would allow for arbitrary code execution by a device that's being scanned with a Cellebrite tool. According to the blog post:

Looking at both UFED and Physical Analyzer, ... we were surprised to find that very little care seems to have been given to Cellebrite’s own software security. Industry-standard exploit mitigation defenses are missing, and many opportunities for exploitation are present. ...

[W]e found that it’s possible to execute arbitrary code on a Cellebrite machine simply by including a specially formatted but otherwise innocuous file in any app on a device that is subsequently plugged into Cellebrite and scanned. There are virtually no limits on the code that can be executed.

For example, by including a specially formatted but otherwise innocuous file in an app on a device that is then scanned by Cellebrite, it’s possible to execute code that modifies not just the Cellebrite report being created in that scan, but also all previous and future generated Cellebrite reports from all previously scanned devices and all future scanned devices in any arbitrary way (inserting or removing text, email, photos, contacts, files, or any other data), with no detectable timestamp changes or checksum failures. This could even be done at random, and would seriously call the data integrity of Cellebrite’s reports into question.
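As for “industry-standard exploit mitigation defenses are missing”: one quick way researchers triage that on Windows is to read a binary’s PE header and see which mitigations it opts into. Here’s a sketch using the third-party pefile library; the target filename is made up.

    # Check a Windows binary's PE header for standard exploit mitigations.
    # Requires the third-party "pefile" package; the filename is hypothetical.
    import pefile

    DYNAMIC_BASE = 0x0040  # ASLR (randomized load address)
    NX_COMPAT = 0x0100     # DEP (non-executable data pages)
    GUARD_CF = 0x4000      # Control Flow Guard

    pe = pefile.PE("SomeParserComponent.dll")  # hypothetical target binary
    flags = pe.OPTIONAL_HEADER.DllCharacteristics
    for name, bit in [("ASLR", DYNAMIC_BASE), ("DEP", NX_COMPAT), ("CFG", GUARD_CF)]:
        print(f"{name}: {'enabled' if flags & bit else 'MISSING'}")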

Signal also created a video demo to show their proof of concept (PoC), which you can watch in the blog post or their tweet about it. They summarized what’s depicted in the video:

[This] is a sample video of an exploit for UFED (similar exploits exist for Physical Analyzer). In the video, UFED hits a file that executes arbitrary code on the Cellebrite machine. This exploit payload uses the MessageBox Windows API to display a dialog with a message in it. This is for demonstration purposes; it’s possible to execute any code, and a real exploit payload would likely seek to undetectably alter previous reports, compromise the integrity of future reports (perhaps at random!), or exfiltrate data from the Cellebrite machine.
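For a sense of how benign the demo payload’s final step is, here’s roughly what “use the MessageBox Windows API to display a dialog” amounts to, expressed from Python via ctypes. This is not Signal’s payload, just an illustration; the message text is made up.

    # Illustrative only (Windows-only): pop a dialog via the real Win32
    # MessageBox API. This is the harmless endpoint of the demo, not the
    # exploit itself.
    import ctypes

    ctypes.windll.user32.MessageBoxW(
        None,                                   # no owner window
        "Arbitrary code ran on this machine.",  # made-up demo message
        "Proof of concept",                     # made-up dialog title
        0,                                      # MB_OK: just an OK button
    )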

What did Signal say they’re going to do about this?

The blog post announced that going forward, the Signal app will add “aesthetically pleasing” files, periodically and at random, to Signal’s app data caches on Signal users’ phones. Here’s the last paragraph of the blog post:

In completely unrelated news, upcoming versions of Signal will be periodically fetching files to place in app storage. These files are never used for anything inside Signal and never interact with Signal software or data, but they look nice, and aesthetics are important in software. Files will only be returned for accounts that have been active installs for some time already, and only probabilistically in low percentages based on phone number sharding. We have a few different versions of files that we think are aesthetically pleasing, and will iterate through those slowly over time. There is no other significance to these files.

What exactly does that mean? Only Moxie and his team know. The rest of us are left to guess. I literally had a reporter tell me that they couldn’t tell if this part of the blog post was a joke or not.

One interpretation is that “aesthetically pleasing” means they’re image files — like, pictures of cats or something — that the Signal user never actually sees and did not actively put in app storage themselves. Another interpretation: if we assume those “aesthetically pleasing” files do what a “real exploit payload” could do, then (absent a mitigation by Cellebrite) these files could affect a Cellebrite machine if that phone got analyzed with a Cellebrite tool while those files were in app storage.

If nothing else, it means that if they follow through on what they say they’ll do, then Signal will add noise to the, uh, signal in the Signal app’s local storage on some users’ phones. But only some users, and Signal won’t know which users, and the files will change periodically, if they’re there at all. It won’t be the case that all users of Signal will have the same files added by Signal into local storage at all times going forward. 

What did Signal suggest Cellebrite should do about this potential exploit?

Here’s what Signal suggested Cellebrite should do:

Any app could contain such a file [i.e. a booby-trapped file], and until Cellebrite is able to accurately repair all vulnerabilities in its software with extremely high confidence, the only remedy a Cellebrite user has is to not scan devices. Cellebrite could reduce the risk to their users by updating their software to stop scanning apps it considers high risk for these types of data integrity problems, but even that is no guarantee.

Basically, what they’re saying is: “We’re going to screw with you for adding support to Cellebrite for Signal data. If you want to be sure of your own data integrity, your users (the cops) should stop scanning phones that have Signal installed. But even then, you can’t really be sure, because the apps that you or law enforcement deem high-risk might not be the ones poisoning your machines. The only way to be sure is for your users (the cops) to stop doing the one thing that your tools are made to do, which ultimately could put you out of business.” 

Tl;dr: “Delete your account.”

Signal went on, “We are of course willing to responsibly disclose the specific vulnerabilities we know about to Cellebrite if they do the same for all the vulnerabilities they use in their physical extraction and other services to their respective vendors, now and in the future.”

Basically, “I’ll show you mine if you show me yours.” That is not generally how vulnerability disclosure works, and AFAIK, Cellebrite has not taken them up on the offer so far.

By the way, this isn’t the first time Cellebrite’s been outed for having shoddy security. In 2017, a hacker hacked Cellebrite’s servers and “obtained 900 GB of data related to Cellebrite,” including (1) Cellebrite customers’ usernames and passwords for logging into its websites; (2) “a vast amount of technical data regarding Cellebrite's products”; and even (3) “what appear[ed] to be evidence files from seized mobile phones, and logs from Cellebrite devices.”

PART I.C: CELLEBRITE’S RESPONSE

What was Cellebrite’s actual response to the hack?

According to Vice, a few days after the blog post, “Cellebrite pushed an update to its customers … limit[ing] what products can perform a logical iOS extraction.” The company didn’t admit whether the vuln was the one Signal described. (But basically everybody assumes that’s the case.) Cellebrite did say, “Based on our reviews, we have not found any instance of this vulnerability being exploited in the real-life usage of our solutions.” A Cellebrite customer who commented to Vice said, “It appears to be an attempt to minimize the attack surface[,] not a ‘fix[.]’”

From the news reports, it sounds like Cellebrite has temporarily turned off iPhone support for the Physical Analyzer tool. (Note Cellebrite only turned off support for Physical Analyzer, even though the Signal blog post’s demo was about the UFED software and they said similar exploits exist for Physical Analyzer.) You’ll recall that Physical Analyzer is the second part of the two-part system. UFED creates the backup, Physical Analyzer parses the files. 

But even though UFED has vulns too, Cellebrite customers can still use UFED to dump the data from iPhones onto a local backup. You can back up the data, but you can’t do anything with it for now. [EDIT 5/16: This isn't accurate; I've been corrected about this by a Cellebrite user in this Twitter thread.] That’s still kinda weird: if vulns in UFED could also alter data, why keep support for UFED on? Isn’t there a risk that those data dumps could be altered? My guess: Cellebrite’s going halfsies because it would be even more disastrous for their business to yank support for both products. They’re confident enough that there aren’t any real-world exploits for UFED that they left it working for iPhones, figuring customers will want to keep preserving evidence with those data dumps (which is surely easier than keeping the phone powered on, charged, and in an unlocked state indefinitely). But they deemed the Physical Analyzer vulns more dangerous, so that’s the part they decided to pause for now. That’s just my guess. In any event, this is just a Band-Aid solution: Cellebrite will have to restore iOS support for Physical Analyzer sooner or later.

It’s like a bull that’s been locked in the fenced yard outside a china shop. It’s contained there for now. That’s not a long-term solution, and the bull might still do damage to the yard, but the owners of the china shop think the bull will probably be chill, and any damage won’t be as bad as it would be if the bull were to get inside the china shop. And to keep the bull from going inside, for now, they boarded over the door to the china shop. But inside, the shop is still full of fragile, breakable china. It won’t be safe to turn Physical Analyzer back on until they’ve converted the china to adamantium or something. (Yeah, sorry, it’s not the best metaphor.)

PART II: THE LAW

So what does Signal’s stunt mean for law enforcement use of Cellebrite?

The journalist Thomas Fox-Brewster summarized the theoretical fallout succinctly in Forbes:

“This could be a severe issue for the many police agencies using Cellebrite across the world. If a criminal can hack a Cellebrite device by running a malicious file like the one described by Marlinspike, they could spoil evidence.” 

Uh, is that legal?

No, intentionally spoiling evidence — or “spoliating,” to use the legal term — is definitely not legal. 

Neither is hacking somebody’s computer, which is what Signal’s blog post is saying a “real exploit payload” could do. It said, “a real exploit payload would likely seek to undetectably alter previous reports, compromise the integrity of future reports (perhaps at random!), or exfiltrate data from the Cellebrite machine.” All of those things are a violation of the federal anti-hacking law known as the Computer Fraud and Abuse Act, or CFAA, and probably also of many state-law versions of the CFAA. (If the computer belongs to a federal law enforcement agency, it’s definitely a CFAA violation. If it’s a state, local, or tribal government law enforcement agency, then, because of how the CFAA defines “protected computers” covered by the Act, it might depend on whether the Windows machine that’s used for Cellebrite extractions is connected to the internet or not. That machine should be segmented apart from the rest of the police department’s network, but if it has an internet connection, the CFAA applies. And even if it doesn’t, I bet there are other ways of easily satisfying the “protected computer” definition.)

So is, uh, is Signal going to update its app to make it hack police computers? Recall what Signal said about how “upcoming versions of Signal will be periodically fetching files to place in app storage...”. It’s very cutesy, coy, evasive language and it doesn’t say exactly what the hell they mean by that. They’re winking and smiling and nudging the reader instead of being clear.

They seem to be implying — or at least they seem to intend for the reader, and more importantly Cellebrite and its customers, to infer — that Signal will add “innocuous” code to their app that might, maybe, alter the data on a Cellebrite machine if the phone gets plugged into it. If they’re saying what they’re hinting they’re saying, Signal basically announced that they plan to update their app to hack law enforcement computers and also tamper with and spoliate evidence in criminal cases.

When you put it that way, it becomes clear why they were using such coy language and why I bet they’re bluffing: Those things are illegal. It’s a stunt that could get their own users in trouble (if the user gets blamed for what her phone does to a Cellebrite machine, she will be plunged into a world of pain, irrespective of whether she would ultimately be held culpable for the design of an app she had installed on her phone), and could get them in hot water (because they intentionally designed and put those booby-trapped files on the user’s phone).

Plus, admittedly I haven’t actually looked into this at all, but it seems like it could get Signal kicked out of the Apple and Google app stores, if the companies interpret this as a violation of their app store rules against malware. (It wouldn’t actually help protect privacy or free expression or human rights, as Signal prides itself on doing, if people can’t install and update the app, or if they sideload malicious fake versions of Signal that some cybercrime gang or evil government puts out there.)

So my guess is that they’re playing this nudge-wink, plausible deniability, vague language game, where maybe you might infer that they’re going to make their app hack Cellebrite machines and spoil evidence, but in actuality they never had any intention of actually doing that. It was just to mess with Cellebrite and make a point. At most, maybe they stick some files in app storage that don’t do anything malicious at all. And maybe Cellebrite’s prompt response conveniently gave Signal an out from having to follow through, on top of the plausible deniability of their cutesy evasive language.

Still, it’s a weird choice to make, for the public-facing official communications of an organization that makes an app with millions of users around the world, to kinda-sorta vaguely announce that you maybe just might redesign your app to break the law and screw with law enforcement.

(Coincidentally — or “In completely unrelated news,” to borrow Signal’s parlance — Signal doesn’t have their own in-house General Counsel. At this point, with many millions of users around the globe depending upon them for their privacy, security, and even physical safety, they really should. They could certainly afford to hire a GC: the official organization behind the Signal app, the Signal Technology Foundation, is a nonprofit, but they have a 50-year interest-free loan for $105 million from one of the billionaires who sold WhatsApp to Facebook. I must admit, though, that for all my quibbles with their comms strategy, Signal is looking more and more like the only reasonable E2EE messaging app option given the direction the alternative is taking. But I digress.)

Will this mean a bunch of defendants’ criminal cases get thrown out?

No. 

This is just a PoC. Yes, research showed there’s this flaw, but Signal’s demo is just a demo. It doesn’t mean the vuln they found was ever actually exploited in the wild. Cellebrite told their customers they don’t believe it was, though they didn’t say how they reached that conclusion. (And, well, there are obvious reasons to be skeptical of the quality of their incident response.)

But criminal defense attorneys are still going to try to make use of this — as they should; they should hold the prosecution accountable for the reliability of the evidence used against their clients. There’s a case in state court in West Virginia that already went to trial: Cellebrite evidence was introduced, the defendant was convicted, and based on this blog post, the attorney moved for a new trial and to examine the Cellebrite machine. I suspect there’ll be other attorneys filing similar motions.

My guess is that these defense lawyers are unlikely to get their clients a new trial in many, if any, of the cases where a verdict has already been returned; but that in any ongoing open cases, those lawyers have better odds of getting the court to grant them the chance to examine the Cellebrite machine if they didn’t do so before (or maybe to examine it again if they did).

The thing to understand is that the mere speculative possibility that data in a Cellebrite report might have been altered isn’t going to sway any judges. If you’re a defense attorney, just showing the court this blog post saying “oh, Cellebrite software has a lot of vulns, and there was this vuln in particular, here’s a sample exploit for it, and oh by the way maybe Signal will do something to exploit it in future versions of the Signal app”: that’s not going to be enough.

Another reason that legal challenges probably won’t go very far is that it should be pretty straightforward for law enforcement to disprove an accusation about the Cellebrite machine. In a recent legal webinar about mobile device forensics tools, the discussion touched upon Signal’s Cellebrite hack. One of the panelists pointed out that Cellebrite’s not the only game in town when it comes to these extraction tools. It’s a whole industry, it’s not just this one company, although Cellebrite is probably the best-known actor in that industry. Therefore, as the panelist pointed out, if you’re law enforcement, you can just perform the same extraction through a different program, and there won’t be a problem because this flaw is unique to Cellebrite. Sure, probably those other companies’ tools have bugs too (and they should get their act together too), but there’s been no showing that every other tool out there has an identical flaw that could be exploited in an identical way. So Signal’s hack doesn’t draw into doubt all mobile device forensics tools. 

Thus, if there’s a challenge to the integrity of Cellebrite data in a particular criminal case, the prosecution should be able to readily prove there’s no corruption by running the extraction through a second forensic program in addition to Cellebrite. Then they could compare the outputs of the two tools and see if there are differences. Or, they could just not use Cellebrite at all and use the other tool instead. (Of course, inducing the cops to stop using Cellebrite would be some sweet revenge for Signal.)
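Mechanically, that cross-tool check could be as simple as normalizing both tools’ outputs and diffing them. A sketch, assuming (hypothetically) that each tool can export its parsed messages as JSON:

    # Diff the message records recovered by two independent forensic tools.
    # The report filenames and JSON schema here are hypothetical.
    import json

    def load_messages(report_path: str) -> set[tuple[str, str, str]]:
        """Normalize a tool's report to (timestamp, sender, text) tuples."""
        with open(report_path) as f:
            report = json.load(f)
        return {(m["timestamp"], m["sender"], m["text"]) for m in report["messages"]}

    tool_a = load_messages("cellebrite_report.json")   # hypothetical filename
    tool_b = load_messages("second_tool_report.json")  # hypothetical filename

    only_a, only_b = tool_a - tool_b, tool_b - tool_a
    if only_a or only_b:
        print(f"Discrepancies: {len(only_a)} records only in tool A, "
              f"{len(only_b)} only in tool B")
    else:
        print(f"Reports agree on all {len(tool_a)} records")

A concrete mismatch list like that gives the parties something specific to litigate, which matters far more to a judge than the abstract possibility of tampering.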

In some cases, the prosecution could also potentially use witness testimony by a law enforcement officer to corroborate what’s in the Cellebrite report and show that the report is accurate. Remember, this Cellebrite two-part system works on already unlocked phones. And criminal suspects will often consent to unlock their phones when the police ask them to. (You don’t have to, you can say no, but people often say yes.) If that happened in a case where the Cellebrite data was challenged, the state could call the cop to testify and the cop might be able to say, “The defendant unlocked his phone for me and I flipped through it and saw these incriminating texts in Signal with the timestamp on them from such-and-such a date, and that was before we took the phone back to the police station and plugged the phone into the Cellebrite machine. And looking at this Cellebrite report, yep, what’s in the report matches up with my memory: same text messages, same timestamp as I remember seeing when I was doing my manual inspection of the contents of his phone.”

The point being, as I wrote on Twitter at the time, these challenges will go nowhere unless the defense can come up with some plausible actual evidence of corruption of the Cellebrite machine or the data extracted. There has to be something to show that the Cellebrite data extraction using that Cellebrite tool on this phone is not reliable. No judge will throw out evidence from a Cellebrite analysis just because Signal did a PoC.

So I hope lawyers are advising their clients that moving for a new trial, or moving to (re)examine the Cellebrite device as part of an ongoing prosecution, solely on the basis of Signal’s blog post, is a Hail Mary move that will probably go nowhere. Lawyers absolutely should do this, in order to zealously represent their clients (and put the prosecution through its paces), but it probably won’t change the result.

What if it turns out the Cellebrite data was corrupted?

Even in the unlikely event that it turns out this exploit was used in the wild, or if a Cellebrite machine otherwise turns out after testing to be unreliable, that’s not a guarantee that every criminal case out there that involved a Cellebrite device (which is probably a lot of cases) deserves to get the conviction thrown out. It may happen in some cases, but not in all.

That’s because of a doctrine in the law that, while it’s complicated in practice, essentially boils down to saying, “OK, so the unreliable data shouldn’t have been allowed into evidence. Let’s pretend that it hadn’t been admitted: would that have had a big effect on the jury’s verdict, or would it probably have been the same as it was when the unreliable evidence was let in?” If the jury probably would’ve convicted anyway even without the Cellebrite evidence, then the guilty conviction will stand.

My guess is that it’s pretty rare that the Cellebrite evidence is the dispositive crux of a case, meaning, but for evidence pulled from a phone using a Cellebrite device, the jury would not have voted to convict. Surely it happens sometimes, but not in every case. Plus, the courts have an unfortunate tendency to say “yeah, the jury would’ve convicted anyway.” This doctrine doesn’t actually come out super favorably to the defense when applied in real cases. Like many things about the American justice system, it’s not fair but it’s the reality. 

Plus, there’s a reason why juries would likely convict anyway in many cases. In a typical case, the Cellebrite data will be just one part of the evidence presented by the prosecution. In many cases the prosecution will have a lot of evidence that they can obtain from online service providers rather than just relying on the local client-side copy that was extracted using Cellebrite. That might include, for example, iCloud backups; phone records from the phone company showing whom you called or texted and who called or texted you; cell-site location information, GPS data, or other location information showing where you were when; emails from Gmail; posts and DMs from social media; and so on. Much of the data about us is in the cloud, or otherwise held by third parties and readily available to the government with the right legal process. 

Signal is an outlier in that it keeps almost zero data about its users. All they can provide in response to a subpoena or other legal process is “Unix timestamps for when each account was created and the date that each account last connected to the Signal service. That’s it.” So you can see why law enforcement would really want Cellebrite to work for Signal data in particular: a user’s phone is basically the only place to get Signal messages. (Of course, by definition there are multiple participants in any conversation, so the same messages may also live on the other participants’ phones.) Still, Signal data is often going to be one piece of evidence among many.

What’s more, digital evidence isn’t the only evidence. A typical case may also involve testimony by multiple witnesses (perhaps including a police officer who saw incriminating files on the phone while going through it manually, as noted above), a confession by the defendant, testimony by a co-conspirator who turns state’s evidence and points the finger at the defendant, hard-copy documents and other items seized with a search warrant, and so on.

PART III: WHY IT MATTERS

Why does this matter, then, if Cellebrite mitigated the flaw and nobody’s case gets thrown out?

It matters because people have a constitutional right to a fair trial, and to confront the evidence against them, thanks to the Sixth Amendment. We also have a constitutional right to procedural due process under the Fifth Amendment, meaning that if you are haled into court, there are rules; it’s not just a Kafka-esque show trial in a kangaroo court where anything goes. Signal’s hack demonstrated an important point: if you’re going to convict somebody of a crime, put them behind bars, and take away their freedom, based in part on the reports from a computer system, then at least that system should have adequate security. 

Moxie’s hack might affect past cases, though I doubt it’ll change the outcome in many, if any. And this particular finding won’t affect future cases, because Cellebrite apparently has already issued a mitigation (a stop-gap measure to stanch the bleeding). (If you’re law enforcement, you’re fuming mad at Cellebrite: what the hell did you pay them all this money for? And as said, Cellebrite will have to get Physical Analyzer working on iPhones again before too long.)

Going forward, Signal’s hack may induce more defense attorneys to demand to examine Cellebrite devices more often — not just because of this bug, but because if this exploitable bug was in there, surely there are others, too. The blog post makes it sound like Cellebrite’s security is so bad that there’s plenty of low-hanging fruit in there. 

Giving defense attorneys more ammo to push back harder against the use of Cellebrite devices against their clients is Good and Right and Just. The general point that Moxie made — Cellebrite’s tools are buggy A.F. and can be exploited in ways that undermine the reliability of their reports and extractions as evidence, which is the entire point of their existence — is actually more important than the specifics of this exploit or of Signal’s announced app update (because Cellebrite’s already kinda-sorta mitigated against this exploit).

The bigger point is that Cellebrite, like other law enforcement tech vendors, plays fast and loose with the technology that they’re selling. Law enforcement and prosecutors then rely upon these tools, despite their half-assed security, to justify taking people’s freedom away. So Signal turned the tables on them, and they showed that the emperor has no clothes. Signal did a public service by destabilizing the perception of reliability in Cellebrite. 

Even if Signal’s blog post doesn’t get anybody a new trial, it proves the point that courts shouldn’t rely so readily on Cellebrite or other such law enforcement technology. Megan Graham, who supervises a technology law & public policy clinic at Berkeley Law, discussed the likely ramifications of Signal’s hack in a thread on Twitter and comments to the press. Drawing on her deep experience working on these issues, she noted that law enforcement tech vendors’ approach to security is usually basically “YOLO.” She wasn’t surprised at how bad Cellebrite’s security is. 

As Graham said, hopefully the takeaway for judges is to dig more in the future into how reliable these law enforcement technologies actually are before allowing in evidence that was obtained using these tools. This could be a wake-up call for courts — but it’s gonna be an uphill battle.

There are big obstacles between the world as it is and the world as it should be. The courts — especially the state courts, like the one in West Virginia where that one lawyer already asked for a new trial — are very busy and very short on resources. (Graham and I both know that firsthand; we both used to clerk for federal magistrate judges.) Judges don’t have a lot of time; they have really heavy caseloads, so there’s just too much to do. They probably lack the background to understand these new technologies on their own. And they don’t necessarily have the in-house resources and personnel to do it for them, because court budgets are always strapped. That said, judges can and do attend trainings on current topics and new technologies like this... if somebody’s offering them. (And who’s offering them may be prosecutors and the vendors of these tools.)

And anyway, judges don’t run the parties’ cases for them. Judges can decide “sua sponte” (on their own) to challenge something that one of the parties says or does, they can decide on their own to tell one of the parties to do something in particular, but for the most part, objecting to something one of the parties says or does is the lawyers’ job, not the judge’s. So it’s usually going to be up to the defense lawyer in these cases to bring a challenge to a particular forensic tool or process, or challenge an expert witness’s qualifications. 

That costs time and money and resources, and those just won’t always be available to every defendant, who, let’s face it, is playing on an unlevel playing field in the American criminal justice system — by design. In our system, the cards are stacked in favor of the prosecution. And that’s why companies like Cellebrite get away with sloppy work.

So Signal’s stunt should be a wake-up call to the courts that these tools aren’t as reliable as they’re held out to be, and they really ought to be a lot more secure, and the courts should really dig in there more. But the process for actually holding these law enforcement technology vendors to account and forcing them to do better is very slow. We’re not going to see a massive sea change overnight just from this blog post. The police departments that use Cellebrite, and the mobile device forensics industry in general, are on notice that they need to get their act together. But unless and until judges and criminal defense attorneys force them to — or, perhaps more realistically, unless and until their law enforcement customers force them to by refusing to give them any more money until they step their game up — companies like Cellebrite will continue to skate by. And that’s the value of this hack. It’s one step towards forcing more accountability.

Plus, the problem is not just mobile device forensics tools

Another reason that this topic is important is that Cellebrite is not the only example of a technology that gets sold by private-sector vendors to some unit of government — law enforcement, a state administrative agency, or the courts, for example — which then gets used to help convict criminal defendants or otherwise affect people’s rights and lives and livelihoods. There’s a ton of vendors out there that sell their tools to some part or another of the country’s state, local, tribal, and federal governments. 

But their tools are often a black box. It’s not clear how they work, or whether they actually work the way the vendors say they do. The vendors aren’t above making the government customers that buy their tools sign a non-disclosure agreement saying they won’t disclose anything about the tool they’re paying for — despite the fact that the use of these tools can implicate people’s constitutional rights, which is not something you can just wipe away with a contract. 

Other examples of technologies that have been paid for with your tax dollars include:

(1) TrueAllele, a software program for helping law enforcement analyze DNA samples by using what’s called “probabilistic genotyping, which uses complex mathematical formulas to examine the statistical likelihood that a certain genotype comes from one individual over another,” which costs $60,000 to license, and whose vendor has fought tooth and nail to keep its source code from being examined by criminal defense counsel, citing trade secrecy (but, in at least one recent case, losing);

(2) Stingray devices, aka IMSI catchers — devices used by law enforcement which mimic cell phone towers and force the phones of everybody in the area to connect to them, whose vendor made law enforcement agencies sign NDAs that led the agencies to outright lie to courts in multiple cases about the very existence and use of Stingray devices in criminal cases; and

(3) algorithms that are used by the state to make decisions about such crucial choices as:

- whether arrestees who are in jail should or shouldn’t be let out on bail until their trial; 

- whether employees should be given or denied a promotion; and

- whether benefits applicants should get benefits such as Medicaid, Medicare, unemployment, and Social Security Disability Insurance. 

Understandably, the people who are on the receiving end of these technologies want to know how the tools work. So their lawyers push for access to peek inside the black box and examine the hardware and/or software and even the source code, to see how it works and try to figure out if it’s flawed. And often the response, as with TrueAllele and Stingrays, is that either the state, or the private-sector vendor that makes the tool, or both, will try to keep what’s under the hood a secret. They try to keep the black box closed. These black-box challenges have been a long, hard slog for advocates fighting for more transparency and fairness, and the win rate is far less than 100%.

Signal basically short-circuited that whole process by just getting their hands on this piece of law enforcement technology, tearing it apart, and publishing some of what they found. That set the stage for additional or renewed challenges by defense counsel to demand to examine the tools.

This is why white-hat security research (and updating the law to protect it) is so important. Private-sector vendors like Cellebrite that sell their technology to the public sector have a good thing going: they get that sweet sweet taxpayer money on those contracts, they have little incentive to dot their i’s and cross their t’s in terms of product quality so long as the customer is satisfied (because the people who are subjected to their tools are not the customer), and they can sometimes get away with keeping their tools’ inner workings a secret. But white-hat security research doesn’t necessarily color inside the lines that the vendor dictates, and it can be an important way to get crucial information about these gov-tech tools that the vendor would not share willingly.

In sum, Signal’s hack was a stunt that has already been mitigated and probably won’t set anybody free from prison. But that doesn’t mean this was all in vain. The silver lining is that hopefully white-hat security research like this will push criminal defense lawyers, courts, and law enforcement agencies to make vendors like Cellebrite do a better job... ideally before the black hats take advantage of their sloppiness.

PART IV: LIFE LESSONS FOR ENGINEERS & SIGNAL’S SKETCHY OPTICS

Yes, Signal did a cool hack. Overall it was a prosocial move. But it was also a stunt, and Signal’s blog post was vague and confusing and seemed to suggest, in a cutesy, plausibly deniable way, that the Signal app is going to be updated so as to hack police computers.

So while computer security folks were giggling at Signal’s cute, clever blog post, lawyers like me were sighing. Why? Because of an important life lesson that engineers typically don’t understand: Judges hate cute and clever. 

In general, if you do something very clever and you show it off in a cute presentation, it won’t go over well with a judge. They have no patience for stunts and showboating. The courtroom is not the stage at DEF CON. And judges do not like mealy-mouthed vague statements that are designed for plausible deniability. 

Trying to find the edges of the law using technology will not make a judge, or prosecutors for that matter, shrug and throw up their hands and say “Wow, that’s so clever! You sure got us.” They won’t reward your cleverness with reddit coins and upvotes and retweets. They will throw the book at you. (As usual, xkcd said it best, not once, but twice.) 

The law just does not work the way engineers assume it does. Having that pounded into you by law professors is the easy way to learn how the law really works. You wouldn’t like the hard way.

Signal maybe could’ve handled this situation a little differently

Allow me to just keep being a total wet blanket and demonstrate conclusively that my years as a corporate lawyer ate all the parts of my soul that used to have a punk-rock hacker ethos.

On the one hand, Signal threw sand in the gears of the machine that helps repressive governments persecute their own citizens, and that helps our own government put people behind bars on the basis of machines that are riddled with bugs. That is laudable.

On the other hand, although this was serious work with a serious point to it, the unseriousness of Signal’s tone in the blog post and video hampered public understanding of the point they were making. You aren’t helping your cause when a reporter can’t tell which parts of your blog post are jokes and which parts are serious, or what you mean by your weird coy phrasing. This blog post was plainly written in order to impress and entertain other hackers and computer people. But other hackers aren’t the real target audience; it’s lawyers and judges and the law enforcement agencies that are Cellebrite’s customers. They tend to prefer clear communication, not jokes and references to 25-year-old cult films.

In addition to my feeling that an organization with millions of users probably shouldn’t casually suggest in an official public communication that it might go do illegal stuff that’ll get it and its users in trouble, here are my other reasons for tone-policing that blog post and video:

Encryption is in grave danger worldwide. Unless you’ve been living under a rock for the last several years, you know that the legal right to offer strong encryption for digital devices and communications is under severe threat in the U.S. and elsewhere. I’ve been fighting for years against terrible policy proposals (sometimes in vain). And I’m deathly scared that this year or next will be the year that E2EE just plain gets banned in markets as major as India, the UK, and the EU.

In particular, child sex abuse material (CSAM, otherwise known as child pornography) has become the cause célèbre that governments in such places as the US, the UK, Australia, New Zealand, India, Japan, and the EU are holding up as the reason to finally ban strong encryption once and for all. Their major talking point is that E2EE messaging apps get used by child predators. Meanwhile, guess what? According to a 2019 study, Cellebrite UFED/Physical Analyzer is the most popular digital forensics tool for investigating CSAM cases. And now here are the makers of Signal — one of those very same E2EE messaging apps those governments demonize for enabling CSAM — voluntarily taking it upon themselves to demo an exploit against that exact same #1 choice, UFED/Physical Analyzer, with the potential result of making it harder to admit evidence against child predators in court — all basically in retaliation for a personal grudge. You can imagine how law enforcement might use this hack, not to call up Cellebrite and demand a refund, but to paint Signal, and by extension all E2EE messaging app makers, as being on the side of the child predators.

And yet, despite the looming existential threat to end-to-end encryption, the heart and soul of the Signal app, it seems like Signal’s blog post was tailor-made for the FBI director to read it into the record the next time he testifies to Congress about the need for backdoors in both device encryption and E2EE messaging services. Top officials from the FBI and DOJ like to give speeches where they accuse companies that offer strong encryption of being irresponsible and un-American, because according to their view of the world, they’re just doing it to screw with law enforcement. Now Moxie has confirmed that talking point for them. 

Like it or not, Cellebrite is a safety valve. The existence of Cellebrite devices has served as, I think, a safety valve that has kept backdoor mandates from being imposed on smartphone manufacturers in the U.S. to date, despite the occasional effort to do so. As said, Cellebrite makes software for doing extractions from locked phones, not just unlocked phones. They leverage the fact that hardware and software will always have bugs, despite developers’ best efforts to the contrary. It’s kind of a cat-and-mouse game between Cellebrite and Google and Apple: Cellebrite exploits some flaw in the mobile operating system, then Android and iOS patch it, then Cellebrite finds another way around that, and so on.

So long as it’s possible for Cellebrite to leverage flaws in software to extract data from locked phones, and so long as Cellebrite devices are in use by literally thousands of police departments around the country, then there’s an uneasy dynamic equilibrium at work, where Apple and Google can keep doing the best they can at device security, and Cellebrite and their ilk can keep doing the best they can to find the flaws in it. This uneasy equipoise has even more force when paired with the ongoing availability of digital evidence from other data sources like iCloud backups. I think all that has helped to keep really bad anti-encryption legislation at bay, at least in the U.S.

But if Cellebrite machines stop working reliably, or the evidence obtained from them is hella sus and can’t be relied upon in court, then that safety valve — the ability for the cops to get courtroom-worthy evidence off phones notwithstanding strong encryption — gets plugged up. And closing the safety valve adds more pressure. It’ll become easier for law enforcement to make the case for why smartphone encryption needs to be backdoored.

Of course, if the problem is that Cellebrite has lax security that makes their products unreliable in court, the answer isn’t “well then Apple and Google should have to backdoor their smartphone encryption,” it’s “Cellebrite should fix its shitty security.” But that might or might not be how members of Congress see it. It’s definitely not how the director of the FBI will frame it to them.

The timing looks kinda fash. I also think the timing of Signal’s blog post was suboptimal. Why? Because Cellebrite devices were used in some of the criminal cases against the Capitol rioters, to extract data from their phones after they were arrested. It’s still early days in those criminal prosecutions, those cases are still ongoing, and there are hundreds of them. (I don’t know how many of them involve Cellebrite evidence.) The DOJ is already stretched extremely thin because of how damn many of these cases there are, and if even a fraction of those defendants got Cellebrited-upon, and they all decide to file a motion to examine the Cellebrite device and throw out the Cellebrite evidence, that will add further strain. 

Now, don’t get me wrong, I’m no fan of the DOJ, as you may have guessed by now. But I also don’t like seditious fascists, and I think the people who tried to violently overthrow our democratically-elected government should be caught and held accountable. And the timing of this blog post kinda makes it look like Moxie — who is famously an anarchist — is giving the fascists ammunition in their legal cases to try to get off the hook. As said, I don’t think it’ll work, and even fascists deserve due process and not to be convicted on the basis of bug-riddled spy machines, but it’s helpful to them nonetheless.

Every tool is dual-use, of course, so a hack to try to reduce the snooping power of the police, or to keep oppressive governments from persecuting their innocent citizens, will also help the bad guys. That’s also the same fundamental trade-off underlying Signal, after all: making it way harder to surveil people’s conversations means you also end up helping some bad guys. I believe Signal provides a net good for democracy and humanity, but there is a downside too. That’s something that doesn’t come up so often when Signal talks about privacy and freedom and human rights. 

CONCLUSION

If you work at Signal, or some other E2EE messaging app, you are providing a net good by improving people’s privacy, security, etc. … but you know that not all of your users are using your service for good. That means it’s on you to figure out how you can reduce people’s abuse of your service to harm other people — not just how you can help people use your service to keep from being harmed.

If you work at Cellebrite, on the other hand: get down off your high horse, stop it with the “we’re the good guys” shtick, quit selling to authoritarian governments, and for god’s sake, fix your shit.

