Pluralistic: Holy CRAP the UN Cybercrime Treaty is a nightmare

https://pluralistic.net/2024/07/23/expanded-spying-powers/

If there's one thing I learned from all my years as an NGO delegate to UN specialized agencies, it's that UN treaties are dangerous, liable to capture by unholy alliances of authoritarian states and rapacious global capitalists.

Most of my UN work was on copyright and "paracopyright," and my track record was 2:0; I helped kill a terrible treaty (the WIPO Broadcast Treaty) and helped pass a great one (the Marrakesh Treaty on the rights of people with disabilities to access copyrighted works):

https://www.wipo.int/treaties/en/ip/marrakesh/

It's been many years since I had to shave and stuff myself into a suit and tie and go to Geneva, and I don't miss it - and thankfully, I have colleagues who do that work, better than I ever did. Yesterday, I heard from one such EFF colleague, Katitza Rodriguez, about the Cybercrime Treaty, which is about to pass, and which is, to put it mildly, terrifying:

https://www.eff.org/deeplinks/2024/07/un-cybercrime-draft-convention-dangerously-expands-state-surveillance-powers

Look, cybercrime is a real thing, from pig butchering to ransomware, and there are real, global harms that can be attributed to it. Cybercrime is transnational, making it hard for cops in any one jurisdiction to handle it. So there's a reason to think about formal international standards for fighting cybercrime.

But that's not what's in the Cybercrime Treaty.

Here's a quick sketch of the significant defects in the Cybercrime Treaty.

The treaty has an extremely loose definition of cybercrime, and that looseness is deliberate. In authoritarian states like China and Russia (whose delegations are the driving force behind this treaty), "cybercrime" has come to mean "anything the government disfavors, if you do it with a computer." "Cybercrime" can mean online criticism of the government, or professions of religious belief, or material supporting LGBTQ rights.

Nations that sign up to the Cybercrime Treaty will be obliged to help other nations fight "cybercrime" - however those nations define it. They'll be required to provide surveillance data - for example, by forcing online services within their borders to cough up their users' private data, or even to pressure employees to install backdoors in their systems for ongoing monitoring.

These obligations to aid in surveillance are mandatory, but much of the Cybercrime Treaty is optional. What's optional? The human rights safeguards. Member states "should" or "may" create standards for legality, necessity, proportionality, non-discrimination, and legitimate purpose. But even if they do, the treaty can oblige them to assist in surveillance orders that originate with other states that decided not to create these standards.

When that happens, the citizens of the affected states may never find out about it. There are eight articles in the treaty that establish obligations for indefinite secrecy regarding surveillance undertaken on behalf of other signatories. That means that your government may be asked to spy on you and the people you love, they may order employees of tech companies to backdoor your account and devices, and that fact will remain secret forever. Forget challenging these sneak-and-peek orders in court - you won't even know about them:

https://www.eff.org/deeplinks/2024/06/un-cybercrime-draft-convention-blank-check-unchecked-surveillance-abuses

Now here's the kicker: while this treaty creates broad powers to fight things governments dislike, simply by branding them "cybercrime," it actually undermines the fight against cybercrime itself. Most cybercrime involves exploiting security defects in devices and services - think of ransomware attacks - and the Cybercrime Treaty endangers the security researchers who point out these defects, creating grave criminal liability for the people we depend on to warn us when the tech vendors we rely upon have put us at risk.

This is the granddaddy of tech free speech fights. Since the paper tape days, researchers who discovered defects in critical systems have been intimidated, threatened, sued and even imprisoned for blowing the whistle. Tech giants insist that they should have a veto over who can publish true facts about the defects in their products, and dress up this demand as concern over security. "If you tell bad guys about the mistakes we made, they will exploit those bugs and harm our users. You should tell us about those bugs, sure, but only we can decide when it's the right time for our users and customers to find out about them."

When it comes to warnings about the defects in their own products, corporations have an irreconcilable conflict of interest. Time and again, we've seen corporations rationalize their way into suppressing or ignoring bug reports. Sometimes, they simply delay the warning until they've concluded a merger or secured a board vote on executive compensation.

Sometimes, they decide that a bug is really a feature - like when Facebook decided not to do anything about the fact that anyone could enumerate the full membership of any Facebook group (including, for example, members of a support group for people with cancer). This group enumeration bug was actually a part of the company's advertising targeting system, so they decided to let it stand, rather than re-engineer their surveillance advertising business.
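
If "enumerate" sounds abstract, here's a minimal sketch of the general pattern - a paginated listing that answers anyone who asks, with no check on whether the asker has any business knowing. The endpoint, parameter names and enumerate_members helper are all hypothetical; this is not Facebook's actual API, just an illustration of the class of bug:

```python
# Hypothetical illustration of a membership-enumeration flaw: an endpoint
# that hands back a group's member list to any unauthenticated caller.
# The URL, parameters and pagination scheme are made up for this sketch.
import requests

HYPOTHETICAL_ENDPOINT = "https://example.com/api/group_members"

def enumerate_members(group_id: int, page_size: int = 100) -> list[str]:
    """Walk a paginated, unauthenticated listing until it runs dry."""
    members: list[str] = []
    offset = 0
    while True:
        resp = requests.get(
            HYPOTHETICAL_ENDPOINT,
            params={"group": group_id, "offset": offset, "limit": page_size},
            timeout=10,
        )
        batch = resp.json().get("members", [])
        if not batch:
            return members  # no more pages: the full roster has leaked
        members.extend(batch)
        offset += page_size
```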

The idea that users are safer when bugs are kept secret is called "security through obscurity" and no one believes in it - except corporate executives. As Bruce Schneier says, "Anyone can design a system that is so secure that they themselves can't break it. That doesn't mean it's secure - it just means that it's secure against people stupider than the system's designer."

The history of massive, brutal cybersecurity breaches is an unbroken string of heartbreakingly naive confidence in security through obscurity:

https://pluralistic.net/2023/02/05/battery-vampire/#drained

But despite this, the idea that some bugs should be kept secret and allowed to fester has powerful champions: a public-private partnership of corporate execs, government spy agencies and cyber-arms dealers. Agencies like the NSA and CIA have huge teams toiling away to discover defects in widely used products. These defects put the populations of their home countries in grave danger, but rather than reporting them, the spy agencies hoard them.

The spy agencies have an official doctrine defending this reckless practice: they call it "NOBUS," which stands for "No One But Us." As in: "No one but us is smart enough to find these bugs, so we can keep them secret and use them to attack our adversaries, without worrying about those adversaries using them to attack the people we are sworn to protect."

NOBUS is empirically wrong. In the 2010s, we saw a string of leaked NSA and CIA cyberweapons. One of these, "EternalBlue," was incorporated into off-the-shelf ransomware, leading to the ransomware epidemic that rages even today. You can thank the NSA's decision to hoard - rather than disclose and patch - the EternalBlue exploit for the ransoming of cities like Baltimore, hospitals up and down the country, and an oil pipeline:

https://en.wikipedia.org/wiki/EternalBlue

The leak of these cyberweapons didn't just provide raw material for the world's cybercriminals; it also provided data for researchers. A study of CIA and NSA NOBUS defects found that there was a one-in-five chance of a bug that had been hoarded by a spy agency being independently discovered by a criminal, weaponized, and released into the wild.
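
To see what that one-in-five figure implies, here's a back-of-the-envelope sketch in Python. The stockpile size is a hypothetical number chosen for illustration, and the bugs are assumed to be rediscovered independently; only the one-in-five rate comes from the study described above:

```python
# Rough arithmetic on hoarded bugs being independently rediscovered.
# The stockpile size is hypothetical; independence between bugs is an
# assumption; the one-in-five rate is the figure quoted in the text.
stockpile = 50          # hypothetical number of hoarded zero-days
p_rediscovery = 0.20    # roughly one-in-five, per the study cited above

expected_rediscovered = stockpile * p_rediscovery
p_at_least_one = 1 - (1 - p_rediscovery) ** stockpile

print(f"Expected independent rediscoveries: {expected_rediscovered:.0f}")
print(f"Chance at least one hoarded bug gets out: {p_at_least_one:.4%}")
```

With those assumptions, it's a near-certainty that at least one hoarded bug ends up in someone else's hands - which is the whole problem with NOBUS.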

Not every government has the wherewithal to staff its own defect-mining operation, but that's where the private sector steps in. Cyber-arms dealers like the NSO Group find or buy security defects in widely used products and services and turn them into products - military-grade cyberweapons that are used to attack human rights groups, opposition figures, and journalists:

https://pluralistic.net/2021/10/24/breaking-the-news/#kingdom

A good Cybercrime Treaty would recognize the perverse incentives that create the coalition to keep us from knowing which products we can trust and which ones we should avoid. It would shut down companies like the NSO Group, ban spy agencies from hoarding defects, and establish an absolute defense for security researchers who reveal true facts about defects.

Instead, the Cybercrime Treaty creates new obligations on signatories to help other countries' cops and courts silence and punish security researchers who make these true disclosures, ensuring that spies and criminals will know which products aren't safe to use, but we won't (until it's too late):

https://www.eff.org/deeplinks/2024/06/if-not-amended-states-must-reject-flawed-draft-un-cybercrime-convention

A Cybercrime Treaty is a good idea, and even this Cybercrime Treaty could be salvaged. The member-states have it in their power to accept proposed revisions that would protect human rights and security researchers, narrow the definition of "cybercrime," and mandate transparency. They could establish member states' powers to refuse illegitimate requests from other countries:

https://www.eff.org/press/releases/media-briefing-eff-partners-warn-un-member-states-are-poised-approve-dangerou

β€”β€”

This work – excluding any serialized fiction – is licensed under a Creative Commons Attribution 4.0 license. That means you can use it any way you like, including commercially, provided that you attribute it to me, Cory Doctorow, and include a link to pluralistic.net.

https://creativecommons.org/licenses/by/4.0/

Quotations and images are not included in this license; they are included either under a limitation or exception to copyright, or on the basis of a separate license. Please exercise caution.

