How can we make technology that frees us, rather than enslaves us? - Tor/Forge Blog

How can we make technology that frees us, rather than enslaves us?


Written by Cory Doctorow

In the Foundation series, Isaac Asimov posited three rules to protect humans from robots. As our own technology advances exponentially every day, how will we make technology that frees us, rather than enslaving us?

Let us begin by cleaving this problem into two pieces, only one of which I am qualified to address:

1. How can we make technology that works well?
2. How can we make technology that fails well?

I only know about #2.

The Second Law of Thermodynamics is a thing. Security—like all forms of experimentally derived knowledge—is a process, not a product. Computers with no known flaws are not flawless: their flaws just have not yet been discovered and reported.

Computers have metastasized. Software is eating the world. Your toaster, pacemaker, car, tractor, insulin pump and thermostat are (or soon will be) computers in fancy cases that have the power to inflict enormous pain and harm upon your person and life. It is correct to view software as a nexus of control for solving your problems. When books become digital objects, publishers attempt to solve their problems by controlling both the code embedded in the ebooks themselves and the devices that can play them back.

But those problems aren’t your problems. The fact that some publishers don’t like the used book market and perceive an opportunity to kill it by using software to keep people from giving away, selling, or lending digital books doesn’t mean you benefit when they attempt it. Their security from used books is your insecurity of not getting to read used books.

What the entertainment companies started, the rest of the world has cottoned onto. Today, a startling variety of technologies use digital countermeasures to control their owners: insulin pumps stop you from reading your coronary telemetry except by manufacturer-authorized doctors with paid-up software licenses. GM stops you from visiting independent mechanics who diagnose your engine with unauthorized tools and repair it with third-party replacement parts. Voting machine vendors stop independent researchers from validating their products.

This only works if you can’t replace the software the manufacturer specifies with software from someone else—say, a competitor of the manufacturer—that gives you back the freedom the software has taken away. That’s because the computer the software is running on is a general purpose computer: that’s the only kind of computer we know how to build, and it can run any program that can be expressed in symbolic language.

A computer that won’t obey you—a DVD player that won’t play an out-of-region disc; a phone that won’t accept apps that come from third-party app stores—isn’t a computer that’s incapable of obeying you. That computer can readily do all the things on the forbidden list. It just refuses to do them.

This is what controlling people with their computers means: designing disobedient computers that view their owners as adversaries, that obfuscate their operations from those owners, and that prefer the orders of distant third parties to the policies set by the person who holds the computer and paid for it.

It’s hard to keep people from changing the software on computers they own—even software that’s designed to hide from its owner and refuse to shut down can eventually be located and neutralized. If you let skilled adversaries play with a computer whose software is skulking in the operating system’s shadows, the skilled adversary will eventually find its spider hole and flush it out and kill it with extreme prejudice. Then that expert will tell everyone else how to do it with their computers.

So it was that in 1998, the US Congress enacted the Digital Millennium Copyright Act (DMCA), whose Section 1201 makes it a serious crime to figure out how the computers you own work and tell other people what you’ve learned. Under DMCA 1201, it’s a potential felony (punishable by a 5 year sentence and a $500,000 fine for a first offense) to weaken or bypass a system that restricts access to a copyrighted work.

Every device with software in it has a copyrighted work in it—software is a copyrighted work. Manufacturers who want to force their customers to use their property in ways beneficial to the manufacturer (and not the device’s owner) can configure those devices so that using them in any other way involves tampering with a copyright lock, which makes using your computer in the way you want into a potential felony.

That’s why John Deere tractors are designed so that getting them fixed by non-authorized repair people requires breaking a copyright lock; thus Deere can force farmers to pay $230, plus $130/hour for simple service calls. The farmers are just the start: add a vision-system to a toaster and it can prevent you from using third-party bread, and make disabling the bread-enforcement system into a felony.

As software metastasizes into every category of goods, an entertainment-industry law from the late twentieth century is turning into an existential threat to human liberty: we are being Huxleyed into the Full Orwell.

That’s for starters. But security is a process, not a product. You can only make a device secure by continuously prodding at it, looking for its defects, and repairing them before they are exploited by your adversary.

DMCA 1201 is now the leading reason that security researchers fail to disclose the vulnerabilities they discover. Once a device has a copyright-protecting lock on it, reporting that device’s defects makes you potentially liable to bowel-watering criminal and civil penalties. In 2015, security researchers told the US Copyright Office that they are sitting on potentially lethal bugs in insulin pumps and cars, on bugs in thermostats and voting machines, in entertainment consoles whose unblinking eyes and ever-listening ears witness our most intimate moments.

By providing an incentive to companies to add copyright locks to their systems, we’ve also given them a veto over who can reveal that they have sold us defective and dangerous products. Companies don’t view this as a bug in their digital monopolization strategy: it is a feature.

Isaac Asimov started from the presumption that we’d make positronic brains with a set of fixed characteristics, and that this design would be inviolable for millennia, and then wrote several books’ worth of stories about which unchanging rules these positronic brains should follow. He was wrong.

Designing computers to treat their owners as untrustworthy adversaries, unfit to reconfigure them or know their defects, is a far more dangerous proposition than merely having computers with bad software. Asimov was interested in how computers work. He should have been paying attention to how they fail.

The failure mode of prohibiting the owners of computers from changing which programs they run, and of knowing whether those computers are secure, is that those computers are now designed to control their owners, rather than being controlled by them.

This is the key difference between computers that liberate and computers that enslave.

Asimov had three laws. I propose two:

1. Computers should obey their owners
2. It should always be legal to tell the truth about computers and their security

Neither of these laws is without potential for mischief. I could write a hundred stories about how they could go wrong. But the harms of following these rules are far less than the harms of deliberately setting computers to control the people they are meant to serve.

I charge you to be hard-liners for these rules. If they aren’t calling you unreasonable, a puritan, a fanatic for these rules, you’re not trying hard enough.

The future is riding on it.


Follow Cory Doctorow on Twitter, on his website, and on the blog Boing Boing.

15 thoughts on “How can we make technology that frees us, rather than enslaves us?”

  1. As someone who has made it a hobby of destroying TPMs since buying my first computer (an Atari 800) in 1982, I could not agree more with your article.
    I shall continue to ply my trade until my last breath, stupid laws be damned.

  2. Don’t you mean “But the harms of following these rules are far [LESS] than the harms of deliberately setting computers to control the people they are meant to serve”?

      1. I agree – Cory’s current sentence seems opposite to his intention. Cory? Maybe your proof-reader goofed.

  3. Although a great deal of Asimov’s fiction is linked together in the same universe, the Three Laws were introduced in the Robot series, not the Foundation series.

    1. Was about to post the same thing. The three laws appear in the series of short stories that were turned into the novel “I, Robot” (really interlinked stories) that started with “Robbie.” There’s not a lot of robot stuff in Foundation.

      1. It’s true that Foundation stories aren’t about positronic brains and their problems, but positronic brains appear in Foundation; they also appear in far-future Robots stories that indicate that the three laws are still being built into robots millennia after their invention.

  4. Thanks for a timely piece. Now I know it is okay to be called paranoid, hater, etc. I always say better that than to be sheep!

  5. What do you suggest as a practical political solution? How would you pressure lawmakers to change the laws and be more open about security? As a non technology geek, I don’t have the capacity to mess with my computer’s code, and I don’t have the ability to be absolutely secure. I can go offline entirely, or I can work with others to change the laws.

    I think many people are aware there is a problem. We need practical solutions.

    1. I agree. I suggest supporting pressure groups that sue the USG to invalidate bad laws and pressure them to adopt better ones (and also to stop adopting worse ones):

      * EFF (eff.org)
      * ACLU (aclu.org)
      * Demand Progress (demandprogress.org)
      * Fight for the Future (https://www.fightforthefuture.org/)
      * Software Freedom Law Center (https://www.softwarefreedom.org/)
      * Free Software Foundation (fsf.org)
      * Software Freedom Conservancy (https://sfconservancy.org/)

      Many many others!

  6. I feel your pain: everybody looks at me like I am a weirdo when I say the first thing I do is root my android phone and tablet to use their full potential.

    Please vorfige me for rgammar miskates; ingiliz not is ym tanive ganluage.

Comments are closed.
