When Tesla announced recently that it would make "full self-driving capability" available on new S and X models for a mere $3,000 extra, optimists hailed the development as a watershed moment for consumer technology.
But embedded in the press release was a little-noted catch: While customers may use the self-driving feature to pick up friends or family members, "doing so for revenue purposes will only be permissible on the Tesla Network, details of which will be released next year." If you want to earn some extra money — perhaps to help pay off your robot car's $75,000 price tag — by selling rides to strangers, you won't be allowed to use Uber or Lyft.
On its face, this demand may seem ridiculous. For over a century, buying a car meant that as soon as you left the lot, you could drive it wherever and however you liked as long as you obeyed traffic laws. Yet, because of the way copyright law handles software, Tesla may have wide latitude to control your behavior by conditioning use of the products you presumably own on your willingness to follow its (literal) rules of the road.
Tesla is not the first company to use its power over software to restrict what consumers can do with the products they buy. John Deere prohibits farmers who buy its software-enabled tractors from doing their own repairs. General Motors informed the U.S. Copyright Office that motorists who purchase their cars "mistakenly conflate ownership of a vehicle with ownership of the underlying computer software in the vehicle"—even though, without the software, your GM car is essentially a giant paperweight. Keurig sold coffee makers that deny you your daily fix unless you use its proprietary coffee pods. HP and Lexmark printers are programmed to reject other makers' ink cartridges. Even Google, considered by many to be one of the more open tech firms in the country, permanently disabled every single Revolv home automation hub sold by its subsidiary Nest when it unilaterally decided to pull support for the device.
How did we end up in a world where device makers can dictate how we use the products we buy and reasonably believe we own? Starting in the 1980s, software companies began routinely attaching End User License Agreements to their programs without much pushback from regulatory agencies or courts. Since then, EULAs have expanded both in their length and their ubiquity. Today, these catch-all documents are flooded with thousands of words of legalese attempting to minimize corporate liability and limit consumer rights. What's more, EULAs are no longer confined to standalone software products. Our phones, watches, TVs and even our vehicles come with complex license terms attached.
Although few if any of us ever bother to read these agreements, there is one consistent message in nearly all of them: Software isn't sold to you, it is merely licensed. From your copy of Microsoft Office to the code embedded in your Tesla, smart TV, or even the latest Barbie doll, these license terms insist that you don't own the copies of code that make your devices work. You've just been granted temporary permission to use them, even if you paid the same price that you used to pay to own these items outright, and sometimes even more.
As if that weren't bad enough, Digital Rights Management (DRM) technology allows device makers to actually enforce restrictions through code. In the 1990s and early 2000s, DRM was deployed to prohibit everything from having devices read e-books aloud, to opening your garage door with a competing remote, to using video game cheats, to fast-forwarding through DVD previews.
Fortunately, we have of late seen some attempts to safeguard digital consumers' rights. For example, in response to carmakers' leveraging their control over software to clamp down on independent repair shops, some states and industry groups fought and won a "right to repair" for software-enabled cars. But such responses have been few and far between.
Silicon Valley venture capitalist Marc Andreessen once famously wrote that "software is eating the world." That statement describes not only new technologies, but also the legal and policy frameworks that enable them. If license terms are allowed to control how we use the digital goods we buy, they may well eat away at the very notions of ownership and personal property.
Aaron Perzanowski is a professor of law at Case Western Reserve University School of Law. Jason Schultz is a professor of clinical law at New York University and director of NYU's Technology Law and Policy Clinic. They are the authors of "The End of Ownership: Personal Property in the Digital Economy."