I recently read The End of Ownership, a new book about how companies are using contract law, Digital Rights Management, and End User License Agreements to strip away the very concept of “owning” the things we buy. When our Kindle books mysteriously disappear overnight, or we’re prevented from installing unauthorized software on an iPhone, we can surmise that we don’t really own the things we’ve bought. But the internet of things has created new and interesting ways for tech manufacturers to assert ownership not just of MP3s, ebooks, and software, but of the physical goods we have in our homes, our garages, and even inside our bodies. After you read the excerpt, check out our podcast with the authors. – Jason Koebler
Excerpted from The End of Ownership: Personal Property in the Digital Economy by Aaron Perzanowski and Jason Schultz, published by The MIT Press. All rights reserved.
Cars, refrigerators, televisions, Barbie dolls. When people buy these everyday objects, they rarely give much thought to whether or not they own them. We pay for them, so we think of them as our property. And historically, with the exception of the occasional lease or rental, we owned our personal possessions. They were ours to use as we saw fit. They were free to be shared, resold, modified, or repaired. That expectation is a deeply held one. When manufacturers tried to leverage the DMCA to control how we used our printers and garage door openers, a big reason courts pushed back was that the effort was so unexpected, so out of step with our understanding of our relationship to the things we buy.
But in the decade or so that followed those first bumbling attempts, we’ve witnessed a subtler and more effective strategy for convincing people to cede control over everyday purchases. It relies less—or at least less obviously—on DRM and the threat of DMCA liability, and more on the appeal of new product features, and in particular those found in the smart devices that make up the so-called Internet of Things (IoT).
Your car is a computer with wheels; a plane is a computer with wings; your watch, your child’s toys, even your pacemaker are all computers at their core. And as computers, they are susceptible to the same sort of external limitations and controls we’ve witnessed with previous generations of digital goods. Even if we resist it, we’re accustomed to software telling us whether we can watch a digital movie. But what happens when computer code dictates when your light bulbs have to be replaced? Or how fast you can drive? Or whether you can fly your drone in a particular neighborhood? Or what brand of cat litter you can use? What are the social consequences of a smart mattress that collects and analyzes heart rate and breathing data, monitors your movements, and provides you a nightly summary? That’s what Samsung’s SleepSense device promises. Samsung even suggests you track your loved ones by “simply put[ting] the sensor under their mattress … to receive an analysis of the quality of their sleep via email.” What could possibly go wrong?
With so many networked devices in their homes, consumers are relying on home automation hubs—devices that allow them to control their home security systems, lights, garage door openers, and entertainment systems from any place with an Internet connection. The maker of one such device, Revolv, was acquired by Google-owned IoT company Nest in 2014. The Revolv hub sold for $300 and touted a “lifetime subscription” for updates and new features. But in April of 2016, Nest announced it would no longer support the Revolv. What’s more, Nest planned to exercise its software-enabled remote control over the devices to render them entirely inoperable. After a May 15 software update, it explained, “The Revolv app won’t open and the hub won’t work.” Alphabet, Google’s parent company, which has its sights set on the self-driving car and medical device markets, decided it was within its rights to reduce a device that consumers bought to nothing more than an overpriced paperweight. Consider that before you buy a Google car.
Let’s look at a small sampling of IoT devices across a wide range of sectors and consider their consequences for ownership and consumer welfare more broadly. In many cases, these technologies offer real benefits. Yet the core cultural and legal shifts they represent strike a blow against ownership in the digital economy.
Jailbreaking Is Not a Crime
The exact origin of the Internet of Things is difficult to pinpoint, but one significant moment in its early history was the introduction of the iPhone on January 9, 2007. Steve Jobs told the assembled crowd, “Today, Apple is going to reinvent the phone.” He proceeded to wow them with “a revolutionary mobile phone, a widescreen iPod with touch controls, and a breakthrough Internet communications device” combined in a single product. But like nearly every Apple product, the user experience was carefully choreographed and tightly controlled. iPhone users could only run Apple’s iOS. They could only configure the settings Apple allowed them to access. They could only use Apple-approved mobile carriers. And they could only run the applications Apple provided. And later, once Apple launched its App Store, they could only install software that Apple approved—on the basis of opaque and inconsistent standards. What you could do with this remarkably powerful pocket computer depended entirely on what Apple let you do.
This walled-garden approach was a dramatic departure from the approach of general-purpose computers, including Macs, which allowed third-party applications and considerable freedom for user modification. In some ways, Apple’s approach to the iPhone was more in line with an earlier phone maker, AT&T. During its decades-long reign as a telecommunications monopolist, AT&T—née Bell Telephone—used a number of strategies to maintain strict control over telephones. As the holder of Alexander Graham Bell’s patents, AT&T had total control over the design, production, and distribution of phones. And even after those patents expired, it extended that control by leasing phones rather than selling them, making certain that users didn’t acquire property rights in their devices. It also used contractual provisions and legal threats to stamp out innovation, no matter how innocuous.
In the 1940s, AT&T exercised this power by targeting the Hush-A-Phone, a small non-electronic accessory that attached over a telephone receiver to increase privacy and cut down on noise. AT&T forbade its use, and it took nearly a decade of legal battles before the DC Circuit rejected that restriction as an “unwarranted interference with the telephone subscriber’s right reasonably to use his telephone in ways which are privately beneficial without becoming publicly detrimental.” This case, along with the FCC’s subsequent Carterfone decision, which permitted the attachment of wireless technology to AT&T’s phones, paved the way for competition and individual ownership of landline phones.
In some ways, Apple’s control over the iPhone is a throwback to these bad old days. But it’s one that many consumers happily accepted in exchange for the convenience of integrating all of their online activities into a single device. But not everyone was willing to go along quietly. Apple’s restrictions sparked a movement to “jailbreak” iPhones in order to regain some semblance of ownership. “Jailbreaking” refers to the act of eliminating software restrictions and DRM that limit how phone owners can use their devices. With a jailbroken iPhone, you can install any software you choose, replace Apple’s operating system with one you prefer, and customize the look and feel of your phone. Jailbreaking is related to, but distinct from, unlocking a mobile phone—the process of removing software locks that prevent you from switching wireless carriers—from AT&T to T-Mobile, for example.
Jailbreaking is not a new practice. Similar homebrew communities formed around other devices long before the iPhone launched, from Xbox hacks to do-it-yourself DVRs. But nothing galvanized that community more than the thought of turning Apple’s powerful and ubiquitous product into an open platform. The first iPhone jailbreak was announced on July 10, 2007, just eleven days after the device launched. With each inevitable Apple software update, the jailbreaking community would free that new version within weeks, if not days.
Although it didn’t file suit, Apple insisted that jailbreaking was illegal. In 2009, the Electronic Frontier Foundation (EFF) filed a petition with the U.S. Copyright Office requesting formal permission for iPhone owners to jailbreak their devices without fearing anti-circumvention liability. This provoked Apple to explain precisely why jailbreaking should be banned. Despite referring to consumers as “iPhone owners” throughout its filing, Apple asserted that “iPhone users are licensees, not owners, of the copies of iPhone operating software.” In other words, when you buy an iPhone, all you own is the physical hardware. The software stored on it—the software that makes it work and accounts for much of its value—still belongs to Apple.
While perhaps shocking to those with an iPhone in their pocket, this stance was a logical conclusion for Apple, a company with one foot in the software industry and a commitment to controlling the user experience that bordered on zealotry. And because Apple has consistently proven its nearly unrivaled skill as a designer of end user experiences, it succeeded in selling us DRM in the guise of a smart device. It made us believe that a bug was a feature. Consumers recoiled at the idea of these sorts of restrictions when Chamberlain and Lexmark tried to sneak them into our garage door openers and laser printers, but when Jobs offered us the same vision, we lined up to give Apple our money.
Eventually, the Copyright Office ruled in favor of the right to jailbreak phones. However, in doing so, it sidestepped the contentious issue of ownership and focused on jailbreaking as a fair use of Apple’s copyrighted iOS. And in 2014, an otherwise hopelessly gridlocked Congress passed, and President Obama signed, the Unlocking Consumer Choice and Wireless Competition Act in response to a petition signed by over 100,000 Americans. Although each of these measures suggests both that people still care deeply about owning their devices and that government can be responsive to those concerns, they are temporary fixes. Both the Copyright Office exemptions and the unlocking legislation expire after three years.
Apple’s battle for ownership of our phones signaled the beginning of a much broader shift. Every day, we learn of yet another object that will come with embedded software, location detection sensors, and network connections that limit consumer control and surreptitiously communicate back to its corporate mother ship. And while companies like Apple are slowly making their devices more open and user-configurable as a result of public pressure and competitive threats from open-source mobile operating systems such as Android, whole other areas of our lives are becoming constrained and preconfigured for us, often without our knowledge.
Old MacDonald Licensed a Farm
Farmers have enough to worry about. Banks are coming to foreclose on their land. Locusts are eating their crops. Immigration policy is complicating their hiring practices. And corporate agri-business long ago redefined the economics of their way of life. On top of all of this, today’s farmers have to contend with intellectual property.
It began with seeds. For years, Monsanto successfully sold Roundup, an herbicide that helped farmers control weeds and other unwanted vegetation. But Roundup also often damaged the crops themselves, so Monsanto began manufacturing crops resistant to Roundup. It patented so-called Roundup Ready soybeans and later added alfalfa, canola, corn, cotton, and sugar beets to the list of Roundup-resistant products. Many farmers initially welcomed these products, but some were troubled by Monsanto’s claim that its seeds were licensed for a single season, not sold. This meant that no matter how many seeds you saved, they couldn’t be replanted the following year, a centuries-old farming practice. Instead, you had to buy new seeds from Monsanto or else contend with pests and less-effective pesticides.
Seed patents were just the beginning of the IP frustrations facing farmers. Software has also found its way onto the farm. The iconic John Deere tractor now contains no fewer than eight control units—hardware and software components that regulate various functions, ranging from running the engine to adjusting the armrest to operating the hitch. When tractors were purely mechanical, farmers could easily maintain, repair, and modify their own equipment as needed. But now, software stands in their way. That barrier is no accident. Tired of losing revenue to industrious farmers who repaired their own tractors or bargain hunters who took their equipment to an independent repair shop, John Deere decided to force its customers to have their equipment serviced by authorized John Deere dealers. By interposing a software layer between farmers and their tractors, John Deere created a practical hurdle. And by wrapping its software controls in DRM, it created a legal one. A quick glance at the John Deere owner’s manual gives you a good indication of the result. Almost any problem—from high coolant temperature to a parking brake that’s not working or a seat that’s too firm—ends the same way, with a trip to the John Deere dealer.
Fed up with John Deere’s tactics, a group of farmers petitioned the Copyright Office in February of 2015 for a temporary DMCA exemption, like the one granted to smartphone jailbreakers, that would give them clear legal authority to repair, upgrade, and modify their tractors. John Deere responded with adamant opposition, insisting that tractor owners had no right to look under the digital hood, even if the fix was quick and technically simple. Its argument hinged on ownership. John Deere claimed it owns the software, and not just as an abstract matter of copyright law. It owns the copies of its code embedded in the tractors it sells to farmers, code that is essential to the functioning of the equipment. Farmers, in John Deere’s words, merely had “an implied license for the life of the vehicle to operate the vehicle.” That means you get to keep driving the tractor you bought from John Deere for tens of thousands of dollars unless and until it tells you otherwise.
John Deere’s attitude toward ownership has a number of important implications that typify the core risks presented by the Internet of Things. Most obviously, by denying farmers the right to repair—a right entrenched enough that even patent protection can’t disturb it—John Deere has effectively raised the price of its products for farmers. It has also done serious harm to the market for repair services, which are less competitive since farmers have no real choice of mechanics.
Free as in Coffee
Those in the free software movement are fond of distinguishing between two ways in which we use the word “free.” “Free as in beer” refers to price. “Free as in speech” refers to liberty, the freedom you have to use a thing as you choose. Until recently, you could be confident that if you overheard someone talking about free coffee, it meant Starbucks was running a promotion. But thanks to Keurig, the maker of the popular K-Cup brewing system, conversations about coffee now have to account for questions of liberty as well.
The Keurig saga began in 2012, when several of the coffee company’s key patents expired. Those patents covered its pod-based brewing system. Users placed single-serving portions of coffee or other brewed beverages in the machine, hit a button, and got a consistent drink each time. Without patent protection, Keurig had to contend with competition. As it turned out, Keurig wasn’t a fan. Rival companies started producing compatible pods and undercutting Keurig’s prices. In response, Keurig released new machines featuring “Keurig 2.0 Brewing Technology which reads each lid to deliver on the promise of excellent quality beverages.” Marketing speak aside, what that meant was that Keurig’s machines would only accept pods embedded with a code that verified your coffee came from a licensed supplier. And it also killed off its generic pod that let you supply your own coffee grounds. If you tried to brew rogue coffee, your Keurig machine refused, greeting you with a cheerful error message.
The public reaction was swift and vicious. Angry Facebook posts and irate Amazon reviews flooded the Internet. As Brian Barrett wrote, “A coffee maker limiting your choice of grind seems as out of place as a frying pan dictating your eggs.” It didn’t take long for competitors to capitalize on this outrage by cracking the Keurig DRM. Coffee drinkers even figured out how to defeat it with a single piece of tape. Soon Keurig was persuaded to reverse course, at least in part. It appears to be sticking to its guns when it comes to blocking pods from competitors, but it announced plans to reintroduce the My K-Cup product that allowed coffee drinkers to fill their own pods. Nonetheless, the company and its investors have paid a price for its overreach. Keurig stock dropped by 10 percent in the wake of the DRM controversy.
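The logic of Keurig’s lid check can be pictured as a simple membership test: the brewer reads a mark on the pod and brews only if that mark matches something it recognizes. The sketch below is purely illustrative—the set name, codes, and function are assumptions, and Keurig’s real mechanism (an optically read ink mark) is proprietary—but it shows why a single piece of tape bearing a copied mark was enough to defeat it.

```python
# Hypothetical sketch of a lid-code gate in the spirit of Keurig 2.0.
# All names and codes here are invented for illustration; the actual
# verification scheme is proprietary.

LICENSED_CODES = {"KCUP-2015-A", "KCUP-2015-B"}  # marks a licensed supplier may print

def brew_allowed(lid_code: str) -> bool:
    """Brew only if the pod carries a recognized license mark."""
    return lid_code in LICENSED_CODES

print(brew_allowed("KCUP-2015-A"))  # True: licensed pod
print(brew_allowed("GENERIC-001"))  # False: rival pod, brewer refuses
```

Because the check only confirms that a recognized mark is present—not that the pod actually came from a licensed supplier—anyone who copies a valid mark onto a rival pod (or tapes one over the reader) passes the test.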
The Keurig example shows that people still care deeply about owning and controlling their devices and that they have the potential to make their voices heard in the marketplace. But it also cautions that market pressure is often only partly effective in protecting consumer interests.
Open the Pod Bay Doors, Barbie
At this point, it should come as no surprise that the Internet of Things threatens our sense of control over the devices we purchase. However, those threats aren’t limited to intellectual property and DRM; they also include battles for control over information about our behavior and our inner lives. One troubling example is the Wi-Fi-enabled Hello Barbie doll from Mattel. This IoT Barbie looks like many of her predecessors but offers a unique feature. She can engage in conversation with a child and learn about them in the process. Barbie does this by recording her conversations and transmitting them via network connections to ToyTalk, a third-party cloud-based speech recognition service. ToyTalk then uses software and data analytics to analyze those conversations and deliver personalized responses. It’s an impressive trick, but the implications for our sense of ownership are quite shocking. For many children, talking to toy dolls is a way to share their unfiltered thoughts, dreams, and fears in a safe, private environment. But according to the terms of the Hello Barbie EULA, ToyTalk and its unnamed partners have wide latitude to make use of information about your child’s conversations in ways that few parents would anticipate:
All information, materials and content … is owned by ToyTalk or is used with permission. … You agree that ToyTalk and its licensors and contractors may use, transcribe and store. … Recordings and any speech data contained therein, including your voice and likeness as may be captured therein, to provide and maintain the ToyTalk App, to develop, tune, test, enhance or improve speech recognition technology and artificial intelligence algorithms, to develop acoustic and language models and for other research and development purposes. … By using any Service, you consent to ToyTalk’s collection, use and/or disclosure of your personal information as described in this Policy. By allowing other people to use the Service via your account, you are confirming that you have the right to consent on their behalf to ToyTalk’s collection, use and disclosure of their personal information as described below.
In other words, ToyTalk claims to own anything you, your child, or even their friends say to Barbie. Conversations with the doll are corporate property. The safety and privacy of a child’s bedroom is compromised by the collection, sharing, and commercial use of those conversations. And while these services may offer benefits, they come with significant new risks. Shortly after the IoT-enabled Barbie shipped, security vulnerabilities that could allow hackers to intercept a child’s conversations with the doll were revealed. And those worries aren’t just hypothetical. Around the same time, VTech—maker of the children’s smartwatch Kidizoom and InnoTab mobile device—disclosed that more than six million children had their personal information, including photos and chat messages, stolen from VTech’s servers.
Our Bodies, Our Servers
As if our connection to the Internet of Things wasn’t intimate enough, network-enabled and software-dependent devices are now inside our bodies. When open source advocate Karen Sandler found out at age thirty-one that she could die suddenly from a heart condition, she did what most of us would do. She went to the doctor to fix it. In her case, that meant implanting a pacemaker-defibrillator in her chest to give her heart a jolt in the event it gave out. The device—about the size of an avocado—was literally a life-saving invention. But because it ran proprietary software, Sandler had no way to tell how it worked or how likely it was to fail. As she explained in an interview, “A statistic came out recently that 25 percent of all medical device recalls in the last few years have been due to software failure. When you read these statistics it becomes very personal.”
It turns out that Sandler’s questions about her pacemaker weren’t so easy to answer. Much like Apple and its iPhone, pacemaker manufacturers won’t let patients look inside or test the devices they purchase. Nor are you allowed to read the data from your own device while you are at home or on the road—even in the midst of a medical emergency. Instead, you can only access your health data from manufacturer-approved sources. And until recently, you couldn’t even test your device to make sure it was functioning correctly or running the latest software or security update. The reason for such restrictions? According to a filing with the Copyright Office, the Advanced Medical Technology Association “believe[s] that patients have an inherent right to access their own medical data, however, this in and of itself does not necessitate bypass of any intellectual property protections.” In other words, even if you own the physical parts of the pacemaker, the manufacturer’s copyright trumps any claim you might have to see how it works or what data it collects on you—even when it is implanted inside your body.
Dana Lewis proved what patients can do when they own their devices and control their care. Lewis is a diabetic living in Seattle who relies on a glucose monitor and a handheld wireless device to alert her when her blood sugar is too high or low. Yet Lewis often wasn’t able to hear the alarm, especially when she was sleeping. So she and her partner, Scott Leibrand, built a new program that displayed blood sugar levels with new louder alarms and a snooze button. They even added the ability to send the information to other mobile devices, such as Leibrand’s Pebble watch. Next they turned to Lewis’s insulin regimen. Traditionally, diabetics control their insulin levels manually. But Lewis and Leibrand began experimenting with the data to devise an algorithm specific to Lewis’s needs—something that would automate and adapt based on the data her device was sending out. It could predict her insulin needs thirty, sixty, and even ninety minutes in the future. Eventually they hope to produce an artificial pancreas that will essentially automate this process. No IP law, and certainly not one designed to stop infringers from sharing movies online, should stand in the way of patients adapting equipment they own to keep themselves alive.
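The core of what Lewis and Leibrand built—project the next reading from the trend, alarm if it will be out of range, allow a snooze—can be sketched in a few lines. Everything below is an illustrative assumption (the function names, the thresholds, the naive linear extrapolation from two readings), not their actual code, which used far more sophisticated prediction.

```python
# Hypothetical sketch of a predictive glucose alert with a snooze option,
# in the spirit of the Lewis/Leibrand project. Names, thresholds, and the
# linear extrapolation are illustrative assumptions only.

def predict_glucose(readings, minutes_ahead):
    """Extrapolate blood sugar linearly from the last two 5-minute readings."""
    rate_per_min = (readings[-1] - readings[-2]) / 5.0
    return readings[-1] + rate_per_min * minutes_ahead

def check_alert(readings, low=80, high=180, snoozed=False):
    """Return an alert string when glucose is projected out of range."""
    if snoozed:
        return None  # user hit the snooze button
    projected = predict_glucose(readings, minutes_ahead=30)
    if projected < low:
        return f"LOW in ~30 min (projected {projected:.0f} mg/dL)"
    if projected > high:
        return f"HIGH in ~30 min (projected {projected:.0f} mg/dL)"
    return None

# Falling 2 mg/dL per minute: 110 -> 100 over five minutes projects to 40.
print(check_alert([110, 100]))  # → LOW in ~30 min (projected 40 mg/dL)
```

The design point is the one the excerpt makes: nothing here is exotic, and the only obstacle to a patient writing it against her own device is access to the data the device already produces.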
Network security has also become an issue for medical devices. From insulin pumps to cochlear implants and powered prosthetic joints, more and more medical devices rely on transmitting medical data to providers through Wi-Fi and Bluetooth protocols. These connections have already opened the door to numerous security issues. Even former Vice President Dick Cheney claims to have switched off the wireless functionality on his own pacemaker to prevent terrorists from hacking it. Fortunately, much like with vehicle security testing, the Copyright Office granted an exemption for testing exterior medical devices and passively testing those that are implanted in ways that don’t affect functionality. The ability to innovate and improve these devices, however, remains highly contested.
Karen Sandler’s dream of an open source pacemaker may inspire us, but it also presents complications. Open source could allow patients to examine, test, and improve devices in ways far more flexible and permissive than the current proprietary model, but it doesn’t give us autonomy in quite the same way as analog ownership. Instead it offers a future with different, more user-friendly restrictions to navigate. When it comes to medical devices, the argument for individual ownership and control resonates more viscerally.
For the rest of the stuff we buy, the stakes may be lower, but the arguments are the same. If you don’t own your devices, you can’t repair or customize them. You can’t innovate with them. And in the end, the products you buy may end up using you more than you use them.