Regulations to protect personal data don’t inspire much love. Companies frequently regard them as a nuisance, a needless expense, and a hindrance to innovation. Governments think the rules should apply to everyone but themselves. And ordinary people often act as if they don’t care whether their data is safeguarded at all.
But such regulations matter now more than ever. The world is increasingly defined by technological asymmetries; a huge gulf has opened up, with big corporations and powerful governments on one side and ordinary individuals on the other. Even in wealthy democratic societies, individual autonomy is at risk now that even simple choices, such as what news stories to read or what music to listen to, are dictated by algorithms that operate deep within software and devices—so deep that users are usually unaware of the extent to which data processing shapes their decisions and opportunities. Today, technology “is being used to control what we see, what we can do, and, ultimately, what we say,” the cryptographer and privacy specialist Bruce Schneier has written. “It makes us less safe. It makes us less free.”
Most people have yet to realize that truth. In the era of the Internet and mobile communications, people tend to focus more on the goods, services, and experiences that technology offers and less on the ways in which privacy is imperiled by software, code, and devices that have become an invisible but integral part of everyday life. Although many people want to have a sense of how data processing affects them, most aren’t interested in the details.
The trouble is that, to paraphrase Leon Trotsky, although you may not be interested in big data, big data is interested in you. Companies and governments are constantly finding new ways to gather and exploit more information about more people. Sometimes they have good intentions; sometimes they do not. I’ve learned this firsthand: as the data protection commissioner for Ireland, which is home to the European headquarters of many of the world’s most powerful technology firms, I have had to push back against a steady erosion of privacy as companies and governments have become hungrier for data and bolder in how they obtain and use it.
Preventing the misuse of personal information—intentional or otherwise—is the reason the EU recently introduced the General Data Protection Regulation (GDPR), a new set of rules that went into effect in May. That bland name is misleading: the GDPR is an ambitious attempt to shape a crucial part of contemporary life. In a world increasingly defined by digital technology, the protection of private data is not merely a luxury; it is “a fundamental right,” as the text of the GDPR notes. The GDPR has opened a new chapter in the history of the Internet, creating a blueprint that other states and organizations will study closely as they, too, seek to properly balance individuals’ rights to data protection with their other rights and with the legitimate interests of business and government. The world’s governments must start to converge on laws regarding data protection, ideally taking inspiration from the GDPR. Otherwise, authoritarians and unscrupulous tech giants will stand to gain, and democratic states and ordinary people will lose out.
Why Data Protection Matters
In the Internet age, ordinary people have become extraordinarily vulnerable, because participating in the digital economy and broader society now frequently involves revealing personal information to large organizations that can easily store it, process it, and share it without any input from individuals. Market forces and the risk of negative public opinion have not deterred companies and governments from abusing this immense power; only laws have prevented the worst misuse. Regulations to protect personal data prevent crooked officials from easily scouring government databases for damaging information about their critics and rivals. They stop corrupt law enforcement authorities from gaining unfettered access to anyone’s phone and Internet records. And they make it harder to use unverified or inaccurate data to wrongly deny people insurance policies, loans, or jobs.
Yet efforts to create a comprehensive regulatory system to protect personal information, including earlier attempts by the EU, have failed to fully deliver. The EU’s 1995 Data Protection Directive was somewhat vague, failing to identify the harms it sought to prevent or mitigate, and it gave individual EU member states inconsistent guidance on how to integrate data privacy into their national laws. The GDPR mostly avoids both of those problems. It makes clear that it intends to combat discrimination, identity theft or fraud, financial crime, and reputational harm. (The European Commission has also emphasized that the GDPR will strengthen European economies by maintaining people’s trust in the security of digital commerce, which has suffered in the wake of a steady stream of high-profile data breaches.)
What is more, the GDPR is a “direct effect” law, meaning that individuals can invoke it in national courts without reference to national laws, thus making it generally unnecessary for EU member states to pass new national legislation to mirror the GDPR. And the law has extraterritorial reach, applying to any organization that operates in the EU, even if it is not physically located there. All such organizations, in every sector and of every size, that process personal data must comply with the GDPR: governments, Internet service providers, media outlets, banks, medical offices, universities—any entity that collects digital information about people.
The law’s main innovation is to establish a bedrock principle of accountability. It places responsibility for properly collecting and processing personal data squarely on organizations and extends to individuals the right to prevent their data from being collected or processed. For example, if a company collects data about me for the purposes of marketing goods or services to me, I have the right to object at any time, which would compel the company to stop both the data collection and the marketing. The GDPR also gives individuals the right to insist that their data be deleted. If, say, I provide a company with my e-mail address in the course of doing business with it, I can later demand that the company remove it from its files. And in certain circumstances, the GDPR gives individuals the right to move their data from one organization to another. If, for instance, I wish to switch to a new bank or a new mobile phone provider, my original provider must, at my request, hand over my account and history information to the new one.
The GDPR requires many organizations to employ a data protection officer to ensure compliance with the new rules. It also compels them to conduct impact assessments to determine what effects certain kinds of data processing will have on individuals. For example, if a large organization wished to introduce a system that required employees and visitors to provide biometric signatures—retinal scans, say—in order to enter a facility, it would need to conduct an assessment to consider the necessity of such a system, its likely impacts and risks, and how they could be mitigated. The GDPR obliges organizations to design their systems so as to limit the amount of personal data they collect and to maximize the security of that data. For instance, if a company requires potential customers to fill out an online form to request a price quote, the form can ask only for information that is strictly necessary to fulfill that request. The new rules also require organizations to notify the authorities without undue delay (generally within 72 hours) whenever they experience a data breach that poses a risk to individuals. Finally, the GDPR mandates that organizations provide the public with clear, detailed information about the personal data they collect and process—and precisely why they do so. Crucially, the GDPR backs up all of these rules by giving regulators new enforcement tools, including the ability to issue injunctions to force compliance with the law and the authority to impose steep fines on organizations that run afoul of it.
Meeting the accountability standards imposed by the new law will require large organizations to make significant investments; for the sake of efficiency, they will have an incentive to apply the new procedures and systems not just in Europe but also in every market where they operate. What is more, the GDPR stipulates that if an organization transfers personal data out of the EU, it must ensure that the data will be treated in the new location just as it would be in the EU. In this way, the standards embodied by the GDPR will be exported to the rest of the world.
Making the New Law Work
The GDPR is not perfect. For one thing, it doesn’t solve the perennial challenge of clearly defining what counts as personal data. This reflects the fact that three decades into the digital age, governments, regulators, theorists, companies, and ordinary people are all still struggling to figure out what kind of information must be protected. The new law also leaves it to supervisory authorities and courts to distinguish between serious violations and mere technical infringements, and it does little to help data protection authorities prioritize complaints based on the severity of the alleged abuse. For these reasons, the interpretation of the statute ought to be guided not only by legal doctrine but also by common sense, as Michal Bobek, an advocate general for the European Court of Justice, has suggested.
There are also broader, less technical problems with the law. For one, upholding EU privacy standards in other parts of the world will, in practice, often require political and diplomatic tools that the GDPR alone cannot provide. And then there is the problem posed by the incredibly fast pace of technological change: as artificial intelligence and machine learning advance, the very meaning (and even relevance) of “consent” and “privacy” will change, and it will be hard for laws and regulators to keep up with both technological progress and the evolving social norms surrounding technology.
Considering such obstacles, one could be forgiven for thinking that the GDPR is doomed. But such pessimism is unwarranted. The GDPR gives regulators what they need to tackle even the hardest cases—and most cases won’t be all that hard. It also gives regulators the chance to help various sectors develop codes of conduct that reflect the GDPR’s ethos of accountability and to educate the public about how individuals can exercise their rights.
In a recent op-ed in the Financial Times, U.S. Secretary of Commerce Wilbur Ross complained that the GDPR “creates serious, unclear legal obligations for both private and public sector entities, including the US government. We do not have a clear understanding of what is required to comply.” His concerns are warranted. But it’s precisely the role of capable regulators to guide industry and governments—and to learn along with them. The need for smart, dynamic regulation has become clear even just a few months after the GDPR took effect, as the threat of heavy fines has produced too much fear and caution on the part of businesses. Data protection authorities in the EU must therefore not focus solely on enforcement; they must also educate companies about how best to interpret the new law. Regulators should seek strict compliance with the letter of the law, but also fundamental changes in behavior that reflect its spirit, and only a balance of enforcement and education can deliver both.
To make good on the GDPR’s promise, regulators need to articulate clear standards and develop codes of conduct to help companies and officials adapt to the new reality. Regulators should publish case studies that illustrate how they have put the GDPR’s principles into practice when investigating complaints. Over time, a new body of case law will emerge as individuals take organizations to court to challenge their data practices and as organizations, in turn, take data protection authorities to court to challenge their enforcement actions. Such legal proceedings might lack the high drama of a murder trial or a celebrity divorce. But make no mistake: the future of human autonomy and dignity will be shaped by their outcomes.