A public policy blog from AEI
The latest on technology policy from AEI, published daily.
Peter Winn, director of the Office of Privacy and Civil Liberties (OPCL) and acting chief privacy officer at the Department of Justice, spoke at AEI last week. He noted that trust is fundamental to the efficacy of any institution, whether a firm, a country, or the DOJ itself. He described the DOJ’s duty to “enforce the law, defend the interest of the United States according to the law, and to ensure fair and impartial administration of justice. . . . That’s the value we create, and our brand has to be trusted.” Winn’s office is responsible for the DOJ’s adherence to many privacy and data protection laws, including the Privacy Act of 1974, among the world’s first laws regulating government use of computerized information. The act itself does not guarantee a right of privacy; rather, it establishes a partnership between individuals and the government, with a mutual interest in maintaining the accuracy of information.
Leviathan state, property rights, or multi-stakeholder governance?
Winn noted the false choice that online privacy governance must either be a leviathan state (social control with an absolute sovereign) or a free-market system based purely on property rights. He highlighted the work of Nobel economist Elinor Ostrom on common pool resources as a trust-based alternative and noted how such a model is already used in the Federal Information Processing Standards, or FIPS, and the FBI Domestic Investigations and Operations Guide, the set of standard operating procedures informed by the experience of FBI agents.
For illustration, Winn described the “blabbermouth system,” which schoolyard children use to keep each other honest through trust and a graduated system of punishments. While the teacher is on hand to intervene in serious situations, if she intervenes too much, trust among the children breaks down, and they start to game the teacher. The blabbermouth system allows the children to learn, practice good behavior, and keep compliance costs low. “When users are genuinely engaged in decisions regarding rules affecting their use, the likelihood of them following the rules and monitoring others is much greater than when an authority simply imposes rules,” notes Ostrom. Such multi-stakeholder governance has been the norm for the internet itself, standard-setting organizations, and other areas of emergent science and technology in which knowledge about the future is limited.
The US risk-based model versus the EU’s command and control model
The US has over time implemented many rules to protect individuals’ information from overreach by the administrative state while recognizing the benefits of some information being in the public domain (for example, phone numbers and addresses in telephone books, subject to opt-out for a fee). Hence, policy has generally promoted the importance of trust, a confidence in the reliability of a system, rather than an absolute right of an individual to seclude information about oneself.
Over time a system to analyze risk emerged, and, recognizing the sensitivity of certain information, rules were adopted to manage sensitive data, notably children’s, health, and financial information. Meanwhile, other sectors and industries remain subject to discipline based on actual harms and tests for unfairness and deception. This framework of permissionless innovation has been important for consumers and entrepreneurs alike.
The European Union’s General Data Protection Regulation is predicated on information being finite and extinguishable, hence the demand of a right to erase it or retire it at the end of its life. Its underpinnings can be traced to privacy laws in West Germany that were adopted in response to the horrific surveillance conducted by the Stasi, the East German secret police.
Given their experience with two devastating world wars, Europeans are understandably pessimistic about the future and take a negative view of how personal information can be abused. The US, on the other hand, having the longest-lived constitutional democracy and the good fortune of two relatively peaceful neighbors, has a relatively optimistic view of the future.
The right to know versus the right of privacy
Winn noted that knowledge about people is what businesses use to create economic value for customers, and it is what law enforcement agencies use to ensure public safety. He contrasted the benefits of America’s historical risk-based model for online data with the GDPR’s command and control model, whose universal rules for all contexts have created negative consequences that go far beyond their intended purposes.
For example, since the GDPR’s implementation, the contact information of domain registrants published in the WHOIS online database is no longer available. This frustrates the DOJ’s ability to carry out its mission to detect and deter cyberthreats from malicious online actors, who previously could be discerned from now-obscured public information. Indeed, Winn observed, a strict interpretation of the GDPR would make artificial intelligence illegal, since machine learning depends on retaining and reusing personal data for purposes beyond those for which it was originally collected.
The US model has avoided this kind of collateral damage because it has focused regulatory resources on actual risks and harms to consumers rather than assuming that all data collection is suspect and must be regulated with a one-size-fits-all approach. Moreover, America’s federal policy process requires significant buy-in from the major stakeholders before proposals can become law. This ensures greater compliance with the law and hence greater trust in the system.
The GDPR’s compliance costs are high, about $3 million for a medium-to-large enterprise. An estimated 20 percent of firms will probably never comply because of this high cost. Winn noted that European data protection authorities are woefully underfunded given their mandate to regulate information in both the public and private sectors. Indeed, the OPCL budget to ensure compliance for 100,000 DOJ staff is larger than that of a typical European data protection authority.
This is important to keep in mind as reports suggest that a low-end estimate of the implementation costs of the California Consumer Privacy Act is $100,000 per firm. A preliminary cost-benefit analysis suggests that at this level, the cost is four times greater than the projected benefit.
Any comprehensive federal privacy legislation should be informed by all stakeholders and be based on reasonable standards for consumer preferences and legitimate uses of information. It should avoid adversarial notions of privacy as control and instead create structures that promote trust. It also requires a real-world discussion about the costs of implementation, for without compliance with the law, there is no trust.
1789 Massachusetts Avenue, NW, Washington, DC 20036
© 2019 American Enterprise Institute