A public policy blog from AEI
The latest on technology policy from AEI, published daily.
The likelihood of federal privacy legislation has waxed and waned many times over the past two decades. It’s presently in a waxing phase, so it’s worth reviewing again what privacy and privacy legislation are about. Not the legislative “horse race,” as it were, but where the horses are going.
Privacy is best defined, in my opinion, as the subjective condition people enjoy when they have power to control information about themselves and when they exercise that power consistent with their interests and values. That’s the definition I developed in a paper published nearly 15 years ago, at another time of legislative foment. There’s a lot to that definition, including the role of law and legislation in protecting or undermining people’s power to control information about themselves and thus their privacy. But the first and last parts of the definition are probably the most interesting: Privacy is subjective — people define their own privacy preferences for themselves — and it reflects their own interests and values.
As a thought experiment, imagine the federal government being involved in protecting other subjective values. How would we know that the Happiness Protection Act of 2019 had been a success? Might some measure of gross national piety be expected to rise if Congress were to take up the challenge of protecting people’s reverence? Happiness and piety are subjective and reflective of values, just like privacy. So far what we’ve gotten from privacy legislation is an abundance of forms to fill out, in the case of health privacy, and buttons to dismiss, in the case of European internet privacy.
When considering legislation, one’s policy checklist could take a number of forms. A legal checklist would ask whether a cognizable harm is being done and, if so, whether existing law already addresses it. In our immature information economy, there is much to be offended by, but not as much that is clearly harmful. The privacy torts capture truly heinous privacy-invasive behavior, such as when Rutgers University students broadcast the sexual encounter of one of their peers in 2010. Many in the legal academy speak awkwardly of “privacy harms,” a turn of phrase meant to stretch the concept of harm over new territory. Common conceptions of what’s harmful are always subject to revision, but expansions or contractions must stand the test of time.
More popular than the legal policy checklist is the economic policy checklist. That one asks if there is some externality that is subject to correction through clever turns of law or regulation. Are market dynamics not arriving at optimal results? And can that be corrected? This is an issue that Federal Trade Commissioner Christine Wilson emphasized at a recent AEI event, “Perspectives on data privacy from the Federal Trade Commission and Department of Justice.”
“I think consumers are very much in the dark about the way their information is collected and used,” Commissioner Wilson said. “The level of consumer confusion is significant.” It’s undoubtedly true that most don’t know how information moves in our rapidly changing society. There are significant information asymmetries. And consumers can’t predict well what the consequences will be.
But this rationale for regulation proves too much. There isn’t a dimension of the commercial world about which consumers couldn’t be educated just a little more, perhaps squeezing slightly better decisions out of them. But consumer time is a scarce resource, and consumers’ true interests are as unpredictable to policymakers as the consequences of information flows are to consumers.
There is an argument for hemming in information asymmetries when nondisclosure constitutes fraud — when the consumer isn’t getting what he or she bargained for. But in other matters, such as the privacy consequences of information sharing, the level of priority given to some dimension of a product or service is not subject to expert prediction. It is best determined by consumers themselves.
I would like consumers to care more about privacy, and I’ve dedicated a lot of years to encouraging people to be more aware and more vigilant — better positioned to defend their interests and values. When government seizure of data is involved, people’s normal steps to conceal private information are defeated, so the warrant requirement of the Fourth Amendment is appropriately applied.
But I don’t see that privacy legislation will materially improve people’s privacy, much less their overall welfare. If it does improve privacy in gross, it might do so at a cost to interaction, social life, and commerce that leaves consumers marginally worse off. Indeed, at its best, federal privacy legislation might deliver consumers someone else’s sense of privacy: that of some collection of federal politicians and administrators who have done their level best to figure out what society needs.
Beyond the legal checklist and the economic checklist, there is a third sort of checklist, which is the social one. True privacy is a product of personal responsibility. It is protected by keeping silent, declining to interact, and keeping to oneself what is one’s own business. There is a rich life to be lived without social media, and there are no real impediments to understanding enough about data flows in the information economy to make intelligent decisions. The consumers who really want to have privacy are getting it, because it’s theirs to keep if they want it.
1789 Massachusetts Avenue, NW, Washington, DC 20036
© 2019 American Enterprise Institute