A public policy blog from AEI
The latest on technology policy from AEI, published daily.
In this week’s noisy news cycle, you may have missed that the locations of US and UK (and other) military bases were revealed to the worldwide public. Moreover, as one writer described it, “Once someone makes a data request for a specific geographic location — a nuclear weapons facility, for example — it’s possible to view the names, running speeds, running routes and heart rates of anyone who shared their fitness data within that area.” How? The revelation of so much sensitive data was not the product of sophisticated espionage or an intentional data dump à la WikiLeaks, but rather the result of an update to a smartphone app called Strava. Before you dismiss the story as “so Google Maps from five years ago,” I’d argue that this time is different.
That so many people who know (or should have known) better did not protect their sensitive location data may be a sign that people don’t care about privacy. That may be true, but a closer look at the app shows something else might also be going on: The app makes opting for privacy confusing and difficult. This is important because our privacy and consumer protection laws allow companies to use personal data unless a user “opts out.” Therefore, to be meaningful, opt-outs need to be clear and understandable. That is not the case with Strava.
As background, Strava, which calls itself “The Social Network for Athletes,” has as many as 27 million users, almost 80 percent of them from outside the United States, who access the app through fitness and activity trackers made by companies such as Fitbit and Garmin. The object is to create a fitness tracking tool that also links users to a community of encouragement and camaraderie.
Essentially, the data of military personnel were revealed because servicemen and women had the app running while exercising on base. If that were all there was to the story, the lesson would be simple — make sure you check your privacy settings (especially if you are working at a super-secret military facility). But consider how those settings work. A feature Strava thought would be “cool” was to use each user’s activity data to light up a global “heat map.” The default permitting the use of that data is conveniently pre-checked in the terms of service.
Of course, you could uncheck that box to opt out, keeping your data off the map. Easy, no big deal, right? Unfortunately, that’s not the end of the story. The problem with Strava is the same problem that has drawn complaints about many other apps and digital services — the opt-out is so cumbersome, multilayered, and difficult to navigate as to be impractical to use.
Over a year ago, Rosie Spinks of Quartz Media wrote a great article about the difficulty of controlling privacy in the Strava app. She initially believed that by clicking “Enhanced Privacy” in the app she had limited her data to friends in her network, but she soon started receiving creepy “likes” from strangers who followed her runs. She also discovered they had (unbeknownst to her) access to her name and picture.
As Spinks wrote:
It’s true that as a user, it’s my responsibility to know what I’m opting into when I use a service, especially when it’s free. But the fact that it took me three rounds of emails with a support rep, a call with Vontz, and a follow-up email round with him to fully understand how to prevent strangers from seeing my running routes is troubling. To borrow a phrase from English law, “a man on a Clapham omnibus”—or an average ordinary and reasonable human being—would likely not assume that their first and last name would be broadcast so indiscriminately despite having an app’s enhanced privacy enabled.
Strava is a social network as much as a fitness tracking tool, and they must balance those functions accordingly. But the multi-layered, opt-out heavy, and rather unclear nature of their settings still seems like a problem.
It does seem like a problem. And Strava is not alone — I have written about the Electronic Privacy Information Center’s case against Google, which cites difficult opt-outs as part of the facts supporting its claims. Many users simply do not know what information is tracked by a given app or device. For example, iPhones are set to track how often and for how long a given location is visited to create a map of “Frequent Locations.” This feature facilitates the offering of useful information to the user, but it is also a “divorce lawyer’s dream” and may not be as confined to the device as Apple portrays. Turning it off takes about five steps, but the bigger problem is knowing the feature exists in the first place.
We could say that all these privacy risks are just the price of participation in the digital era, but why then is there a national debate over whether the default privacy rule should be an opt-in system rather than the current opt-out model? It does seem that at least some people care about privacy (and some people, such as the military, need to care more about privacy) and that an opt-out system can be an efficient balance of the data desires of businesses and the privacy concerns of individuals. But if opt-outs are cumbersome — arguably because companies lose money when you exercise them — the balance is broken: too great a transaction cost is being levied on the individual. It’s not hard to imagine that companies would make opt-ins quite streamlined if they needed them to make money. Too much corporate resistance to privacy concerns may land companies in a more draconian legal landscape.
To be sure that does not happen, companies would be wise to heed the warning of Strava’s own CEO, who, when asked in a 2017 interview, “What’s something that you saw at Facebook or Instagram that you want to do differently at Strava?” said:
I think the first thing is just avoid surprises. So you remember [Facebook’s] social reader? Social reader was when you’d read an article and then it would post back to your feed. I had one where my wife had read a Cosmopolitan article that was very badly titled … [it] posted to her friends. That was a surprise. People don’t like surprises.
No, they don’t.
© 2018 American Enterprise Institute