Data breaches now happen more than once a day on average: 2,216 occurred in the past year alone, according to Verizon’s annual Data Breach Investigations Report, released on April 10.
Health-related organizations accounted for a quarter of those incidents. Three weeks ago, that group claimed a troubling new member: Under Armour’s MyFitnessPal, which had exposed information (usernames, email addresses, and passwords) of up to 150 million users.
This is the first known data breach of a major consumer fitness app. (In 2016, Fitbit dealt with a handful of account takeovers, but the issue involved just a few dozen accounts whose passwords hackers had stolen independently, from other sources, and used to seek fraudulent refunds.) But it likely won’t be the last. And what’s at stake is data far more intimate than passwords and usernames. Fitness trackers know our heart rates and step counts. They know we didn’t sleep well last night, maybe because they also know we ate barbecue and had a couple of beers. And they track detailed patterns on when and where we like to work out, with whom, and where we live.
That data, in the hands of malicious actors, seems like it could be devastating for users’ security and privacy. Four years ago, there was a rash of bike thefts after some Strava athletes left their accounts public. Thieves were able to access GPS tracks starting and ending at users’ homes, and even had information on what kind of bikes they owned. In 2016, Russian hackers leaked embarrassing medical details on elite British athletes, including records of so-called therapeutic use exemptions for otherwise banned substances. As the Verizon report notes, ransomware attacks now make up 39 percent of all malware attacks, a sharp rise from just three years ago.
This raises several questions: Is your personal health data at risk of falling into hackers’ hands? What are companies doing to protect against that? And what, if anything, can you do yourself?
To begin answering that, it’s important first to understand what data mobile fitness apps and websites collect. The answer: A lot. Outside contacted five major companies in the health and fitness app space (Apple, Fitbit, Under Armour, Garmin, and Strava) for comment on what information they gather and how they protect it. Of those, only Fitbit made a representative available for comment. Garmin responded to some written questions, while Apple and Strava both pointed to their legal and privacy policies. Under Armour declined to comment, saying it was focused on its users in the wake of the breach.
Those privacy policies are, thankfully, often written in plain English. But they’re also often broad, without specific details on what information is collected. “They’re high-level statements about what is done and how,” says Chris Pierson, a cybersecurity expert and founder of advisory firm Binary Sun. Still, they give a general sense of what you’re sharing with companies when you use an app or service.
Generally, most apps collect:
- Any data required to set up an account (username, password, gender, height, weight, etc.). Some collect payment data, while others use third-party vendors.
- Any activity data collected as part of providing the service (like, say, your heart rate and GPS track of this morning’s run).
- Personal device information like the Unique Device ID and MAC address (a kind of physical network address for each of your electronic devices).
- If you allow it, access to other applications, like your contacts list. This can be an explicit opt-in, or, in some instances, you grant permission when you click “agree” on the privacy policy and terms of service (that long thing nobody reads).
Much of that information is stored two ways: in a personally identifiable format, which is necessary to serve your account when you sync or upload, and in an aggregated, anonymous form, where it’s used for analytics, advertising, or other revenue purposes.
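As a rough illustration of the difference between those two forms, consider this sketch (the record fields, values, and aggregation here are hypothetical, not any company’s actual schema): the same workout data can be kept tied to accounts, or boiled down to anonymous statistics.

```python
from statistics import mean

# Hypothetical per-user records in personally identifiable form:
# each row is tied to an account and a specific workout.
identified = [
    {"user_id": "u1001", "email": "a@example.com", "heart_rate": 148, "miles": 3.1},
    {"user_id": "u1002", "email": "b@example.com", "heart_rate": 156, "miles": 5.0},
    {"user_id": "u1003", "email": "c@example.com", "heart_rate": 139, "miles": 2.4},
]

def aggregate(records):
    """Strip identifiers and keep only population-level statistics,
    the kind of anonymized form used for analytics or advertising."""
    return {
        "n_workouts": len(records),
        "avg_heart_rate": round(mean(r["heart_rate"] for r in records), 1),
        "avg_miles": round(mean(r["miles"] for r in records), 2),
    }

summary = aggregate(identified)
# The summary contains no user_id or email, so there is
# nothing in it to tie a number back to a person.
print(summary)
```

The identifiable table is what a hacker would want; the aggregate is what advertisers typically see.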
Hackers tend to go for the former, but the key point is what they’re seeking: information that can be used to make money. And therein lies the saving grace for athletes. “Hackers target things that are scalable,” Pierson says. “It’s scalable to steal 100,000 credit card numbers, or usernames and passwords.” Because your health data is personal to you alone, it’s far harder to scale; it’s a batch of one, and, so far at least, it’s not clear how a hacker could exploit it financially. “There has to be a criminal business purpose, and we have not yet seen this be a focus,” Pierson says.
That doesn’t mean your workout apps aren’t sharing sensitive data. Issues arise when the app maker gives your info to third parties (see: the Cambridge Analytica and Facebook fiasco). Those third parties fall into two broad categories: vendors and app integrations, like when you link your Garmin Connect and Strava accounts, or sign in to Fitbit with your Facebook login. App integrations are selected by you, the user. That still presents issues, because it’s not always clear what you’re sharing when you click “agree.”
If you’ve linked accounts, you may be sharing data in categories you don’t realize or intend: your phone’s contact list, your location, even access to your photo album. Once that data is in the hands of a third party, what happens to it is largely governed by that company’s privacy policy.
You have even less control over the vendors. Companies share data with them, and sometimes they shouldn’t. Grindr recently came under fire when it was found to be sharing users’ HIV status with third-party vendors.
Most companies won’t discuss how they protect that data other than pointing to their privacy and legal policies. “Generally, most companies keep security practices private unless forced to bring it to light,” Pierson says. “It’s shared with vendors, but not the public, because sharing the security protocols could put the company at a disadvantage.”
It’s important for companies to use up-to-date security protocols. The exact method by which hackers breached MyFitnessPal isn’t publicly known, but we do know that Under Armour had been protecting some of its password data with a weaker scheme, an aging hashing function called SHA-1. But the truth is that even if a company is using best-in-class encryption and other security tools, it may not matter all that much. That’s because most breaches bypass those protections.
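To see why a fast, unsalted hash like SHA-1 offers thin protection for stolen password data, consider this sketch (the passwords and wordlist are invented for illustration): anyone who obtains the hashes can hash a dictionary of common passwords and compare.

```python
import hashlib

def sha1_hash(password: str) -> str:
    # A fast, unsalted hash, similar in spirit to weak password
    # storage schemes. Not safe for real password storage.
    return hashlib.sha1(password.encode()).hexdigest()

# A stolen table of hashed passwords (invented example data).
stolen_hashes = {sha1_hash("letmein"), sha1_hash("sunshine42")}

# An attacker's wordlist of common passwords.
wordlist = ["password", "123456", "letmein", "qwerty"]

# Dictionary attack: hash every guess and look for a match.
cracked = [word for word in wordlist if sha1_hash(word) in stolen_hashes]
print(cracked)
```

Here the common password falls immediately, while the uncommon one survives; that asymmetry is why modern systems use slow, salted functions instead.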
The Verizon report estimates that 70 percent of data breaches start with phishing: an employee of the company whose app you use (or one of its vendors) unwittingly clicks a link in a fraudulent email, which activates malware that lets the hacker into the computer to steal credentials and encryption keys. Phishing is how Russian hackers entered the Democratic National Committee’s server network. (A surprisingly large share of the remaining breaches involve inside actors, according to the Verizon report.) “Once someone is in, regardless of the encryption, they get access,” says Marc Bown, director of security for Fitbit. They can either grab the keys the application uses and apply them to the raw, encrypted data on disk, or simply ask the application to access the data on their behalf.
On the corporate side, Pierson points out several practices he’d like to see, which mostly start with having the right team and security culture in place to protect against lazy practices. Good culture might include participating in a “bug bounty” program to pay people to find vulnerabilities, as Fitbit does.
The privacy policies we looked at are written in plain language, but they’re also long and difficult to scroll through and read on mobile. Pierson would like to see clearer, more concise language so users won’t just click “accept” without reading. And since the data is only as secure as the weakest link in the system, he’d like to see more companies start auditing their third-party contractors and requiring the same security practices of them. (Some, like Fitbit, already do.)
One looming change is regulatory. A new law in the European Union called the General Data Protection Regulation (GDPR) requires service providers to meet minimum standards of data privacy and security practices. The law only applies to data on users in the European Union, but given how cloud storage works, it’s often most practical to implement across all users. In its reply to Outside, Garmin said it would deploy GDPR standards worldwide as of May 25, when the law takes effect. (Apple’s new iOS 11.3 and MacOS 10.13.4 operating systems will meet GDPR standards as well, first in the EU and later worldwide.)
So, what can you do? Other than being an informed consumer, not much. But that counts for a lot. More than four years after the Strava bike-theft vulnerability came to light (and long after Strava added a “privacy zones” feature that, even on public profiles, starts and ends a workout’s GPS track outside a bubble around the user’s home), England’s Sky News reported in March that, yep, a rider had failed to control his privacy settings, and his bike got stolen.
The rest is common sense. Don’t use identical or even similar passwords across sites. (The MyFitnessPal breach exposed passwords that the hackers will likely try against users’ other accounts, in the hope of getting data like credit card numbers.) Don’t log in to your account over public WiFi, at least not without using a reputable VPN; public networks can let anyone view unencrypted traffic. Read those privacy policies; don’t just skim and click “agree.” Periodically check your phone’s privacy settings to ensure you haven’t unwittingly granted some random app access to, say, your microphone. Kudos to Apple for starting to make those permissions and prompts clearer. And stop signing in to Strava from your Facebook account.
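A password manager is the easiest way to follow that first rule, but the underlying idea can be sketched in a few lines: generate a random, unique password per site. This example uses Python’s standard `secrets` module (the sites and character set are arbitrary choices, and this is a minimal sketch, not a substitute for a real password manager).

```python
import secrets
import string

def make_password(length: int = 16) -> str:
    """Generate a random password from letters, digits, and symbols."""
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*"
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One unique password per account, so a breach at one
# service can't be replayed against the others.
for site in ["myfitnesspal", "strava", "fitbit"]:
    print(site, make_password())
```

Because each password is independent, a leak like MyFitnessPal’s exposes only that one account rather than your whole digital life.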