What started out as Strava showcasing its data science prowess has turned into a mobile cybersecurity incident for governments. On January 27, student Nathan Ruser used Strava's published activity heatmap to connect the dots on military base locations recorded by fitness trackers. Beyond the exposed military locations, some experts believe the risks extend further.
As noted in a recent article in Wired, “This is the part that is perhaps most worrisome, that an individual’s identity might be pullable from the data, either by combining with other information online or by hacking Strava—which just put a major bullseye on itself,” says Peter Singer, strategist and senior fellow at New America, a think tank based in Washington, DC. “Knowing the person, their patterns of life, etc., again would compromise not just privacy but maybe security for individuals in US military, especially if in the Special Operations community.”
What we are left with are several key questions:
- Is policy enough?
- What are the data risk points and what data can be exposed?
- How can this data be exploited?
- Whose data is available?
- Whose responsibility is it to fix?
Is policy enough?
Many groups affected by the Strava issue had policies in place stating how mobile device usage should be restricted in secure areas. These policies exist for governments and enterprises alike. In the end, though, policies are only a benchmark to measure your success against. A very small, highly focused group may manage to follow policies, but beyond a small group it's generally not possible to achieve 100% policy compliance. (So, if you're relying solely on data security policies, get your notifications ready now for when your data breach goes public.) Policy and regulation have to be supported with effective tools for monitoring and incident response.
What are the data risk points and what data can be exposed?
Strava lists 27 million users around the world, yet that is only a small fraction of the exposure occurring regularly through mobile devices and apps. At Appthority, we track tens of thousands of mobile applications, sourced from Google Play and the Apple App Store, that expose this type of data and far worse every day. Our HospitalGown research details exposed NoSQL backends such as Elasticsearch and Redis. Our Eavesdropper research covers hard-coded credentials to relational databases, Amazon S3, EC2, and cloud services, as well as messaging, voice call/recording, navigation, and two-factor authentication APIs.
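To illustrate the hard-coded credentials problem, here is a minimal sketch of the kind of string scanning such research relies on. The two patterns and the sample strings below are illustrative assumptions, not Appthority's actual rule set; real scanners use far broader signatures.

```python
import re

# Illustrative patterns only. The AKIA prefix and 20-character length are the
# documented format for AWS access key IDs; the second pattern catches
# credentials embedded in URLs (https://user:password@host/...).
CREDENTIAL_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "basic_auth_url": re.compile(r"\bhttps?://[^/\s:@]+:[^/\s:@]+@\S+"),
}

def scan_strings(strings):
    """Return (pattern_name, match) pairs found in decompiled app strings."""
    hits = []
    for s in strings:
        for name, pattern in CREDENTIAL_PATTERNS.items():
            for match in pattern.findall(s):
                hits.append((name, match))
    return hits

# Strings as they might appear in a decompiled app (all values fabricated):
sample = [
    'String key = "AKIAABCDEFGHIJKLMNOP";',
    "db_url = https://admin:hunter2@api.example.com/v1",
    "normal log message, nothing sensitive",
]
for name, value in scan_strings(sample):
    print(name, value)
```

An attacker running the same scan over a decompiled app gets working credentials to whatever backend the developer embedded.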
Data leaked via these mobile application backends includes:
- Fitness tracker data
- Navigation data showing live and historical user location
- Enterprise meeting audio recordings
- Dating app communications
- Federal and municipal law enforcement communications
- Phone call records
- Enterprise VPN records and users
The install volume for these vulnerable applications ranges from small to upwards of 100 million. Some applications are even preloaded on phones, preventing users from removing them.
How can this data be exploited?
Peter Singer stated, “Knowing the person, their patterns of life, etc., again would compromise not just privacy but maybe security for individuals in US military, especially if in the Special Operations community.” This is absolutely true. Using navigation data from exposed application backends, an attacker can determine where you go every day and when. Attackers can figure out where you work, when you work, when and where you pick your kids up from school, and even whether you're having an affair.
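As a sketch of how little analysis this takes, the snippet below infers likely home and work locations from a handful of hypothetical timestamped location records (all coordinates and timestamps fabricated), simply by bucketing observations into overnight and working hours. Real attacks would use clustering over far more data, but the principle is the same.

```python
from collections import Counter
from datetime import datetime

# Hypothetical leaked records: (ISO timestamp, lat, lon), coordinates
# pre-rounded to roughly a 1 km grid (an illustrative choice).
records = [
    ("2018-02-05T02:10:00", 40.71, -74.01),  # overnight -> likely home
    ("2018-02-05T03:40:00", 40.71, -74.01),
    ("2018-02-05T10:15:00", 40.75, -73.99),  # working hours -> likely office
    ("2018-02-05T14:30:00", 40.75, -73.99),
    ("2018-02-06T01:55:00", 40.71, -74.01),
    ("2018-02-06T11:05:00", 40.75, -73.99),
]

def likely_place(records, hours):
    """Most frequently observed location during the given hours of the day."""
    counts = Counter(
        (lat, lon)
        for ts, lat, lon in records
        if datetime.fromisoformat(ts).hour in hours
    )
    place, _ = counts.most_common(1)[0]
    return place

home = likely_place(records, hours=range(0, 6))   # midnight to 6am
work = likely_place(records, hours=range(9, 17))  # 9am to 5pm
print("likely home:", home)
print("likely work:", work)
```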
In our research at Appthority, we have come across some very dangerous examples of critical and highly sensitive data being exposed. In one case, exposed backend data servers for federal and municipal law enforcement apps were exposing data that an attacker could use to track unit communications in real time.
It is almost trivial to gather enough information from these data sources to pivot your exploits from data to people, and from there to their companies or relationships.
These exposures are so frequent and so large that the attacker effectively faces a data science problem of their own: the leaked data contains enough information to profile a victim's life and identify the best attack vectors. For example, using information sampled from a data leak, we were able to do the following, as showcased visually in the video below:
- Identify individual users
- Identify their connections such as calls, shared locations, or emails
- Identify the most important individuals based on the number of connections
- Group individuals into ‘network neighborhoods’ that indicate who someone is talking to
- Create an attack vector map for reaching those individuals
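The steps above can be sketched with plain graph analysis. The snippet below is a minimal illustration with invented names and contact pairs, not the method or data from our research: it ranks individuals by number of connections and groups them into connected components as 'network neighborhoods'.

```python
from collections import defaultdict

# Hypothetical contact pairs (calls, shared locations, emails) pulled
# from a leaked backend. All names are invented.
edges = [
    ("alice", "bob"), ("alice", "carol"), ("alice", "dave"),
    ("bob", "carol"), ("eve", "frank"),
]

def rank_by_connections(edges):
    """Rank individuals by degree -- a crude 'importance' score."""
    degree = defaultdict(int)
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    return sorted(degree.items(), key=lambda kv: kv[1], reverse=True)

def neighborhoods(edges):
    """Group individuals into connected components via union-find."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    for a, b in edges:
        parent[find(a)] = find(b)  # union the two components
    groups = defaultdict(set)
    for node in list(parent):
        groups[find(node)].add(node)
    return list(groups.values())

print(rank_by_connections(edges)[0])  # most-connected individual
print(neighborhoods(edges))           # the 'network neighborhoods'
```

The most-connected individuals are natural phishing targets, and each neighborhood maps out who an attacker could reach through them.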
Whose data is available?
Strava is an example of how personal device data can lead to a compromise of enterprise information. While many of the apps we track are targeted at businesses or even white-labeled for Fortune 500 companies, many are personal apps. Even when an app targets personal devices, the threat extends to the companies and people with whom the user interacts.
Whose responsibility is it to fix?
Strava has announced they are “working with military and government officials to address potentially sensitive data”. This is good news, although it is also an outlier. During the course of Appthority investigations of similar data leaks, we disclose our findings to the developers. Most often, the disclosures go directly to the companies, as they do not participate in bug bounty-type programs. The vast majority of these direct disclosures go completely unanswered. Many of these companies have no security contact; at best, they have a support@ or info@ email address that sends an automated help desk response. Other companies respond with legal threats. Only a minority of responses result in a fix, most often from large companies with mature security processes. This leaves the ecosystem in a precarious position where security companies can protect their customers but cannot positively affect the rest of the mobile ecosystem.
One example of a fitness tracker that exposes data is the Zikto fitness tracker:
We disclosed to Zikto that their data was available via their unauthenticated backend over HTTP on August 18, 2017, and published our findings on September 20, 2017. This backend is still exposed, with some user data metrics having more than doubled. Unfortunately, this scenario is the norm.
August 18, 2017 screenshot of backend data.
February 6, 2018 screenshot of backend data.
Ideally we would be in an application ecosystem where policies, developers, and mobile app stores operated in step with each other. There would be a store mandated security email contact for every app. The stores would have a policy that outlined disclosure best practices and a method to remove apps from the store, even popular ones, if they did not resolve high priority vulnerabilities.
For now though, that is a dream. After decades of similar attempts in network and desktop security, the industry still hasn't solved this, and past performance is a likely indicator of future results. The more connected our apps are to our lives and businesses, and the more data they collect, the more personal and effective mobile becomes as an attack vector. While we are not yet operating in unison, the ecosystem and the tools to help protect it are moving forward. Appthority's part in this includes not just tools to help our customers detect these apps live in app stores, but also regular outreach to the parties who are impacted by these data exposures or have a role to play in closing them down.