Personal data is collected and used more than ever before, and in an incredible variety of ways. Traditionally, government and employer ERP systems held our Social Security numbers, an insurance company database held our medical information, and scattered records sat in the databases of utility companies and others we paid regularly. Security was just as important then, but it required a different approach because data followed a fairly rigid lifecycle of in-and-out usage: it mostly sat “in” a repository and was pulled out only when needed to bill us or update a record.
But look at this very, very small list of things that can collect and transact private data:
- Wearable health devices
- Door locks and doorbells
- Furbys (yes, those creepy toys)
OK, to be fair, Furbys don’t store your SSN or blood type. But they transmit data about your usage and location, just as Ring and Nest devices can detect when people are and are not home. Where sensitive, personal data used to be the domain of software alone, it is now used to operate and deliver value for physical devices that embed software functionality to personalize them for their users.
As Kate Fazzini reported in the Wall Street Journal, a casino in North America was the victim of a cyberattack conducted through a remotely controlled decorative fish tank. Surveillance cameras and even air-conditioning systems have likewise been used to gain access to internal systems and data. The combination of physical technological capabilities and easy connectivity through APIs and cloud platforms means that data is accessible at an ever-growing number of endpoints. Those endpoints are no longer hidden inside a database; sometimes they are the physical structure of the database itself.
These physical devices contain data, depend on data, and are connected, which means they are also potential threats to your organization’s security. Software is no longer the thing behind the scenes that holds data in esoteric, binary scripts. This creates new threat opportunities because of the growing number of endpoints these devices provide. Those endpoints enable the interaction between software and devices and have created the IoT market, but they also increase the surface area for attacks.
In many organizations, physical and cyber security are treated as separate functions, but that’s no longer a sustainable approach. First off, software is really driving the hardware; it’s the digital elements that make IoT devices and physical resources more than dumb boxes. More importantly, an organization’s security mindset must extend across everything it does.
As with any technology, part of the onus of security lies with users who must be savvy about the information they provide and the tools they choose to use. But organizations must realize that devices, some of which are necessary to complete one’s job, can be as vulnerable as an unprotected server or a phishing email.
Consider the case of Strava, whose app was used by many U.S. military personnel. The app is built around community activity and sharing data about locations and workouts. Through no inherent fault of the technology, the app inadvertently revealed the locations of military bases and the patrol routes used by active-duty soldiers. One might reasonably guess that, had that information been handled by a traditional enterprise application, it would surely have been locked down. But in this case, a conveniently wearable technology that delivered valuable information to the military wasn’t given the security controls it needed. How many other technology advancements operate the same way? Convenience leads to rapid adoption, and the more rapid the adoption, the less likely the technology has been put through the rigorous protocols and policies that govern more commonplace types of applications.
Strava provides a global heat map of the exercise activity of its tens of millions of users in almost 200 countries who wear Fitbits and other personal exercise wearables. According to Wired, some military and intelligence workers using Strava and the corresponding devices were revealing their locations and daily patterns. This could help bad actors pinpoint strategic and even secret locations, creating national security vulnerabilities and potentially putting individuals and countries in harm’s way.
Once the initial discovery was made, researchers and amateur analysts began to uncover a massive number of international examples tying Strava activity to military and intelligence agency operations. A journalist for the Daily Beast described how Chinese military analysts would be able to track a Taiwanese soldier’s daily activities at a publicly known missile base, along with travel to unknown locations. The thinking is that as soldiers rotate shifts, they move among different missile bases. Bases that are not publicly known could easily be detected by identifying the soldier’s location.
It appears that U.S. military policies allow fitness wearables, and the U.S. Army has even encouraged the use of Fitbits in a sanctioned pilot program. The failure here is in not thinking through the ramifications of how and where data is now being used. It’s in using old thinking that treats devices and the physical components of technology as dumb boxes. They are most certainly no longer that; with the rise of IoT and even artificial intelligence, the footprint of sensitive data broadens continuously and must be addressed with the same mindset used for all other types of security.
To some degree, these are missteps in a burgeoning field, and improvements will be made. But a good first step is for security teams to adopt a collaborative and supportive strategy and to recognize that if any part of the organization is at risk, the whole operation is vulnerable.