Security and Connected Devices: Why you can't always take it for granted
Our personal health data is considered some of our most sensitive and precious data, even when its direct commercial value to hackers is limited, which is why regulations like HIPAA exist to protect it. While most people in the Healthcare IT field take this responsibility very seriously, there are unfortunately some who don’t. There are probably more who believe that they do, but who lack a detailed understanding of their risks, or who trust one of their vendors more than they probably should.
There are some unique privacy and security considerations in the connected device space that a responsible RPM provider must address on top of all of the standard obligations that apply to any healthcare software company. Below are some of the issues that we’ve seen come up, or that we can anticipate based upon our knowledge of the space.
This is by no means an exhaustive list, but may provide some additional starting points for assessing potential security risks of your connected device operations.
Do your devices store PHI?
Many connected devices, including most cellular-based devices, do not store any PHI. If a device does store PHI for some reason, this amplifies the risk. The security around this may be difficult for a SaaS-type company to assess, as it may require an audit of the device’s firmware.
Are all of your suppliers HIPAA compliant and willing to sign a BAA?
If you are providing your software to covered entities, then you are covered by HIPAA, and any of your suppliers that touch patient PHI must also comply with HIPAA. This means that they must also sign a BAA. Some companies may be tempted to take shortcuts by working with vendors that will not do this, including vendors that sync data from Bluetooth-based devices, but this is generally not okay. In order to be compliant, every company in the chain of PHI must sign a BAA.
For example, if you are planning to use a Bluetooth-based device that connects to a mobile app, then you must have a BAA in place with the company that produces the app and the device in order to remain compliant with HIPAA. The same goes for any type of intermediary company that pushes patient data and PHI through their platform (e.g. Apple Health Kit). Unfortunately, the salespeople at these companies are often either unaware of these issues or unwilling to tell you until after you’ve invested in performing an integration with them, at which point you may find out either 1) they are unable to sign a BAA or 2) there is a significant fee for doing so. It’s not always their fault - in many cases these companies are used to working with direct-to-consumer or general wellness applications, to which HIPAA may not actually apply. In general, if you are working with healthcare providers (aka “covered entities”), then HIPAA applies to you. If you’re not certain, then it’s generally better to assume that you and all of your vendors need to comply.
Are your suppliers adequately secure?
When we dug into it, we were disappointed to find that many of the legacy suppliers in our space were overstating their device security: specifically, the ability to authenticate that a device is what it says it is didn’t actually exist.
If this vulnerability were discovered by the right hacker, the implications could be really severe–any RPM software syncing data from these devices could be fed false data and would have no way to know it. The result could be disastrous and dangerous - thousands of false readings could be created, and there would be no way of knowing which of the millions of data points being tracked are real and which are false. In other words, it could render the entire industry worthless.
When our CTO, Aziz, realized this, he calmly but firmly required all of the participating companies to fix their issues, which were rooted in their implementation of their certificate system. This system requires both parties to exchange mutually verified certificates. We also automatically rotate the certificates of our service on a yearly basis, which is an industry best practice to maximize security.
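The article doesn’t name the exact protocol involved, but assuming the “mutually verified certificates” exchange works like standard mutual TLS, the critical check can be sketched in a few lines with Python’s ssl module. This is a minimal illustration, not any vendor’s actual implementation; the file paths in the comments are hypothetical placeholders.

```python
import ssl


def make_mtls_context(purpose: ssl.Purpose) -> ssl.SSLContext:
    """Build a TLS context that rejects any peer lacking a valid certificate.

    A sketch of the mutual-verification idea: both the device (client) and
    the service (server) must present a certificate the other side can
    verify, so an attacker cannot simply impersonate a device.
    """
    ctx = ssl.create_default_context(purpose)
    # The kind of check the faulty suppliers skipped: require the peer to
    # present a certificate and verify it against a trusted CA.
    ctx.verify_mode = ssl.CERT_REQUIRED
    # Refuse legacy protocol versions while we're at it.
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx


# Server side: demand that each connecting device prove its identity.
server_ctx = make_mtls_context(ssl.Purpose.CLIENT_AUTH)
# In a real deployment you would also load your own identity and the CA
# that signs device certificates (hypothetical paths):
# server_ctx.load_cert_chain("service.crt", "service.key")
# server_ctx.load_verify_locations("device-ca.pem")
```

A handshake against this context fails unless the device presents a certificate chaining to the loaded CA, which is exactly the guarantee that was missing when the checks were silently disabled.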
In this case, the companies claimed to have a certificate system, but after closely inspecting their codebase, Aziz confirmed that a number of critical security checks were not fully enforced, as their own developers admitted. They had been faking it across hundreds, thousands, or maybe millions of devices already released into the field. Their customers, who distributed these devices to other RPM companies, either never noticed or decided not to worry about it. They were “industry people,” but they were non-technical and didn’t employ top software engineers in-house, resulting in the type of vulnerability that could take down an entire company, or perhaps even an industry.
In this case there was no way for the RPM company to know this unless they inspected the codebase throughout the entire chain of information. They were being (falsely) told that the level of security was much higher than it really was.
Fortunately, Aziz was able to provide these suppliers with direction to ensure that our bar for security was met, even though it meant extra time and work, and even though it amounted to providing some valuable free consulting to them. These technical improvements are now available to all of the companies in the space and we hope that they are adopted.
Do your suppliers use offshore subcontractors to develop their software?
While it’s not illegal, companies that are highly concerned with security often insist on hiring their own developers in-house. There is a logical reason for this: it’s far more difficult to control policies and processes when a software developer is located on the other side of the planet, and if U.S. privacy or security laws are broken, they may be difficult or even impossible to enforce.
That’s why many health systems and medical groups prohibit purchasing software that is developed or maintained by non-U.S.-based developers as a rule of thumb. Even if it’s not technically illegal, it may be a smart business move to ensure that all of the software you use that touches PHI is developed and maintained in the U.S. It may end up leaving more doors open for business as your company expands and grows. If your suppliers do use offshore developers, then it may be wise to ask to see their credentials and to examine the BAA that they have in place with them.