A variety of tools known collectively as “health tech” offer promise in improving patient care and operating efficiency, but widespread misconceptions about privacy laws and how data can be used are holding back adoption of these tools by healthcare providers.
Technologies such as remote care, as well as the use of artificial intelligence and machine learning in areas including diagnostics, patient management and medical research, offer compelling advantages in helping providers deliver (and patients receive) care more efficiently by reducing manual effort and potential errors.
However, there is widespread misunderstanding among providers and health tech companies about how privacy regulations such as the U.S. Health Insurance Portability and Accountability Act (HIPAA) and the European Union’s General Data Protection Regulation (GDPR) actually apply. These privacy misconceptions are a significant roadblock for innovation in the healthcare industry.
The term “HIPAA” has become a generalized shorthand for “privacy regulation,” with many people believing incorrectly that the law prohibits organizations from accessing or sharing personal information. In reality, HIPAA applies only to covered entities (healthcare providers, health plans and healthcare clearinghouses) and their business associates.
HIPAA does not prohibit the use or sharing of protected health information (PHI). Instead, the law seeks to balance the public interest in promoting medical research and efficient care against patients’ rights to privacy by setting requirements for the use and disclosure of PHI.
A medical record without an associated patient name is not automatically anonymous if other data could indicate the patient’s identity.
For instance, MRI images shared between a provider and its business associates may not carry a patient name, but the health data in the images provides a reasonable basis to believe the patient could be identified. Such image files must therefore be protected appropriately under applicable privacy laws (e.g., HIPAA or GDPR).
Pseudonymizing personal data does not take it out of scope of privacy laws such as HIPAA or GDPR, because pseudonymization can potentially be reversed.
Under GDPR, data is considered anonymous only if it has been blurred sufficiently to prevent singling out an individual, linking records related to an individual, or inferring information about a specific data subject.
Under HIPAA, pseudonymization refers to encryption and scrambling techniques, which can be reversed. Anonymization, by contrast, requires data-blurring techniques that make it impossible to identify an individual.
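The distinction can be made concrete with a short sketch. The record fields, token scheme and generalization rules below are purely illustrative (not drawn from any real system or regulatory standard): pseudonymization keeps a re-identification key, so the data stays in scope of privacy law, while anonymization drops direct identifiers and coarsens quasi-identifiers irreversibly.

```python
import secrets

# Hypothetical patient record; field names are illustrative only.
record = {"name": "Jane Doe", "age": 34, "zip": "94110", "diagnosis": "asthma"}

# Pseudonymization: reversible. A lookup table (or encryption key) maps the
# token back to the identity, so the data remains regulated personal data.
token_map = {}

def pseudonymize(rec):
    token = secrets.token_hex(8)
    token_map[token] = rec["name"]  # re-identification key survives
    out = dict(rec)
    out["name"] = token
    return out

# Anonymization: irreversible "blurring". Direct identifiers are dropped and
# quasi-identifiers generalized so an individual cannot be singled out,
# linked, or inferred (the GDPR test described above).
def anonymize(rec):
    out = dict(rec)
    del out["name"]                              # drop direct identifier
    out["age"] = f"{(rec['age'] // 10) * 10}s"   # exact age -> decade band
    out["zip"] = rec["zip"][:3] + "**"           # coarsen location
    return out

pseudo = pseudonymize(record)
anon = anonymize(record)
```

After running this, `token_map[pseudo["name"]]` recovers "Jane Doe", which is exactly why pseudonymized data remains in scope; the anonymized record retains no name and only a generalized age band and partial zip code.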
A health tech company processing patient data on behalf of a healthcare provider cannot assume that it owns that data or that it may share the data with other organizations as it sees fit.
Agreements between organizations will dictate the responsibilities of each partner to process and store data appropriately under the provisions of the applicable laws. Health tech companies acting on behalf of healthcare providers should understand and strictly follow their instructions with respect to the use and disclosure of personal data.
Many providers and health tech companies inaccurately believe they cannot use information without specific patient permission. But HIPAA and GDPR both contain provisions for processing patient data on other legal grounds depending on the circumstances. These legal grounds should be reviewed carefully before concluding that processing is not permitted.
Breaking down these common misconceptions is the first step toward a proper understanding of privacy laws. This understanding can help healthcare organizations and ambitious health tech startups deploy innovative health-related solutions to improve patient care, medical research and operational efficiency in the healthcare industry.
If you have questions or want to learn more about how to protect your data and maintain regulatory compliance, contact our data privacy experts.