Perspectives on Health Data De-identification
Has your organization been considering de-identification? Using Protected Health Information (PHI) for secondary purposes is not a simple undertaking. Working with PHI carries significant risk: legal, financial, and reputational. The costs of a breach are staggering, yet there are valid reasons to use health data for secondary purposes. Understanding the risks matters, but so does understanding the reasons for releasing PHI and the methods for doing so safely and responsibly.
Exploring this topic is not new to us. In fact, we created the white paper, Perspectives on Health Data De-Identification, with the intention of delving deeper into recurring themes. In true salon fashion, we gathered three pervasive topics into one place. The pieces feature the voice of Dr. Khaled El Emam, a leading expert on HIPAA's Expert Determination method and a strong advocate for a risk-based approach.
In the first article, On the Limits of Safe Harbor De-Identification, Dr. El Emam takes a critical view of the Safe Harbor standard that so many organizations rely on. He looks specifically at its lack of risk measurement, which is key to stopping adversaries from reversing Safe Harbor techniques and re-identifying individuals in datasets.
In Benefiting from Big Data while Protecting Individual Privacy, he reinforces the fact that we live in the age of Big Data, but also of big privacy. The two can co-exist, but only when the right protections are placed on PHI.
His last article, De-Identification and Data Masking, explains the key differences between these two methods. He reviews the right and wrong techniques for effective masking, then steers his audience into a discussion of why masking alone is not enough for most organizations. He wraps up with key takeaways to remember when building a defensible strategy.
Discussion around de-identification has long been limited to small circles. Come on in and learn more from the experts.