The Definitive Guide to De-identification
More and more organizations are leveraging their health data for secondary purposes. They recognize that simply locking data away in silos will no longer suffice. De-identification can solve a host of healthcare's most challenging issues, but it needs to be done properly first.
Under HIPAA's Privacy Rule there are two paths to de-identification: Safe Harbor and Expert Determination. Safe Harbor is a prescriptive approach; it can be set up easily and keeps the organization HIPAA compliant. For the secondary purposes mentioned above, however, the Expert Determination method offers the chance to account for the nuances in the data and apply the right techniques.
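To make the prescriptive nature of Safe Harbor concrete, here is a minimal sketch of the kinds of transformations it requires: dropping direct identifiers, reducing dates to the year, and truncating ZIP codes to three digits. The column names are hypothetical, and this is an illustration of the idea rather than a complete Safe Harbor implementation (for example, the small-population ZIP exception is not checked).

```python
import pandas as pd

def safe_harbor_sketch(df: pd.DataFrame) -> pd.DataFrame:
    """Illustrative Safe Harbor-style transformations on a hypothetical patient table."""
    # Remove direct identifiers (hypothetical column names)
    out = df.drop(columns=["name", "phone", "email"], errors="ignore")
    # Reduce dates to the year only
    out["birth_year"] = pd.to_datetime(out.pop("birth_date")).dt.year
    # Truncate ZIP codes to the first three digits
    out["zip3"] = out.pop("zip").astype(str).str[:3]
    return out
```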
Before applying de-identification, measuring the risk in Protected Health Information (PHI) is essential. Understanding re-identification risks from a financial, legal, and reputational standpoint is crucial to the process. By measuring risk, organizations can apply a risk-based de-identification strategy that protects the individual while still enabling the secure release of granular data.
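As a rough illustration of what "measuring risk" can look like, the sketch below estimates re-identification risk from equivalence classes over quasi-identifiers (records that share the same combination of values). The quasi-identifier names and the example threshold are assumptions for illustration, not the white paper's methodology.

```python
import pandas as pd

def reid_risk(df: pd.DataFrame, quasi_ids: list[str]) -> dict:
    """Estimate re-identification risk from equivalence-class sizes (prosecutor model)."""
    sizes = df.groupby(quasi_ids).size()   # records sharing the same quasi-identifier values
    max_risk = 1.0 / sizes.min()           # worst case: the smallest equivalence class
    avg_risk = len(sizes) / len(df)        # average of 1/class-size across all records
    return {"max_risk": max_risk, "avg_risk": avg_risk}

# Hypothetical usage: age, 3-digit ZIP, and sex as quasi-identifiers
# risk = reid_risk(patients, ["age", "zip3", "sex"])
# Data would be released only if the risk falls below the threshold chosen
# in the organization's risk assessment.
```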
You may have read our de-identification series (101, 201, 301, and 401) and are unsure where to go from there. Now that you have taken the first steps in the expert's journey, it's time to take it up a notch. The white paper, The Definitive Guide to De-Identification, authored by health data expert and Privacy Analytics' founder, Khaled El Emam, offers graduate-level education on the subject.
The Definitive Guide to De-Identification covers which organizations would benefit from a solid de-identification strategy, the risks involved, and the standard approaches under the HIPAA Privacy Rule. It then outlines the right methodology and the techniques that should be used to maximize data utility while minimizing privacy concerns.
Situation: California’s Consumer Privacy Act inspired Comcast to evolve the way in which they protect the privacy of customers who consent to share personal information with them.
Situation: Integrate.ai’s AI-powered tech helps clients improve their online experience by sharing signals about website visitor intent. They wanted to ensure privacy remained fully protected within the machine learning / AI context that produces these signals.
Situation: Novartis’ digital transformation in drug R&D drives their need to maximize value from vast stores of clinical study data for critical internal research enabled by their data42 platform.
Situation: CancerLinQ™, a subsidiary of the American Society of Clinical Oncology, is a rapid learning healthcare system that helps oncologists aggregate and analyze data on cancer patients to improve care. To achieve this goal, they must de-identify patient data provided by subscribing practices across the U.S.
Situation: Needed to ensure the primary market research process was fully compliant with internal policies and regulations such as GDPR.
Situation: Needed to enable AI-driven product innovation with a defensible governance program for the safe and responsible use of voice-to-text data under Schrems II.