De-Identification 101: How to Protect Private Health Information
The growing use of electronic medical records, electronic insurance claims processing, and other hospital software systems has led to a rise in the collection and storage of personal health information. Beyond the provision of patient care, this information can be invaluable in driving innovative research and providing new insights into challenging healthcare problems.
De-Identification 101 is designed to give you a broad overview of what de-identification is, what the legal requirements are, and how it can impact healthcare. Learn to manage risk, stay compliant, and protect the privacy of individuals.
This primer is part one of our De-identification series. You’ll get an overview of current legislation, up-to-date de-identification strategies, and tips for minimizing attacks on your data, both malicious and inadvertent.
“Without this technology a lot of research we want to do would grind to a halt.”
– Dr. Mark Walker, Scientific Director and Co-director of the BORN Registry
Situation: The California Consumer Privacy Act inspired Comcast to evolve the way it protects the privacy of customers who consent to share personal information with it.
Situation: Integrate.ai’s AI-powered tech helps clients improve their online experience by sharing signals about website visitor intent. They wanted to ensure privacy remained fully protected within the machine learning / AI context that produces these signals.
Situation: Novartis’ digital transformation in drug R&D drives its need to maximize the value of vast stores of clinical study data for critical internal research, enabled by its data42 platform.
Situation: CancerLinQ™, a subsidiary of the American Society of Clinical Oncology, is a rapid learning healthcare system that helps oncologists aggregate and analyze data on cancer patients to improve care. To achieve this goal, they must de-identify patient data provided by subscribing practices across the U.S.
Situation: Needed to ensure the primary market research process was fully compliant with internal policies and regulations such as GDPR.
Situation: Needed to enable AI-driven product innovation with a defensible governance program for the safe and responsible use of voice-to-text data under Schrems II.
This course runs on the second Wednesday of every month at 11 a.m. ET (45 minutes). Register and select the date that works best for you.