Differential Privacy and Risk Metrics for Creating Safe Data
Sensitive data can be reused in many ways to improve healthcare services, uncover new insights and opportunities that can influence healthcare strategies, and develop data products that address societal health needs. Health data is particularly sensitive because it can reveal a great deal about an individual's medical history and lifestyle.
There are many dimensions to the safe and responsible reuse of data, which can also be thought of in terms of defense in depth, i.e., protecting data from unauthorized access and misuse through layers of administrative and technical controls. Technical privacy models are one such control, as they are used to assess the risk of disclosure and determine appropriate data transformations that will mitigate those risks.
Differential privacy is a technical privacy model that protects individuals by requiring that the information contributed by any single individual does not significantly affect the output of an analysis. More specifically, differential privacy is a mathematical property that defines an adjustable information limit, conventionally denoted ε (epsilon): a mechanism M is ε-differentially private if, for any two datasets D and D′ differing in one person's record and any set of outputs S, Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D′) ∈ S].
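To make the idea concrete, below is a minimal sketch of the classic Laplace mechanism applied to a bounded mean. It is an illustration only, not Privacy Analytics' implementation: the function name, the clamping bounds, and the choice of ε are all assumptions for the example, and the sensitivity calculation assumes neighboring datasets that differ by replacing one record.

```python
import numpy as np

def laplace_mean(values, lower, upper, epsilon):
    """Release a differentially private mean of `values`.

    Each value is clamped to [lower, upper] so that any one
    individual's contribution is bounded, then Laplace noise
    calibrated to that bound (the sensitivity) divided by
    epsilon is added to the true mean.
    """
    clamped = np.clip(values, lower, upper)
    n = len(clamped)
    # Replacing one record changes the clamped mean by at most
    # (upper - lower) / n, which is the query's sensitivity.
    sensitivity = (upper - lower) / n
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return clamped.mean() + noise

# Illustrative use: a private mean of ages with epsilon = 1.0.
ages = [34, 29, 41, 56, 38, 47, 62, 33]
print(laplace_mean(ages, lower=0, upper=100, epsilon=1.0))
```

Lowering ε adds more noise and gives a stronger guarantee; raising it yields more accurate outputs at the cost of a looser information limit. Tuning this trade-off is what makes the limit "adjustable."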
Situation: The California Consumer Privacy Act (CCPA) inspired Comcast to evolve the way in which they protect the privacy of customers who consent to share personal information with them.
Situation: Integrate.ai’s AI-powered tech helps clients improve their online experience by sharing signals about website visitor intent. They wanted to ensure privacy remained fully protected within the machine learning / AI context that produces these signals.
Situation: Novartis’ digital transformation in drug R&D drives their need to maximize value from vast stores of clinical study data for critical internal research enabled by their data42 platform.
Situation: CancerLinQ™, a subsidiary of the American Society of Clinical Oncology (ASCO), is a rapid learning healthcare system that helps oncologists aggregate and analyze data on cancer patients to improve care. To achieve this goal, they must de-identify patient data provided by subscribing practices across the U.S.
Situation: A client needed to ensure that its primary market research process was fully compliant with internal policies and with regulations such as the GDPR.
Situation: A client needed to enable AI-driven product innovation with a defensible governance program for the safe and responsible use of voice-to-text data under Schrems II.