Solutions for Complex Data Environments
The reuse of personal data is often essential to an organization’s data strategy, but organizations can only reuse personal data if they meet the demands of privacy regulations. Fulfilling these demands in an ad hoc way creates bottlenecks and blocks organizations from unlocking the full potential of their data.
A well-designed risk-based anonymization solution can effectively address an organization’s needs while fulfilling regulatory requirements. A risk-based approach replaces an otherwise subjective gut check with guided decision-making that is scalable and proportionate, resulting in solutions that keep data useful while ensuring it is sufficiently protected.
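As a purely illustrative sketch of what "risk-based" can mean in practice (this is not Privacy Analytics' actual methodology), a k-anonymity-style check estimates worst-case re-identification risk from the smallest group of records sharing the same quasi-identifiers, and compares it against a pre-agreed threshold rather than a gut feeling. The field names, records, and threshold below are hypothetical.

```python
# Illustrative, simplified risk check (assumed example, not a product API).
# Records are grouped by quasi-identifiers; the worst-case re-identification
# risk is 1 / (size of the smallest group), and data is considered releasable
# only when that risk does not exceed an agreed threshold.
from collections import Counter

def max_reidentification_risk(records, quasi_identifiers):
    """Worst-case risk based on the smallest equivalence class."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return 1 / min(groups.values())

def safe_to_release(records, quasi_identifiers, threshold=0.09):
    # 0.09 (~1 in 11) is only a placeholder; in a risk-based approach the
    # threshold is chosen per context (data sensitivity, recipient, controls).
    return max_reidentification_risk(records, quasi_identifiers) <= threshold

records = [
    {"age": "30-39", "zip": "902**"},
    {"age": "30-39", "zip": "902**"},
    {"age": "40-49", "zip": "100**"},  # a group of one: worst-case risk 1.0
]
print(safe_to_release(records, ["age", "zip"]))  # False
```

The point of the sketch is the decision structure: an explicit, repeatable measurement compared against a documented threshold, which is what makes the process scalable and defensible.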
This paper describes anonymization as risk management and presents the Five Safes, an established data-sharing framework that can be used to integrate anonymization seamlessly into an organization’s data protection and privacy processes. The paper also briefly explores real-world applications involving data lakes and hub-and-spoke data collection.
Situation: California’s Consumer Privacy Act inspired Comcast to evolve the way in which they protect the privacy of customers who consent to share personal information with them.
Situation: Integrate.ai’s AI-powered tech helps clients improve their online experience by sharing signals about website visitor intent. They wanted to ensure privacy remained fully protected within the machine learning / AI context that produces these signals.
Situation: Novartis’ digital transformation in drug R&D drives their need to maximize value from vast stores of clinical study data for critical internal research enabled by their data42 platform.
Situation: CancerLinQ™, a subsidiary of the American Society of Clinical Oncology, is a rapid learning healthcare system that helps oncologists aggregate and analyze data on cancer patients to improve care. To achieve this goal, they must de-identify patient data provided by subscribing practices across the U.S.
Situation: Needed to ensure the primary market research process was fully compliant with internal policies and regulations such as GDPR.
Situation: Needed to enable AI-driven product innovation with a defensible governance program for the safe and responsible use of voice-to-text data under Schrems II.