RWE and Risk: A Primer
As pharma companies have become more sophisticated in their use of real-world evidence (RWE), they have moved beyond standard vendor datasets to access a deeper and wider variety of data, often working in partnership with local health systems. Indeed, leading pharma companies have built comprehensive networks and data platforms that can give teams across the organization a shared understanding of what is actually happening in healthcare. The growing number and variety of datasets analyzed, including novel sources ranging from social media to medical imaging, are delivering ground-breaking insights.
However, accessing a growing range of data sources will necessitate new capabilities, including the critical ability to protect patient privacy. Many pharma companies cite protecting privacy as a primary imperative in building their RWE capabilities, but also as a key barrier to progress. Real-world data (RWD) is patient-level data drawn from a variety of sources, all of which contain varying amounts of protected health information (PHI). Removing PHI is a critical first step before this data can be used in RWE analysis. The challenge is how to effectively anonymize the data without diminishing its quality across a rapidly growing number of contexts.
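To make the PHI-removal step concrete, here is a minimal, illustrative scrubbing pass over free-text fields. The specific patterns (SSNs, dates, phone numbers) and the bracketed replacement tokens are assumptions for the sketch; production de-identification pipelines use far broader rule sets plus NLP-based entity recognition.

```python
import re

# Hypothetical PHI-scrubbing sketch: replace a few common identifier
# patterns in clinical free text with labeled placeholder tokens.
PHI_PATTERNS = {
    "SSN":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "DATE":  re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def scrub(text: str) -> str:
    """Replace each matched PHI pattern with its label, e.g. [SSN]."""
    for label, pattern in PHI_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

note = "Pt seen 03/14/2021, SSN 123-45-6789, callback 415-555-0134."
clean = scrub(note)
# clean no longer contains the date, SSN, or phone number
```

Rule-based scrubbing like this is only a first pass; names, addresses, and rare conditions require statistical and model-based techniques, which is where the risk-based approach described below comes in.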
Fortunately, new software-enabled capabilities now exist to address this urgent challenge. Best-practice approaches and guidelines have emerged advocating a risk-based approach to de-identification in order to balance the competing goals of anonymity and data quality. Leveraging an automated de-identification process that uses a risk-based methodology ensures a continuous, and legally compliant, flow of data for RWE analysis.
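The core idea of a risk-based methodology can be sketched in a few lines: measure re-identification risk as 1/k, where k is the size of the smallest group of records sharing the same quasi-identifiers, then generalize those fields only until the risk falls below a threshold. The field names, the generalization rule (truncating ZIP codes, bucketing ages), and the threshold are all assumptions for illustration, not a specific vendor's method.

```python
from collections import Counter

RISK_THRESHOLD = 0.34  # assumed policy: every group must have k >= 3

def generalize(record):
    """Coarsen quasi-identifiers one step:
    5-digit ZIP -> 3-digit prefix, then exact age -> 10-year band."""
    rec = dict(record)
    if len(rec["zip"]) > 3:
        rec["zip"] = rec["zip"][:3]
    else:
        rec["age"] = (rec["age"] // 10) * 10
    return rec

def max_risk(records, quasi_ids=("zip", "age")):
    """Max re-identification risk = 1 / size of the smallest
    equivalence class over the quasi-identifiers."""
    classes = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return 1.0 / min(classes.values())

def deidentify(records, max_rounds=5):
    """Generalize just enough: stop as soon as risk is acceptable,
    preserving as much data quality as the threshold allows."""
    for _ in range(max_rounds):
        if max_risk(records) <= RISK_THRESHOLD:
            break
        records = [generalize(r) for r in records]
    return records

patients = [
    {"zip": "94110", "age": 34}, {"zip": "94112", "age": 37},
    {"zip": "94114", "age": 31}, {"zip": "10001", "age": 52},
    {"zip": "10002", "age": 58}, {"zip": "10003", "age": 55},
]
safe = deidentify(patients)
```

The design point this sketch illustrates is the balance the text describes: rather than stripping or coarsening every field uniformly, a risk-based process generalizes incrementally and stops at the first state that satisfies the risk threshold, retaining more analytic value than a one-size-fits-all redaction.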
In this primer, we describe the techniques and software platforms, and highlight example use cases of how pharma companies can take sustainable and secure approaches to accessing new datasets and building RWE data networks.
Situation: California’s Consumer Privacy Act inspired Comcast to evolve the way in which they protect the privacy of customers who consent to share personal information with them.
Situation: Integrate.ai’s AI-powered tech helps clients improve their online experience by sharing signals about website visitor intent. They wanted to ensure privacy remained fully protected within the machine learning / AI context that produces these signals.
Situation: Novartis’ digital transformation in drug R&D drives their need to maximize value from vast stores of clinical study data for critical internal research enabled by their data42 platform.
Situation: CancerLinQ™, a subsidiary of American Society of Clinical Oncology, is a rapid learning healthcare system that helps oncologists aggregate and analyze data on cancer patients to improve care. To achieve this goal, they must de-identify patient data provided by subscribing practices across the U.S.
Situation: Needed to ensure the primary market research process was fully compliant with internal policies and regulations such as GDPR.
Situation: Needed to enable AI-driven product innovation with a defensible governance program for the safe and responsible use of voice-to-text data under Schrems II.