Schrems II Decision: Incentive to Anonymize

By Jordan Collins

The EU-U.S. Privacy Shield was a framework for transferring personal data from the EU to the U.S. while complying with EU data protection requirements. Since the program’s inception, more than 5,400 companies had signed up, including over 1,000 in the last year. The Court of Justice of the European Union (CJEU) has now declared the EU-U.S. Privacy Shield invalid (JUDGMENT OF THE COURT (Grand Chamber)), a ruling known as the Schrems II decision. The Privacy Shield no longer serves as a legal basis for the transfer of personal data from Europe to the U.S., and no grace period has been announced for its invalidation, which leaves companies with transfers to the U.S. scrambling in legal limbo. (What Privacy Shield organizations should do in the wake of ‘Schrems II’)

What about contracts?

The CJEU’s decision to no longer recognize the Privacy Shield as a legal basis rests, in part or in whole, on the fact that the U.S. government can access data essentially at will under the guise of national security. While the court affirmed the validity of standard contractual clauses, companies must now revisit these contracts, verifying on a case-by-case basis whether the laws of the recipient country ensure adequate protection. It is also unclear how standard contractual clauses could prevent the U.S. government from taking data under the guise of national security. (DPC statement on CJEU decision; Heavy times for the international exchange of data)

The ruling thereby creates a great deal of uncertainty for any EU-based firm wishing to send data to the U.S. In fact, it portends broader changes for EU companies operating in any jurisdiction where government surveillance of data is perceived as a significant risk. Countries other than the U.S. – China or India, for instance – also have potentially expansive surveillance powers.

For standard contractual clauses to be employed, it appears a legal opinion must be sought in the relevant jurisdiction, on a case-by-case basis, to assess the data recipient’s legal system. Such analysis could deem the legal system inadequate, especially in countries with potentially expansive surveillance powers, necessitating the search for alternative data-sharing mechanisms. In any case, these analyses will carry additional costs and likely extensive delays.

How could anonymization help?

As organizations await clear guidance on how to legally transfer personal data from the EU to the U.S., some will look for alternatives to address these new challenges. One such alternative is to transform the data prior to sharing, using a statistical approach to anonymize data based on all the means reasonably likely to be used to identify an individual directly or indirectly. Such transformed data would fall outside of the present ruling and other GDPR obligations, while supporting innovative uses of data such as new research outcomes and improved features or services.

A statistical approach to anonymization, when done properly, renders the data non-identifiable while accounting for the context of the data sharing scenario (Watch: How Does Risk-Based Anonymization Work). Under this approach, the strength of contractual agreements, the possibility of government surveillance, and other environmental risks and controls can be accounted for to determine the right level of transformation required to render the data anonymous.
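As a toy illustration of the risk-measurement idea (not any vendor’s specific method, and with invented field names and thresholds), one common way to quantify re-identification risk is the size of the smallest group of records sharing the same quasi-identifier values; generalizing those values until every group reaches a minimum size is the classic k-anonymity approach:

```python
from collections import Counter

# Hypothetical dataset: (age, zip) act as quasi-identifiers.
records = [
    {"age": 34, "zip": "10115"},
    {"age": 36, "zip": "10117"},
    {"age": 35, "zip": "10119"},
    {"age": 52, "zip": "20095"},
    {"age": 53, "zip": "20097"},
    {"age": 51, "zip": "20099"},
]

def smallest_group(rows, keys):
    """Size of the smallest equivalence class on the given quasi-identifiers."""
    counts = Counter(tuple(r[k] for k in keys) for r in rows)
    return min(counts.values())

def generalize(row):
    """Coarsen quasi-identifiers: 10-year age bands, 3-digit ZIP prefixes."""
    return {"age": f"{row['age'] // 10 * 10}s", "zip": row["zip"][:3]}

# Raw data: every record is unique on (age, zip), i.e. worst-case risk.
print(smallest_group(records, ["age", "zip"]))   # 1

# After generalization, each group has at least 3 members (3-anonymity).
coarse = [generalize(r) for r in records]
print(smallest_group(coarse, ["age", "zip"]))    # 3
```

In a contextual, risk-based approach, the minimum group size demanded (and how aggressively values are coarsened) would depend on the sharing environment: stronger contractual and security controls can justify less transformation, weaker ones more.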

Simple pseudonymization—the masking or removal of direct identifiers from the data set—although necessary, is not sufficient to anonymize data (Watch: Data-masking, De-identification and Anonymization). Pseudonymized data is still personal data under GDPR. The key to a successful approach to statistical anonymization resides in the rigor of the method, as well as the justification and documentation of the approach, to demonstrate that enough has been done to protect the data from all reasonable threats, potentially including the possibility of government surveillance.
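A minimal sketch of why masking direct identifiers alone falls short (the data, field names, and salt are illustrative): after replacing names with salted hashes, the remaining quasi-identifiers can still single out each individual for anyone holding an outside register with the same attributes.

```python
import hashlib

# Hypothetical records: "name" is a direct identifier;
# (birth_year, zip, sex) are quasi-identifiers.
records = [
    {"name": "Alice Meyer", "birth_year": 1984, "zip": "10115", "sex": "F"},
    {"name": "Bob Schmidt", "birth_year": 1991, "zip": "10117", "sex": "M"},
]

def pseudonymize(row):
    """Replace the direct identifier with a salted hash; keep all else intact."""
    out = dict(row)
    out["name"] = hashlib.sha256(
        ("secret-salt" + row["name"]).encode()
    ).hexdigest()[:12]
    return out

pseudo = [pseudonymize(r) for r in records]

# The quasi-identifiers survive untouched: each record remains unique on
# (birth_year, zip, sex), so the data can still be linked back to a person.
quasi = [(r["birth_year"], r["zip"], r["sex"]) for r in pseudo]
print(len(set(quasi)) == len(quasi))  # True: every record still singled out
```

This is precisely why pseudonymized data remains personal data under GDPR: the transformation must also address the indirect identifiers before the data can be considered anonymous.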

Prior to the Schrems II decision, statistical anonymization was already a viable option exercised by many organizations. With this new ruling, the approach is likely to become more attractive, especially as the costs and delays of analyzing standard contractual clauses on a case-by-case basis mount.

Conclusion

Businesses no longer have the Privacy Shield as a legal basis, and the standard contractual clauses are on shaky ground. As the Hamburg DPA points out, it is inconsistent for the CJEU to strike down the Privacy Shield while upholding the standard contractual clauses, because a contract cannot remedy the shortcomings that invalidated the Privacy Shield. (Heavy times for the international exchange of data) The DPA states:

If the invalidity of the Privacy Shield is primarily due to the escalating secret service activities in the U.S.A., the same must also apply to the standard contractual clauses. Contractual agreements between data exporter and importer are equally unsuitable to protect those affected from state access.

For organizations struggling to find the right guidance from their DPAs, or where the legal analysis of their standard contractual clauses seems to impose an excessive financial burden, statistical anonymization might just be the right fit.
