Privacy, Mobility, & Policy
Questions of how re-identifiable mobility data is, and whether individual location traces can be sufficiently anonymized, have become increasingly prominent as governments around the world seek to use mobility data for policy and regulatory purposes. To demystify this issue, Privacy Analytics recently partnered with Uber on a study quantifying the privacy risks associated with government access to individual-level mobility data. This webinar expands on the findings of that collaboration and places the research in the broader context of the mobility sector.
Watch the webinar and download a copy of Privacy Analytics’ summary report, which evaluates the re-identification risks of Uber’s dataset for the California Public Utilities Commission (CPUC).
Uttara Sivaram is the Head of Privacy and Security Public Policy at Uber. Her work is focused on developing data protection standards that can be implemented by public, private, and non-profit sectors alike. In this context, Uttara leads Uber’s initiatives in pushing forward research and discourse on data governance, de-identification, and differential privacy techniques with leading research and advocacy organizations around the world. Prior to joining Uber, Uttara worked in the energy sector, developing data and energy-efficiency products for public utilities and contributing to research on emissions modeling at the Carnegie Endowment for International Peace in Washington D.C. Uttara holds Bachelor’s and Master’s degrees in Public Policy from Stanford University.
Uttara Sivaram
Global Head of Privacy & Security Public Policy, Uber
Robin Horton is a policy data science manager at Uber focused on regulatory and legal data reporting. She coordinates and oversees data productions to U.S. and international regulators and other entities, and advises on related policy, privacy, and legal issues. She previously worked as an attorney at Ropes & Gray LLP and as an Assistant Attorney General in Massachusetts.
Robin Horton
Policy Data Science Manager, Uber
Brian Rasquinha draws on his expertise in privacy regulation and anonymization to determine which combination of Privacy Analytics’ services and software will best support the immediate and long-term needs of each client. Successful implementation is Brian’s key focus, whether the client’s goal is to use sensitive data to improve service delivery, drive product development, or grow revenue.
Brian Rasquinha
Associate Director, Solution Architecture, Privacy Analytics
Luk Arbuckle provides strategic leadership to our clients and to our organization on how to responsibly share and use data. He draws on an extensive background in statistics, data science, and anonymization, and on his experience on the regulatory side as Director of Technology Analysis at the Office of the Privacy Commissioner of Canada. Clients rely on Luk to help define the architecture that will enable them to meet their privacy obligations while supporting innovative and scalable uses of their data.
Luk Arbuckle
Chief Methodologist, Privacy Analytics