Monday, June 9
 

7:30am PDT

Continental Breakfast
Monday June 9, 2025 7:30am - 9:00am PDT
Mezzanine East/West

7:30am PDT

Badge Pickup
Monday June 9, 2025 7:30am - 5:30pm PDT
Santa Clara Ballroom Foyer

9:00am PDT

Opening Remarks
Monday June 9, 2025 9:00am - 9:15am PDT
Santa Clara Ballroom

9:15am PDT

Privacy Paradigms for Law Enforcement Response
Monday June 9, 2025 9:15am - 9:35am PDT
Lukas Bundonis, Netflix; Ben Ballard, MITRE


The phrase "law enforcement response" is ambiguous. It most often describes the domain of legal engineering that comprises corporate disclosure of data to government authorities in response to a legal request for information. However, this definition oversimplifies an opaque process. Law enforcement requests for information and the legal processes that structure them resist comprehension by design. Wiretap and pen register orders (real-time surveillance), lawful intercept requests, emergency disclosure/data requests, and national security letters all fall into this category. However, basic types of information requests, such as subpoenas and warrants for information, are less opaque and provide an opportunity for greater standardization of information disclosure. A discussion of law enforcement response systems in the context of data privacy therefore merits attention. This is especially true when viewed through the complementary lenses of a concept Lawfare fellow Alan Rozenshtein coined in 2018—"surveillance intermediaries"—and an increasingly aggressive series of nation-state intelligence operations targeting sensitive corporate infrastructure. This short talk will explore what intermediaries are, why they matter, some of the risks posed to their sensitive systems, and what the speakers believe we all can do as privacy professionals to better defend them.


https://www.usenix.org/conference/pepr25/presentation/bundonis
Speakers
Lukas Bundonis

Netflix
Lukas Bundonis is a Senior Privacy Engineer at Netflix and the program lead for Legal and Privacy Engineering (LEAP), which comprises Subject Access Requests (SAR), data holds, law enforcement response, and other legal engineering services. He previously worked on law enforcement...
Ben Ballard

MITRE
Ben Ballard is a Senior Cybersecurity Engineer at the MITRE Corporation. Ben has served as a Google Public Policy Fellow at the Electronic Frontier Foundation, an X-Force Fellow with the National Security Innovation Network, and a cybersecurity fellow at the Citizen Lab at the Munk...
Santa Clara Ballroom

9:35am PDT

Remediating Systemic Privacy Incidents
Monday June 9, 2025 9:35am - 9:55am PDT
Sam Havron, Meta

When a privacy incident occurs, our incident management process kicks in to quickly identify the root cause, mitigate the issue, and conduct a post-mortem review. While a post-mortem helps ensure the same incident doesn't recur, we want to take a more proactive approach: preventing similar incidents and enhancing our privacy posture. This effort, however, can face challenges stemming from a lack of visibility and insufficient metrics. Incident owners may not be aware of similar incidents that require joint analysis, leading to missed systemic root causes. Furthermore, we may lack the measurements needed to determine how frequently similar incidents occur, and thus to assess the effectiveness of our prevention efforts. To address these challenges, we've developed a program along with tooling to identify, analyze, and remediate systemic privacy incidents. In this talk, we'll cover our approach to tackling these clusters, including:

  • Automated Cluster Identification: Using heuristic and LLM-based methods to automatically identify clusters
  • Analysis and Remediation: Analyzing prioritized systemic clusters and holding teams accountable for remediation
  • Regression Alerting: Implementing alerting systems to detect regressions and prevent similar incidents from happening again
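The heuristic cluster-identification step above can be sketched as signature-based grouping; the field names (`root_cause`, `surface`) and the toy incidents are illustrative assumptions, not Meta's actual schema or tooling.

```python
from collections import defaultdict

def cluster_incidents(incidents, keys=("root_cause", "surface")):
    """Group incidents sharing the same heuristic signature.

    Each incident is a dict; the signature is the tuple of the chosen
    key fields (hypothetical field names, for illustration only).
    """
    clusters = defaultdict(list)
    for inc in incidents:
        signature = tuple(inc.get(k) for k in keys)
        clusters[signature].append(inc["id"])
    # Only signatures shared by more than one incident form a systemic cluster.
    return {sig: ids for sig, ids in clusters.items() if len(ids) > 1}

incidents = [
    {"id": 1, "root_cause": "stale ACL", "surface": "logging"},
    {"id": 2, "root_cause": "stale ACL", "surface": "logging"},
    {"id": 3, "root_cause": "schema drift", "surface": "analytics"},
]
print(cluster_incidents(incidents))  # {('stale ACL', 'logging'): [1, 2]}
```

An LLM-based variant would replace the exact-match signature with a semantic similarity judgment over incident write-ups, at the cost of needing the regression alerting described above to catch drift.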

Join us as we share our experiences and insights on tackling systemic privacy incident clusters and improving incident management processes.

Authors: Sam Havron, Meta (Speaker); David Huang, Meta (Not Speaking)

https://www.usenix.org/conference/pepr25/presentation/havron
Speakers

Sam Havron

Meta
Sam Havron is a Privacy Engineer at Meta, with a focus on developing workflows to scale incident investigation and review. Sam has an M.S. in Computer Science from Cornell University, and a B.S. in Computer Science from the University of Virginia.
Santa Clara Ballroom

9:55am PDT

Observable...Yet Still Private? An Offensive Privacy Perspective on Observability
Monday June 9, 2025 9:55am - 10:15am PDT
Cat Easdon, Dynatrace Research; Patrick Berchtold, Dynatrace


Observability platforms provide development and operations teams with insights into their distributed systems, typically combining logs, metrics, and traces with additional telemetry for use cases such as runtime security monitoring and understanding user behavior. While this data is tremendously useful for troubleshooting and product development, it also poses privacy challenges. In this session, we'll consider these challenges through an offensive privacy lens, presenting our research conducting reconstruction attacks against aggregated user session data. We'll explore how offensive privacy research can be used to support the business case for a new product privacy feature, discuss the unique aspects of privacy threat modeling in a business-to-business (B2B) setting, and consider runtime mitigations to halt reconstruction attacks earlier in the 'privacy kill chain'.
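As a toy illustration of the attack class the speakers study, two aggregate queries over session data that differ by a single user can be differenced to recover that user's exact value; the data and field meanings here are invented, and real reconstruction attacks on aggregated sessions are far more sophisticated.

```python
# Toy session data: pages viewed per user (invented values).
sessions = {"alice": 7, "bob": 3, "carol": 12}

# Two seemingly harmless aggregates released by an analytics platform:
total_all = sum(sessions.values())
total_without_carol = sum(v for k, v in sessions.items() if k != "carol")

# Their difference reveals carol's individual value exactly.
carol_pages = total_all - total_without_carol
print(carol_pages)  # 12
```

Runtime mitigations of the kind mentioned above aim to detect such overlapping-query patterns before enough releases accumulate to complete the 'privacy kill chain'.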


https://www.usenix.org/conference/pepr25/presentation/easdon
Speakers

Cat Easdon

Dynatrace Research
Cat Easdon is an engineer and researcher working at the intersection of privacy, security, and policy. She leads Dynatrace's privacy engineering team, designing product privacy features and building privacy controls into the software development lifecycle.
Patrick Berchtold

Dynatrace
Patrick Berchtold is a software engineer at Dynatrace and a student at TU Graz, researching reconstruction attacks at ISEC in collaboration with Dynatrace. His thesis focuses on applying reconstruction attacks in industry scenarios, exploring their risks and implications...
Santa Clara Ballroom

10:15am PDT

Using Privacy Infrastructure to Kickstart AI Governance: NIST AI Risk Management Case Studies
Monday June 9, 2025 10:15am - 10:30am PDT
Katharina Koerner, Trace3; Nandita Rao Narla, DoorDash


The NIST AI Risk Management Framework has emerged as a popular choice among US-based organizations aiming to build responsible AI governance programs. However, real-world adoption of this very comprehensive framework is both challenging and onerous—often falling on privacy engineers who are voluntold to lead AI governance efforts. This presentation will explore key lessons learned from implementing the NIST AI RMF across different industries, highlighting how existing privacy infrastructure, policies, and other governance frameworks can serve as a foundation for AI risk management and compliance. We will also uncover common pitfalls and present a lightweight approach to jumpstart adoption of the framework.


https://www.usenix.org/conference/pepr25/presentation/koerner
Speakers

Katharina Koerner

Trace3
Katharina is a seasoned expert in AI governance, tech policy, privacy, and security, with a background spanning law, public policy, and emerging technologies. She is currently a Senior Principal Consultant - AI Governance and Risk at Trace3, a leading technology consulting firm specializing...

Nandita Rao Narla

DoorDash
Nandita Rao Narla is the Head of Technical Privacy and Governance at DoorDash. Previously, she was a founding team member of a data profiling startup and held various leadership roles at EY, where she helped Fortune 500 companies build and mature privacy, cybersecurity, and data governance...
Santa Clara Ballroom

10:30am PDT

Coffee and Tea Break
Monday June 9, 2025 10:30am - 11:00am PDT
Mezzanine East/West

11:00am PDT

UsersFirst: A User-Centric Threat Modeling Framework for Privacy Notice and Choice
Monday June 9, 2025 11:00am - 11:20am PDT
Norman Sadeh and Lorrie Cranor, Carnegie Mellon University


Recent privacy regulations impose increasingly stringent requirements on the collection and use of data. This includes more specific obligations to disclose various data practices and the need to provide data subjects with more comprehensive sets of choices or controls. There is also an increasing emphasis on user-centric criteria. Failure to offer usable notices and choices that people can truly benefit from has become a significant privacy threat, whether one thinks in terms of potential regulatory penalties, consumer trust and brand reputation, or privacy-by-design best practices. This presentation will provide an overview of UsersFirst, a Privacy Threat Modeling framework intended to supplement existing privacy threat modeling frameworks and to support organizations in their analysis and mitigation of risks associated with the absence or ineffectiveness of privacy notices and choices. Rather than treating privacy notices and choices as mere checkboxes, UsersFirst revolves around user-centric interpretations of these requirements. It is intended to reflect an emerging trend in privacy regulations where perfunctory approaches to notices and choices are no longer sufficient, and where instead notices and choices are expected to be noticeable, usable, unambiguous, devoid of deceptive patterns, and more. The presentation will include results of a detailed evaluation of the UsersFirst user-centric threat taxonomy with people working and/or trained in privacy.


https://www.usenix.org/conference/pepr25/presentation/sadeh
Speakers
Norman Sadeh

Carnegie Mellon University
Norman Sadeh is a Professor in the School of Computer Science at Carnegie Mellon University (CMU), where he co-founded and co-directs the Privacy Engineering Program. Norman served as lead principal investigator on two of the largest domestic research projects in privacy, the Usable Privacy...
Lorrie Cranor

Carnegie Mellon University
Lorrie Faith Cranor is the Director and Bosch Distinguished Professor in Security and Privacy Technologies of CyLab and the FORE Systems University Professor of Computer Science and of Engineering and Public Policy at Carnegie Mellon University. She directs the CyLab Usable Privacy...
Santa Clara Ballroom

11:20am PDT

Enterprise-Scale Privacy for AI: How Canva Scaled Customer Control of Data for AI Training
Monday June 9, 2025 11:20am - 11:40am PDT
Phillip Ward, Canva


Canva's mission is to empower the world to design. A major challenge to that mission is securing the data required to build the AI-powered tools that modern professionals love. This applies not just to the newest and fanciest generative-AI tools: even the humble background remover and template library search require data that represents our user community in order to perform at the level our users expect. To create the best experience for users, this data must be as unique and diverse as our community, and as we scale, data from our growing community is essential to building a better product. However, no one can do their best creative work if they do not feel safe and empowered. We want our users to experience high-quality protection for their personal information and personal creations, so every use of their content must be carefully considered. In this talk, I will outline the AI consent platform that we have built. I will share how Canva built an end-to-end ecosystem to simultaneously empower users to control their data and power the next generation of AI tools. This ecosystem spans from the user experience of providing consent, to the controls and platforms that ensure our 100+ models respect user consent every day.
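At its simplest, a consent gate of the kind described filters every candidate training record against the user's stored preference before it reaches a model; this sketch is an assumption about the shape of such a check, not Canva's implementation, and the consent store and field names are invented.

```python
# Hypothetical consent store: user id -> opted in to AI training?
consents = {"u1": True, "u2": False}

def training_eligible(records, consent_store):
    """Keep only records whose owner has affirmatively consented.

    Missing users default to excluded (consent must be explicit).
    """
    return [r for r in records if consent_store.get(r["user"], False)]

records = [
    {"user": "u1", "design": "d1"},
    {"user": "u2", "design": "d2"},
    {"user": "u3", "design": "d3"},  # never asked -> excluded
]
print(training_eligible(records, consents))  # only u1's record survives
```

The hard engineering problem the talk addresses is enforcing this check consistently across 100+ models and pipelines, not the check itself.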


https://www.usenix.org/conference/pepr25/presentation/ward
Speakers
Phillip Ward

Canva
Dr. Phillip Ward is the Lead of Privacy Engineering at Canva, specializing in privacy technology with over a decade of experience in software engineering, data science, and computer science. He leads a team focused on creating privacy-enabling infrastructure for the rapidly growing...
Santa Clara Ballroom

11:40am PDT

The "Privacy" Offered by PETs and the "Privacy" That Users Want. Why So Different?
Monday June 9, 2025 11:40am - 12:25pm PDT
Don Marti


The "privacy" offered by "privacy-enhancing technologies" (PETs) on the web is remarkably different from the privacy that users want and expect. People seek out privacy to avoid real-world privacy harms such as fraud and algorithmic discrimination, and PETs, focused on more narrow mathematical goals, can actually make the real privacy problems worse and harder to detect. Can today's PETs be fixed, or should the web move to more productive alternatives?


https://www.usenix.org/conference/pepr25/presentation/marti
Speakers
Don Marti

Don Marti is VP of Ecosystem Innovation at Raptive (the company that used to be CafeMedia), and a former strategist at Mozilla and former editor of Linux Journal. He works on web ecosystem and business issues including collaborative research on the impact of advances in consent management...
Santa Clara Ballroom

12:25pm PDT

Conference Luncheon
Monday June 9, 2025 12:25pm - 2:00pm PDT
Terra Courtyard

2:00pm PDT

My $5MM Differential Privacy Visualizations
Monday June 9, 2025 2:00pm - 2:15pm PDT
Marc-Antoine Paré


Let's face it: explaining that "differential privacy is like blurring an image" doesn't get very far in communicating how and why this technology should be used. This talk breaks down data visualizations used over the course of a three-year differential privacy project for the Department of Energy that were used to unlock $5MM in funding from grant administrators, convince regulators of the effectiveness of privacy protection, and generate excitement from industry partners for adoption. While past work in this sector failed to get past academic discussions, this project culminated in two large-scale data releases, subject to a strong differential privacy guarantee (ε=4.72 and δ=5.06⋅10^−9). Practitioners will walk away with ideas and inspiration for bridging the variety of communication gaps found in real-life privacy projects.
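For context on the guarantee quoted above, differentially private releases are typically produced by adding calibrated noise to true statistics; the generic Laplace-mechanism sketch below (with an invented count) conveys the mechanics, though it is not the project's actual code and a (ε, δ) guarantee with δ > 0 usually involves a Gaussian or similar mechanism.

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) via the inverse-CDF transform."""
    u = random.random() - 0.5
    sign = 1 if u >= 0 else -1
    return -scale * sign * math.log(1 - 2 * abs(u))

def dp_count(true_count, epsilon, sensitivity=1.0):
    """Release a count with epsilon-DP using the Laplace mechanism.

    `sensitivity` is how much one individual can change the count.
    """
    return true_count + laplace_noise(sensitivity / epsilon)

random.seed(0)  # reproducibility for the sketch only
print(dp_count(1000, epsilon=4.72))  # close to 1000: noise scale ~0.21 at this epsilon
```

Visualizing the noise distribution against the true statistic, rather than saying "it's like blurring", is exactly the communication gap the talk targets.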


https://www.usenix.org/conference/pepr25/presentation/pare
Speakers

Marc-Antoine Paré

Marc-Antoine Paré was most recently a staff software engineer leading Cruise's Privacy Infrastructure team. Previously, he was the technical lead for the Department of Energy's "Energy Data Vault", which brought differential privacy to the energy efficiency sector.
Santa Clara Ballroom

2:15pm PDT

Establishing Privacy Metrics for Genomic Data Analysis
Monday June 9, 2025 2:15pm - 2:35pm PDT
Curtis Mitchell, xD, United States Census Bureau


The ability to work with genomic datasets across institutions is a promising approach to understanding and treating diseases such as rare cancers. However, the sharing of genomic data raises challenging legal and ethical concerns around patient privacy. In this talk we will present ongoing work between the National Institute of Standards and Technology (NIST), the US Census Bureau, and other organizations to explore metrics and use cases for privacy-preserving machine learning on genomic data. We will discuss the goals of the project, its technical architecture based on privacy-preserving federated learning, and the initial performance and privacy metrics we have obtained using plant genomic data as a stand-in for human genomic data.
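Federated learning, as in the architecture described, keeps raw genomes at each site and shares only model parameters; a minimal FedAvg-style aggregation (with invented weights and site sizes, not the project's models) looks like:

```python
def fed_avg(client_weights, client_sizes):
    """Size-weighted average of per-site model parameters (FedAvg).

    Each client trains locally on its own genomic data and ships only
    a parameter vector; the coordinator never sees raw records.
    """
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(dim)
    ]

# Two institutions with 10 and 30 samples respectively (invented numbers).
global_model = fed_avg([[1.0, 2.0], [3.0, 4.0]], [10, 30])
print(global_model)  # [2.5, 3.5]
```

Parameter sharing alone is not a privacy guarantee, which is why the project pairs the architecture with explicit privacy metrics.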


Additional authors: Gary Howarth and Justin Wagner, NIST; Jess Stahl, Census; Christine Task and Karan Bhagat, Knexus; Amy Hilla and Rebecca Steinberg, MITRE


https://www.usenix.org/conference/pepr25/presentation/mitchell
Speakers

Curtis Mitchell

xD, United States Census Bureau
Curtis Mitchell is an Emerging Technology Fellow on the xD team at the US Census Bureau where he is contributing to a variety of projects involving privacy-enhancing technologies, artificial intelligence, and modern web applications. He has over 15 years of experience in software...
Santa Clara Ballroom

2:35pm PDT

Practical Considerations for Differential Privacy
Monday June 9, 2025 2:35pm - 2:55pm PDT
Alex Kulesza


What happens when the philosophical aspirations of differential privacy collide with practical reality? Reflecting on seven years of experience building and deploying differential privacy systems at Google, I will describe in this talk some of the ways in which a focus on worst-case outcomes both enables and discourages an honest accounting of privacy risk.


https://www.usenix.org/conference/pepr25/presentation/kulesza
Speakers

Alex Kulesza

Alex Kulesza is a research scientist at Google NYC.
Santa Clara Ballroom

2:55pm PDT

Unlocking Cross-Organizational Insights: Practical MPC for Cloud-Based Data Analytics
Monday June 9, 2025 2:55pm - 3:15pm PDT
Daniele Romanini, Resolve


In today's data-driven landscape, organizations often seek collaborative analytics to gain cross-organizational insights while upholding stringent privacy standards. This talk introduces a practical approach to adopting a Secure Multi-Party Computation (MPC) system for cloud-based data analytics. Leveraging open-source frameworks such as Carbyne Stack and MP-SPDZ, we have developed features that enable developers who are not cryptography or MPC experts to perform private analytics using intuitive, Python-like code. In this talk, we focus on the practical features that a real-world MPC solution should have, presenting lessons learned and the key modifications to an existing framework needed to reach a stable deployment.

We explain how we enhanced the usability and functionality of the existing framework (Carbyne Stack with MP-SPDZ), for example by implementing support for a semi-honest security model, which is less expensive and more practical than a malicious one in some real-world settings. We also address practical considerations of cost and performance, presenting strategies to optimize infrastructure deployment and algorithm-level enhancements to reduce costs and enable complex analytics. Moreover, we illustrate a practical example of how the platform can be leveraged in the AdTech world. This presentation aims to demonstrate that secure and efficient cross-organizational data analytics are achievable, even for developers without specialized MPC expertise.
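Under the hood, MPC frameworks like MP-SPDZ build on secret sharing; the additive-secret-sharing sketch below (plain Python, not Carbyne Stack or MP-SPDZ code, and secure only against semi-honest parties) conveys the core idea of computing a joint sum without any party seeing another's input.

```python
import random

PRIME = 2**61 - 1  # field modulus; an illustrative choice, not a protocol requirement

def share(secret, n=3):
    """Split a secret into n additive shares modulo PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    return sum(shares) % PRIME

# Two organizations each share an input; parties add shares locally,
# and only the combined result is ever reconstructed.
a, b = 42, 58
sa, sb = share(a), share(b)
sums = [(x + y) % PRIME for x, y in zip(sa, sb)]
print(reconstruct(sums))  # 100, without either input being revealed
```

Multiplication, comparisons, and malicious security are where real frameworks earn their keep; addition alone works share-wise as shown.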

Authors: Adrián Vaca Humanes, Gerardo González Seco, Daniele Romanini, Goran Stipcich


https://www.usenix.org/conference/pepr25/presentation/romanini-unlocking
Speakers
Daniele Romanini

Resolve
Daniele Romanini is a Senior Privacy Engineer at Resolve, with expertise in both data science and software engineering. His background includes experience in academia, government organizations, and the AdTech industry. Daniele is an advocate for privacy-by-design and a privacy tech...
Santa Clara Ballroom

2:55pm PDT

Coffee and Tea Break
Monday June 9, 2025 2:55pm - 3:45pm PDT
Mezzanine East/West

3:45pm PDT

Building an End-to-End De-Identification Pipeline for Advertising Activity Data at LinkedIn
Monday June 9, 2025 3:45pm - 4:05pm PDT
Saikrishna Badrinarayanan and Chris Harris, LinkedIn


Advertising platforms rely heavily on activity data to measure and optimize ads performance. Under current privacy regulations and platform requirements, LinkedIn is held to increasingly rigorous standards in the handling of our members' personal data. This is especially acute for our ads business, as we adhere to strict regulations that necessitate stringent measures when handling user data, including data minimization, which is expanding into a near-global requirement. These regulations continue to evolve, requiring constant adaptation to new standards, while our data pipelines were originally established at a time when the use of personal data was less regulated.

Motivated by the principle of building privacy by design, we undertook a comprehensive project involving numerous stakeholders to address these challenges, and built an end-to-end robust pipeline that de-identifies advertising activity data. The goal of this project was to ensure that user information is protected while still enabling processing on this de-identified data to generate valuable analytics and enable advertisers to learn the effectiveness of their ad spend. We have onboarded products such as performance reporting and billing as the hero use-cases onto this pipeline. This talk will cover the design, implementation and innovative aspects of this pipeline. We will discuss the various privacy enhancing technologies we applied, our system architecture, challenges faced such as scalability (to process billions of events a day) and balancing privacy with the needs of the business. Finally, we will also highlight the outcomes and practical insights gained from this project.
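One common building block in de-identification pipelines of this kind is keyed pseudonymization, which preserves join-ability within a dataset without exposing raw identifiers; this sketch is a generic illustration under that assumption, not LinkedIn's actual design or key management.

```python
import hashlib
import hmac
import os

# In practice the key lives in a managed secret store and is rotated;
# here it is generated fresh for the sketch.
KEY = os.urandom(32)

def pseudonymize(member_id: str) -> str:
    """Map an identifier to a stable, non-reversible pseudonym.

    HMAC (rather than a bare hash) prevents dictionary attacks by
    anyone who does not hold the key.
    """
    return hmac.new(KEY, member_id.encode(), hashlib.sha256).hexdigest()

assert pseudonymize("m123") == pseudonymize("m123")  # stable join key
assert pseudonymize("m123") != pseudonymize("m124")  # distinct members stay distinct
```

Pseudonymization alone is rarely sufficient at billions of events per day, which is why the talk pairs it with other privacy-enhancing technologies and aggregation.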


https://www.usenix.org/conference/pepr25/presentation/badrinarayanan
Speakers
Saikrishna Badrinarayanan

LinkedIn
Saikrishna Badrinarayanan is a Staff Privacy Engineer at LinkedIn. He has spent the last two years building privacy-preserving systems for problems in ads measurement and responsible AI. Before LinkedIn, he worked on privacy/security teams at Snap and Visa. He is a cryptographer by...
Chris Harris

LinkedIn
Chris Harris is a Senior Staff Engineer at LinkedIn, where they have spent the past nine years working on ads measurement, privacy, and data governance. Passionate about hands-on coding and system performance optimization, they focus on building scalable, privacy-conscious solutions...
Santa Clara Ballroom

4:05pm PDT

Network Structure and Privacy: The Re-Identification Risk in Graph Data
Monday June 9, 2025 4:05pm - 4:20pm PDT
Daniele Romanini, Resolve


In graph data, particularly those representing human connections, the structure of relationships can inadvertently expose individuals to privacy risks. Recent research indicates that even when traditional anonymization techniques are applied, the unique patterns within a user's local network—referred to as their "neighborhood"—can be exploited for re-identification. This talk delves into the complexities of anonymizing graph data, emphasizing that connections themselves serve as distinctive features that can compromise user privacy.
This talk examines the relationship between a network's average degree (i.e., the average number of connections per node) and the risk of uniquely identifying a node based solely on the network's structure. We discuss how understanding these risks can inform the design of privacy-aware data collection and anonymization methods, ensuring that the benefits of data sharing are balanced with the imperative to protect individual privacy.
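The structural-uniqueness risk can be made concrete with a toy metric: the fraction of nodes whose neighbor-degree sequence is unique in the graph. Real neighborhood attacks use richer features than this, and the four-node graph is invented, but the sketch shows how connections alone act as a fingerprint.

```python
from collections import Counter

def neighborhood_signature(adj, node):
    """Sorted degree sequence of a node's neighbors: a simple
    structural fingerprint that survives removal of identifiers."""
    return tuple(sorted(len(adj[nbr]) for nbr in adj[node]))

def unique_fraction(adj):
    """Fraction of nodes re-identifiable by their signature alone."""
    sigs = Counter(neighborhood_signature(adj, v) for v in adj)
    unique = sum(1 for v in adj if sigs[neighborhood_signature(adj, v)] == 1)
    return unique / len(adj)

# Tiny anonymized graph: node labels carry no identity, yet structure leaks it.
adj = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1}, 3: {1}}
print(unique_fraction(adj))  # 0.5 -> half the nodes are structurally unique
```

As average degree grows, signatures like this become more distinctive, which is the relationship the talk quantifies.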

Authors: Daniele Romanini and Sune Lehmann, Technical University of Denmark; Mikko Kivelä, Aalto University


https://www.usenix.org/conference/pepr25/presentation/romanini-network
Speakers
Daniele Romanini

Resolve
Daniele Romanini is a Senior Privacy Engineer at Resolve, with expertise in both data science and software engineering. His background includes experience in academia, government organizations, and the AdTech industry. Daniele is an advocate for privacy-by-design and a privacy tech...
Santa Clara Ballroom

4:20pm PDT

Data Classification at Scale: Taming the Hydra
Monday June 9, 2025 4:20pm - 4:40pm PDT
Daniel Gagne, Meta


This talk goes into detail about the data classification processes at Meta, where we assign metadata about the semantics, actor, and other attributes of data. We start by defining a taxonomy to support categorization based on the nature of the data and on regulatory requirements, which is then used to ensure appropriate data usage. This supports a wide variety of privacy policies such as access control, deletion, and purpose limitation. We then take a bytes-up approach to scan data, extract features, and infer labels from the taxonomy. We also detail challenges with different data storage patterns, classification approaches, and quality measurement.
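A bytes-up classifier can be sketched as pattern-based label inference over sampled values; the taxonomy labels, regexes, and match threshold here are illustrative assumptions, far simpler than the feature extraction and ML inference a production system like Meta's uses.

```python
import re

# Hypothetical taxonomy labels mapped to value-level patterns.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?\d[\d\s-]{7,}\d"),
}

def classify_column(values, threshold=0.8):
    """Infer semantic labels when enough sampled values match a pattern.

    Sampling plus a threshold tolerates dirty rows without
    mislabeling a whole column on a few stray matches.
    """
    labels = []
    for label, pattern in PATTERNS.items():
        hits = sum(1 for v in values if pattern.fullmatch(v.strip()))
        if hits / len(values) >= threshold:
            labels.append(label)
    return labels

sample = ["a@example.com", "b@example.org", "c@example.net"]
print(classify_column(sample))  # ['EMAIL']
```

Labels inferred this way then feed the downstream policies named above (access control, deletion, purpose limitation), which is why measuring classification quality matters so much.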

Additional Author: Giuseppe M. Mazzeo


https://www.usenix.org/conference/pepr25/presentation/gagne
Speakers

Daniel Gagne

Meta
Danny Gagne is a Software Engineer on the Privacy Infrastructure team at Meta. He holds a B.S. in Computer Science from Northeastern University. He has worked on large scale data classification at the MITRE Corporation and at the International Atomic Energy Agency.
Santa Clara Ballroom

4:40pm PDT

Harnessing LLMs for Scalable Data Minimization
Monday June 9, 2025 4:40pm - 5:00pm PDT
Charles de Bourcy, OpenAI


This talk explores how Large Language Models can enhance Data Minimization practices compared to traditional methods. Advanced contextual understanding can accelerate data classification across an organization's storage locations, improve de-identification of text corpora, and streamline internal governance mechanics. The talk will propose architectures for combining LLM-based tools of various kinds with other techniques like lineage tracing to facilitate proactive data minimization and prevent data sprawl.
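An LLM-backed classification step of the kind proposed might be wired up as below; `complete` is a placeholder for whatever chat-completion client is in use (no specific vendor API is assumed), the category set is invented, and the stub model exists only to make the flow runnable.

```python
def classify_field(field_name, sample_values, complete):
    """Ask an LLM to assign a data-minimization category to a column.

    `complete` is any callable taking a prompt string and returning
    the model's text response (a deliberate abstraction over real APIs).
    """
    prompt = (
        "Label the data category of this column for minimization review.\n"
        f"name: {field_name}\n"
        f"samples: {sample_values}\n"
        "Answer with exactly one of: IDENTIFIER, CONTACT, CONTENT, NONE."
    )
    return complete(prompt).strip()

# A stub model makes the flow testable without any real LLM call.
stub = lambda p: "CONTACT" if "email" in p else "NONE"
print(classify_field("email_address", ["a@example.com"], stub))  # CONTACT
```

The contextual-understanding advantage over regex scanners shows up on fields like free-text notes, where no fixed pattern exists; combining such labels with lineage tracing is what enables proactive deletion rather than one-off audits.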


https://www.usenix.org/conference/pepr25/presentation/bourcy
Speakers

Charles de Bourcy

OpenAI
Charles de Bourcy is a Member of Technical Staff at OpenAI. He enjoys exploring new ways to improve privacy protections. He received his PhD from Stanford University.
Santa Clara Ballroom

5:00pm PDT

Conference Reception
Monday June 9, 2025 5:00pm - 6:30pm PDT
Terra Courtyard

6:30pm PDT

Birds-of-a-Feather Sessions (BoFs)
Monday June 9, 2025 6:30pm - 9:30pm PDT
Alameda Room

6:30pm PDT

Birds-of-a-Feather Sessions (BoFs)
Monday June 9, 2025 6:30pm - 9:30pm PDT
Camino Real Room
 