The Implications of Privacy-Aware Choice

Rachel Cummings - PhD Candidate, Computing and Mathematical Sciences, California Institute of Technology

Jan. 18, 2017, 10 a.m. - Jan. 18, 2017, 11 a.m.

McConnell Engineering Building, Room 437


ABSTRACT:

 

Privacy concerns are becoming a major obstacle to using data in the way that we want. It is often unclear how current regulations should translate into technology, and the changing legal landscape surrounding privacy can cause valuable data to go unused. In addition, when people know that their current choices may have future consequences, they might modify their behavior to ensure that their data reveal less, or perhaps more favorable, information about themselves. Given these concerns, how can we continue to make use of potentially sensitive data, while providing satisfactory privacy guarantees to the people whose data we are using? Answering this question requires an understanding of how people reason about their privacy and how privacy concerns affect behavior.

 

In this talk, we will see how strategic and human aspects of privacy interact with existing tools for data collection and analysis. I will begin by adapting the standard model of consumer choice theory to a setting where consumers are aware of, and have preferences over, the information revealed by their choices. In this model of privacy-aware choice, I will show that little can be inferred about a consumer's preferences once we introduce the possibility that she has concerns about privacy, even when her preferences are assumed to satisfy relatively strong structural properties. Next, I will analyze how privacy technologies affect behavior in a simple economic model of data-driven decision making. Intuition suggests that strengthening privacy protections will both increase utility for the individuals providing data and decrease usefulness of the computation. I will demonstrate that this intuition can fail when strategic concerns affect behavior. Finally, I'll discuss ongoing behavioral experiments, designed to empirically measure how people trade off privacy for money, and to test whether human behavior is consistent with theoretical models for the value of privacy.
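The talk does not specify which privacy technologies are analyzed, but a canonical formal tool in this literature is differential privacy, where calibrated noise is added to a computation so that any single person's data has only a bounded effect on the output. As a hedged illustration (the function names and parameters below are assumptions for this sketch, not the speaker's model), here is a minimal differentially private mean using the Laplace mechanism:

```python
import math
import random

def laplace_noise(scale):
    # Sample from a Laplace(0, scale) distribution via inverse-CDF sampling.
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)

def private_mean(values, epsilon, lower, upper):
    """Sketch of an epsilon-differentially private mean.

    Each value is clipped to [lower, upper], so one person's data can
    change the mean by at most (upper - lower) / n; Laplace noise is
    calibrated to that sensitivity.
    """
    n = len(values)
    clipped = [min(max(v, lower), upper) for v in values]
    true_mean = sum(clipped) / n
    sensitivity = (upper - lower) / n
    return true_mean + laplace_noise(sensitivity / epsilon)
```

The tension the talk alludes to is visible in the `epsilon` parameter: a smaller `epsilon` gives a stronger privacy guarantee but injects more noise, reducing the usefulness of the computation; the talk's second result shows this intuitive trade-off can break down once data providers respond strategically to the privacy protection itself.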

Rachel Cummings is a Ph.D. candidate in Computing and Mathematical Sciences at the California Institute of Technology. Her research interests lie primarily in data privacy, with connections to machine learning, optimization, economics, decision-making, and information systems. Her work has focused on problems such as strategic aspects of data generation, incentivizing truthful reporting of data, privacy-preserving algorithm design, impacts of privacy policy, and human decision-making. She received her B.A. in Mathematics and Economics from the University of Southern California and her M.S. in Computer Science from Northwestern University. She won the Best Paper Award at the 2014 International Symposium on Distributed Computing, and she is the recipient of a Simons Award for Graduate Students in Theoretical Computer Science.