Super excited to share that the paper “Investigating the Impact of Differential Privacy Obfuscation on Users’ Data Disclosure Decisions” by Michael Khavkin and Eran Toch was just accepted to Decision Support Systems.
As researchers focused on privacy and data ethics, we wanted to explore how people make decisions about sharing their personal data when it's protected by Differential Privacy (DP), a leading method for providing individuals with formal privacy guarantees. While much technical work has gone into defining and implementing DP, there's still little understanding of how real users perceive it, especially in contexts where they're asked to trade data for money, like in data markets.
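For readers unfamiliar with how DP provides those guarantees: protection is typically achieved by adding calibrated random noise to computed results, with a parameter ε controlling the privacy level (smaller ε means more noise and stronger privacy). A minimal sketch of the standard Laplace mechanism follows; it is a generic illustration, not the mechanism or parameters used in the paper:

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Return a differentially private estimate of true_value.

    sensitivity: how much one person's data can change the result.
    epsilon: privacy budget; smaller epsilon -> more noise -> stronger privacy.
    """
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) noise by inverse-transform from a uniform draw.
    u = random.random() - 0.5
    noise = -scale * (1.0 if u >= 0 else -1.0) * math.log(1.0 - 2.0 * abs(u))
    return true_value + noise
```

For example, releasing an average survey answer of 100 with `epsilon = 0.5` yields a noisy value near 100; lowering ε spreads the noise further, which is the privacy/accuracy trade-off participants in such studies implicitly face.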
What We Set Out to Do
We designed a large study (588 participants) carried out in the United States (US), the United Kingdom (UK), and India, simulating realistic data-sharing scenarios. Each participant had to choose between different offers to share their personal data, where:
- The payment amount varied
- The level of privacy protection (via DP) changed
- The type of data being collected differed
We also followed up with a second, smaller study (146 participants), in which people made real disclosure decisions and received actual payments, using the same experimental design.
What We Discovered
- Privacy protection levels were the strongest influence on people's choices after monetary rewards. The better the privacy, the less compensation people demanded to share their data. These results were consistent across the three countries.
- Increasing the level of DP protection by just one unit could reduce the required compensation by over 60%, though the benefit of each additional unit decreased at higher levels (a diminishing-returns effect).
- The same effect held when participants were offered real money in exchange for their data; however, they were willing to accept smaller payments for the same level of privacy protection.
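To see how a roughly constant per-unit percentage reduction produces diminishing absolute returns, consider a purely illustrative calculation. The $100 starting price and the simple multiplicative model are assumptions for the example, not the paper's fitted model; only the "over 60% per unit" figure comes from our results:

```python
def required_compensation(base_price, reduction_per_unit, units):
    """Compensation demanded after `units` steps of added DP protection,
    assuming each unit cuts the required payment by a fixed fraction.
    This is an illustrative model, not the estimator from the paper."""
    return base_price * (1.0 - reduction_per_unit) ** units

# Hypothetical $100 base price; 60% reduction per unit of protection.
prices = [required_compensation(100.0, 0.60, u) for u in range(4)]
# prices: $100.00 -> $40.00 -> $16.00 -> $6.40

# The absolute savings from each extra unit shrink, even though the
# relative reduction stays constant -- the diminishing-returns pattern.
drops = [a - b for a, b in zip(prices, prices[1:])]
# drops: $60.00, $24.00, $9.60
```

The first unit of protection saves $60, the second only $24, the third under $10, mirroring the pattern we observed at higher protection levels.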
Why It Matters
Our findings suggest that clear communication of DP levels can have a real impact on users' willingness to share data and can help balance privacy protection with compensation costs. Our findings also show that conjoint analysis is a useful tool for exploring privacy preferences at scale, even if it inflates how much people say they need to be paid. These insights can guide companies, policymakers, and developers in configuring DP to respect privacy while also managing financial sustainability in data-driven systems.