
Track: Usability

What: Talk
When: 12:40 PM, Thursday 16 Jul 2020 EDT (1 hour 15 minutes)



Session Chair: Florian Schaub


Privacy at a Glance: The User-Centric Design of Data Exposure Visualizations for an Awareness-Raising Screensaver
Daricia Wilkinson (Clemson University), Paritosh Bahirat (Clemson University), Moses Namara (Clemson University), Jing Lyu (Clemson University), Arwa Alsubhi (Clemson University), Jessica Qiu (Clemson University), Pamela J. Wisniewski (University of Central Florida), and Bart Knijnenburg (Clemson University)


Pre-recorded presentation


Summary: Smartphone users are often unaware of mobile applications’ (“apps”) third-party data collection and sharing practices, which put them at higher risk of privacy breaches. One way to raise awareness of these practices is by providing unobtrusive but pervasive visualizations that can be presented in a glanceable manner. In this paper, we applied Wogalter et al.’s Communication-Human Information Processing model (C-HIP) to design and prototype eight different visualizations that depict smartphone apps’ data sharing activities. We varied the granularity and type (i.e., data-centric or app-centric) of information shown to users and used the screensaver/lock screen as a design probe. Through interview-based design probes with Android users (n=15), we investigated the aspects of the data exposure visualizations that influenced users’ comprehension and privacy awareness. Our results shed light on how users’ perceptions of privacy boundaries influence their preferences regarding the information structure of these visualizations, and the tensions that exist in these visualizations between glanceability and granularity. We discuss how a pervasive, soft paternalistic approach to privacy-related visualization may raise awareness by enhancing the transparency of information flow, thereby unobtrusively increasing users’ understanding of the data sharing practices of mobile apps. We also discuss implications for privacy research and glanceable security.


Multiple Purposes, Multiple Problems: A User Study of Consent Dialogs after GDPR
Dominique Machuletz (University of Münster) and Rainer Böhme (University of Innsbruck)
Pre-recorded presentation

Summary: The European Union's General Data Protection Regulation (GDPR) requires websites to ask for consent to the use of cookies for specific purposes. This enlarges the relevant design space for consent dialogs. Websites could try to maximize click-through rates and positive consent decisions, even at the risk of users agreeing to more purposes than intended. We evaluate a practice observed on popular websites by conducting an experiment with one control and two treatment groups (N=150 university students in two countries). We hypothesize that users' consent decisions are influenced by (1) the number of options, connecting to the theory of choice proliferation, and (2) the presence of a highlighted default button ("select all"), connecting to theories of social norms and deception in consumer research. The results show that participants who see a default button accept cookies for more purposes than the control group, while being less able to correctly recall their choice. After being reminded of their choice, they regret it more often and perceive the consent dialog as more deceptive than the control group. Whether users are presented one or three purposes has no significant effect on their decisions and perceptions. We discuss the results and outline policy implications.


The Best of Both Worlds: Mitigating Trade-offs Between Accuracy and User Burden in Capturing Mobile App Privacy Preferences
Daniel Smullen (Carnegie Mellon University), Yuanyuan Feng (Carnegie Mellon University), Shikun (Aerin) Zhang (Carnegie Mellon University), and Norman Sadeh (Carnegie Mellon University)
Pre-recorded presentation

Summary: In today’s data-centric economy, data flows are increasingly diverse and complex. This is best exemplified by mobile apps, which are given access to an increasing number of sensitive APIs. Mobile operating systems have attempted to balance the introduction of sensitive APIs with a growing collection of permission settings, which users can grant or deny. The challenge is that the number of settings has become unmanageable. Yet research also shows that existing settings continue to fall short when it comes to accurately capturing people’s privacy preferences. An example is the inability to control mobile app permissions based on the purpose for which an app is requesting access to sensitive data. In short, while users are already overwhelmed, accurately capturing their privacy preferences would require the introduction of an even greater number of settings. A promising approach to mitigating this trade-off lies in using machine learning to generate setting recommendations or bundle some settings. This article is the first of its kind to offer a quantitative assessment of how machine learning can help mitigate this trade-off, focusing on mobile app permissions. Results suggest that it is indeed possible to more accurately capture people’s privacy preferences while also reducing user burden.
