SOUPS 2019 Roundup
SOUPS 2019, the 15th Symposium on Usable Privacy and Security, was held in Santa Clara, CA, alongside USENIX Security, PEPR, FOCI, and a few other events. For those not familiar, these are conferences hosted by USENIX, the Advanced Computing Systems Association, to support the advanced computing systems communities and further the reach of innovative research. Sessions range from deep technical research to industry presentations to usability research and attitudinal surveys. A few members of the Simply Secure community were in attendance, so we wanted to pull together some quick takeaways, both for other folks who attended and for people who weren’t able to come.
All of the SOUPS 2019 papers and presentations are available online, as are the materials from the other USENIX events.
Workshop: Designing for Extremes of Risk
The USABLE team at Internews (https://usable.tools/) facilitated a half-day workshop at SOUPS. The workshop, entitled “Designing for Extremes of Risk,” was an interactive exploration of what it means to work with at-risk communities, particularly in the context of designing and developing privacy and security tools. The nearly 25 attendees represented a diverse range of professions, from academics to UX researchers to product teams and engineers.
Main takeaways from the workshop included:
- From a product perspective, it can be difficult to identify at-risk communities, connect with them, and build the level of trust needed to collect accurate feedback. Using trusted intermediaries, such as digital security trainers, can provide tool teams with access to relevant feedback without jeopardizing the safety of at-risk communities.
- Tool teams can consider offering incentives (such as small stipends) for individuals to participate in user research. If offering incentives, be sure to research local implications for end users. For example, will accepting a stipend impact any government support that low-income participants may be receiving?
- International events and UX convenings are a good entry point for interested parties to meet at-risk users and begin to build trust. It is helpful for people outside the existing Internet Freedom (IF) community to understand what events exist, who attends each one, and what the purpose of each event is.
- Always be clear about what will be shared from meetings or gatherings (participant list, notes, attribution, etc.) and set clear ground rules from the beginning to foster a sense of trust among attendees.
- Co-design is most effective when it is implemented throughout the entire process. Utilize the co-design process not just for feature or tool development, but also for developing the larger feedback collection process.
- User personas allow design and development teams to understand users without requiring direct access or communication.
- User engagements or any feedback collection activities should always take place in a trusted environment or location.
- When designing alerts, it is important to consider the cross-cultural interpretation of the language/design. Images and graphics can be used to send a more universal message or serve users who may be illiterate.
- There is no universal catalog of usability failures in security tools, analogous to the Common Vulnerabilities and Exposures (CVE) catalog for security threats. Are there ways to identify and/or automate the testing of usability failures by borrowing from “chaos engineering”-style approaches (see https://en.wikipedia.org/wiki/Chaos_engineering#10-18_Monkey)?
- Reframe “edge cases” as “stress cases.” Account for how people operate under stress, as this is a more universally applicable approach. Though levels of stress can differ, all users face stress at some point.
Paper: Why people (don’t) use password managers effectively
Authors: Sarah Pearman, Shikun Aerin Zhang, Lujo Bauer, Nicolas Christin, and Lorrie Faith Cranor, Carnegie Mellon University
- Participants in the study used password managers built into web browsers or operating systems (12) or separately installed third-party password managers (9).
- They “were reluctant to use password managers for […] concerns over the single point of failure (i.e., the master password required to use password managers)” and were “reluctant to give up control of their passwords,” to the point of wanting to know each of their passwords themselves.
- Participants did not feel that their passwords were at much risk, and many did not know where “save this password?” prompts were coming from.
- Some participants who used third-party password managers reused old passwords as their master password, or kept copies of their master password in an email folder or on a notepad.
- The researchers recommended that password manager developers focus deeply on user experience design and usability testing.
- Another summary is available on cmu.edu.
Paper: “Something isn’t secure, but I’m not sure how that translates into a problem”: Promoting autonomy by designing for understanding in Signal
Authors: Justin Wu, Cyrus Gattrell, Devon Howard, and Jake Tyler, Brigham Young University; Elham Vaziripour, Utah Valley University; Kent Seamons and Daniel Zappala, Brigham Young University
- The researchers mapped out all the flows in the application that trigger users to confirm the identity of other chat participants, known as the “authentication ceremony”.
- The participants in the study were paired up and given different scenarios that prompted the need to go through the authentication ceremony; different designs for this process were tested.
- Overall, participants did not understand the process well, and they rarely got around to it organically.
- The alternative design developed by the researchers was more effective than the existing designs. Among the alterations they suggested were letting the user know they may want to exchange security numbers in person or through another channel; explaining what it means when the numbers don’t match; and getting rid of the “this number has been verified” toggle.
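The “matching numbers” at the heart of the ceremony are short, human-comparable digests derived from both parties’ identity keys. Here is a minimal sketch of that idea (illustrative only; Signal’s actual safety numbers use iterated SHA-512 over each identity key and produce 60-digit strings, not this simplified hash):

```python
import hashlib

def safety_number(key_a: bytes, key_b: bytes, digits: int = 12) -> str:
    """Derive a short, human-comparable number from two public keys.

    Simplified illustration only; not Signal's real algorithm.
    """
    # Sort the keys so both parties compute the identical number,
    # regardless of whose key is listed first.
    material = b"".join(sorted([key_a, key_b]))
    digest = hashlib.sha256(material).digest()
    number = int.from_bytes(digest, "big") % (10 ** digits)
    return f"{number:0{digits}d}"

# Hypothetical key material for illustration.
alice_key = b"alice-identity-public-key"
bob_key = b"bob-identity-public-key"

# Both sides derive the same number; if the numbers each party's app
# displays differ, a man-in-the-middle may have substituted a key.
assert safety_number(alice_key, bob_key) == safety_number(bob_key, alice_key)
```

Because the number is a function of both identity keys, comparing it out of band (in person or over another channel, as the redesign suggests) is what actually rules out a substituted key.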
Paper: Personal Information Leakage by Abusing the GDPR “Right of Access”
- The researchers evaluated the GDPR identity verification process of 55 organizations in finance, entertainment, retail, and other sectors. They attempted to impersonate targeted individuals whose data these organizations process, using only forged or publicly available information extracted from social media and other public data sources.
- For 15 out of the 55 organizations, the researchers were able to successfully impersonate a subject and obtain full access to their personal data.
- The leaked personal data contained a wide variety of sensitive information, including financial transactions, website visits, and physical location history.
- The researchers shared recommendations for consumer and organizational practices to limit leaking personal data, such as requiring login credentials, strictly verifying ownership of email addresses, and calling subjects who request specific user data.
Paper: Cooperative Privacy and Security: Learning from People with Visual Impairments and Their Allies
Authors: Jordan Hayes, Smirity Kaushik, Charlotte Emily Price, and Yang Wang, Syracuse University
- Researchers shadowed people with visual impairments and their allies (e.g. family members, friends and professional support) to understand how they protect their privacy and security.
- Existing tools are designed for individuals and focused on independence, often lacking accessibility support.
- Multifaceted disability identities need to be considered in design and research. Participatory action research and design can support inclusive design practices that address otherwise marginalized groups.
- Allies (friends, family, support staff) are important for privacy/security.
- We need to think about designs that allow for cooperative privacy & security, rather than just individual-centric use cases.
- For more, see the Privacy for All Project.
Paper: Privacy and Security Threat Models and Mitigation Strategies of Older Adults
Authors: Alisa Frik, International Computer Science Institute (ICSI) and University of California, Berkeley; Leysan Nurgalieva, University of Trento; Julia Bernd, International Computer Science Institute (ICSI); Joyce Lee, University of California, Berkeley; Florian Schaub, University of Michigan; Serge Egelman, International Computer Science Institute (ICSI) and University of California, Berkeley
- The researchers did semi-structured interviews with 46 participants from senior centers and senior residences in the San Francisco Bay Area, aged 65-95.
- The themes of the interviews were common security and privacy concerns, threat models, behaviors and strategies to mitigate perceived risks, usability issues with current protections, learning and troubleshooting approaches, and misconceptions regarding security and privacy.
- They found that older adults had attitudes and concerns similar to those of the general population, but faced amplified risks, particularly due to complex trade-offs among privacy, safety, and autonomy. Older adults’ misconceptions about data flows led to blind spots in their mitigation strategies, and the difficulty they experienced in using technology decreased their self-efficacy around privacy and security.
Paper: “I was told to buy a software or lose my computer. I ignored it”: A study of ransomware
Authors: Camelia Simoiu, Stanford University; Christopher Gates, Symantec; Joseph Bonneau, New York University; Sharad Goel, Stanford University
- The researchers ran a detailed survey of a representative sample of 1,180 American adults.
- Based on that, they estimated that 2–3% of respondents were affected by ransomware over a one-year period (2016–2017).
- The average ransom amount demanded was $530, and only a small fraction of affected users (about 4% of those affected) reported paying.
- Cryptocurrencies were typically only one of several payment options, which suggests that they may not be a primary driver of ransomware attacks.
- The paper includes a proof-of-concept method for ransomware risk-assessment based on self-reported security habits.
Paper: From Usability to Secure Computing and Back Again
Authors: Lucy Qin, Andrei Lapets, Frederick Jansen, Peter Flockhart, Kinan Dak Albab, and Ira Globus-Harris, Boston University; Shannon Roberts, University of Massachusetts Amherst; Mayank Varia, Boston University
- Researchers worked with the Boston Women’s Workforce Council on technical strategies for an open data effort around gender wage equity in Boston.
- They used multi-party computation (MPC) techniques to blind and anonymize sensitive wage data, allowing companies to share data securely and privately while still supporting a public effort to collect city-wide data for a large study.
- Their platform for data collection focused on the usability of privacy-enhancing technologies (MPC).
- They focused not only on the security and privacy of the data, but also on developing analytics tools in privacy-enhanced ways.
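The core primitive behind this kind of MPC aggregation is additive secret sharing: each company splits its value into random shares that reveal nothing individually and only produce the city-wide total when combined. A minimal sketch follows (an illustration of the general technique under simplified assumptions, not the BWWC platform’s actual protocol):

```python
import secrets

MODULUS = 2**64  # all shares live in the ring of integers mod MODULUS

def share(value: int, n_parties: int) -> list:
    """Split a value into n additive shares that sum to it mod MODULUS."""
    shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

def reconstruct(shares: list) -> int:
    return sum(shares) % MODULUS

# Each company splits its payroll figure; each server receives one share
# per company. Any single server's shares look uniformly random, so
# only the combined sums reveal anything, and then only the aggregate.
companies = [410_000, 275_500, 998_250]  # hypothetical wage totals
all_shares = [share(w, 3) for w in companies]
server_sums = [sum(col) % MODULUS for col in zip(*all_shares)]
aggregate = reconstruct(server_sums)
assert aggregate == sum(companies)
```

The usability work sits on top of this kind of primitive: companies only interact with a familiar web form, while the sharing and reconstruction happen behind the scenes.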