Usability and Security: Not Binary Properties

People who think about computer security for a living sometimes cringe when they read about the subject in the popular press. Security is a complex and nuanced topic, and it’s easy to make assertions that don’t hold up to careful scrutiny.

One basic-but-unintuitive principle is that security is not a binary property: in the absence of other context, it’s hard to definitively say that a particular system or piece of software is “secure” or “insecure”. We can only say that a system is secure against a particular threat, or – more usefully – against a collection of threats, known as a “threat model”.

Justitia, Tehran Courthouse. Image CC BY-SA 3.0, Abolhassan Khan

For example, some people might say that using a VPN while browsing the web from a coffee shop is “secure”, because it prevents the jerk across the street with a cantenna from listening in and seeing what websites you go to. But if your threat model includes listeners with devices housed with internet service providers (or a government that operates VPNs), you might instead refer only to an option like Tor as “secure”.

As someone who has spent a lot of time thinking about security, I find it tempting to dismiss things as “insecure” when they don’t protect against the threats that I’m personally concerned about. Go too far down that path, though, and we find ourselves in a world where only the products that protect against the most extreme threats are considered acceptable. As with transportation safety and public health, we have to recognize that getting people to adopt a “good enough” solution – at least as a first step – is usually better than having them not change their behavior at all. In other words: it’s important to not let the perfect be the enemy of the good!

Just as security is not a binary property, usability is not an all-or-nothing game. Design thinking encourages us not just to ask whether humans in general find a piece of software usable, but to explore 1) the circumstances in which different groups of users might be motivated to use the software, and 2) the needs that the software must meet in order to sustain that motivation.

I think that this distinction is particularly important for software developers to bear in mind. It’s easy to get discouraged when someone tells you that the code you’ve slaved over “isn’t usable”. (Or to get defensive – after all, there are plenty of people who seem to find it usable enough, or there wouldn’t be anyone to file all those feature requests.) I challenge you instead to dig deeper, and try to understand exactly what the user found frustrating about their experience, and what expectations they brought to the software that may be mismatched with the assumptions you made in designing it.

Just as we can only say that software is “secure” against certain threats, so too must we define “usability” as a function of particular users with particular needs, backgrounds, and expectations. Working to understand those users will ultimately help our community build better software.


Encryption is not for terrorists

Recent attacks by Daesh in Turkey, Egypt, Lebanon, and Paris have fanned the flames of an ongoing debate about software that is resistant to surveillance. It seems that some participants in that debate are trying to use these attacks as an excuse to drum up fear around end-to-end encryption. They argue that these events tell us that the general citizenry shouldn’t have access to strong privacy-preserving tools. A lot of people are saying a lot of smart things on the subject, but I want to briefly outline a couple of ways in which this call for limiting encryption is problematic.

On Trust and Transparency: Perspectives from Luminate's portfolio

In June 2018, Luminate commissioned Simply Secure to conduct human-centered design (HCD) research focused on uncovering grantees’ experiences of the funding process. The report highlights insights, feedback (including anonymized quotes and comments), and recommendations synthesized from 20 interviews and 53 survey responses.

When User News Is Bad News: Tactical Advice On User Feedback

When you're putting your heart and soul into designing, building, or improving a piece of software, tuning in to feedback from users can sometimes get you down. Imagine waking up one morning and finding your project is being mentioned on Twitter in a slew of messages like these:

“Thanks Snapchat. Your app officially sucks.” — Michael (@michaellorelei) April 21, 2016

“The Facebook app sucks” — em (@emma0wczarzak) April 19, 2016

“You know, the YouTube app really kind of sucks.”