Comfortable UX, Not Just Open APIs

Simply Secure focuses its collaborative efforts on open-source, privacy-preserving software projects. In my conversations with designers, developers, and end users, I'm often struck by a divergence in their understanding of what "openness" means in software. For example, last December during a user study, participants reading app store descriptions of secure messaging apps consistently thought that "open source" meant that their messages were public.

The distinction between "source code" and "content generated in apps" isn't always clear to a mass audience, and this confusion has implications for privacy preservation. Many people know that their Facebook login credentials give them access to other services and apps, as "Login With Facebook" is common on everything from babysitter finder Urban Sitter to dating app Tinder. However, most people don't understand that connecting Facebook functionality to other apps often allows their personal data to flow between services as well.

Provocative services like Swipebuster, whose creator says they intended to build awareness about privacy issues, illustrate the confusion many people experience about what "open" and "public" mean in the context of the apps they use.

Is Tinder data public?

Swipebuster allows you to search for Tinder profiles based on certain criteria. Armed only with a first name, age, approximate location, and $5, you can get a list of matching Tinder users, complete with their photos, last logins, and whether they're looking for men or women.

"There is too much data about people that people themselves don't know is available," the anonymous creator told Vanity Fair. "Not only are people oversharing and putting out a lot of information about themselves, but companies are also not doing enough to let people know they're doing it."

According to Vanity Fair, Tinder responded that "searchable information on the Web site is public information that Tinder users have on their profiles. If you want to see who's on Tinder we recommend saving your money and downloading the app for free."

Paying money to not deal with Tinder

Let's put aside for a moment the potential moral or social implications of trying to access Tinder data for purposes other than finding a date, and explore why Swipebuster is so disturbing. I believe that it resonates as an example of the cultural divergence around openness because it occupies a middle ground between two undesirable options. Engaging with Tinder directly as a user – installing the app to see if someone else is using it – can be undesirable if you don't want others to know you're browsing Tinder data (e.g., because you're already in a romantic relationship). The apparent openness of the Tinder APIs presents a nice alternative if you can write code to query them, but this path requires a level of technical knowledge that makes it all but impossible for most people. $5 seems like a bargain when compared to learning Python from scratch or making your neighbors think you're cheating on your spouse.
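To make that technical gap concrete, here is a minimal sketch of what "writing code to query the API" actually involves. The endpoint, parameters, and field names below are invented for illustration; Tinder's real API is undocumented, and this is not it.

```python
# Hypothetical sketch of the work hiding behind "just query the open API":
# fetch raw JSON from an endpoint, then filter the records yourself.
# The URL and field names are invented for illustration.
import json
import urllib.parse
import urllib.request


def fetch_profiles(location):
    """Download raw profile records near a location (hypothetical endpoint)."""
    query = urllib.parse.urlencode({"location": location})
    url = "https://api.example.com/v1/profiles?" + query
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


def match_profiles(profiles, first_name, age):
    """Filter records the way a Swipebuster-style search would."""
    return [
        p for p in profiles
        if p.get("name") == first_name and p.get("age") == age
    ]
```

Even this toy version assumes you can install a language runtime, read JSON, and debug HTTP errors. For most people, a web form with a $5 charge is the more comfortable interface to exactly the same data.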

Swipebuster is what I would call a comfortable user experience. A comfortable UX opens the possibility of accessing data to a broad spectrum of users. An open API may make the underlying app possible, but data alone isn't enough for users to engage. In the case of Swipebuster, filling out a form with a credit card is an easy, routine experience for someone trying to get information. A $5 price adds another barrier, but it is a familiar interaction.

This ease contrasts with Tinder users' perceptions. Many consider the data they share with the dating service to be confidential on some level. The existence of Swipebuster makes many Tinder users feel shocked and vulnerable.


As The Guardian's Alex Hern writes, "Even if it might seem obvious that Tinder, a site which works by showing name, gender, age and location to strangers, doesn't consider that information secret, it's a very different matter to be confronted with a searchable database of that information. Your home is not secret, for example – people see you come and go all the time – but that doesn't mean posting your address online is advisable."

What happens in Vegas does not stay in Vegas

The Las Vegas tourist bureau has lured visitors with the promise that "what happens in Vegas stays in Vegas," encouraging people to engage in behavior they wouldn't want to be associated with in other contexts. By extension, people may believe that what happens in an app like Tinder stays in that app. It may be more realistic to assume that everything that happens in an app is ultimately accessible on the open web.

Along with confusing permissions, badly-communicated or poorly-designed API integration can be another vector of privacy risk. Vice (and Tinder itself, as quoted above) describe Tinder's web-based API as "open" or "public", but on closer examination it is actually what software developers call "undocumented" or "private". This means that Tinder's developers probably neither intended the API to be used by third parties nor made particular efforts to prevent such use; Swipebuster is the result of what many would call "reverse engineering". Part of the shock of Swipebuster is that it shows security through obscurity isn't working. Whatever social contract of mutual accountability might work when a Tinder user encounters someone they know in the app, or when an app's interface limits access to the data contained within, doesn't hold when it's easy to pay for information from a database.

Building more comfortable experiences

At Simply Secure, we want better privacy-preserving tools that empower people to protect their data. Although Swipebuster may occupy shaky moral ground, it demonstrates that a good UX will always enable more people to access data than an open API (even in cases where the API is more elegant). Programmatic interfaces aren't enough; to get people truly engaged, we need initiatives that work to open not only data, but also to build truly comfortable experiences for non-expert users.


Encryption is not for terrorists

Recent attacks by Daesh in Turkey, Egypt, Lebanon, and Paris have fanned the flames of an ongoing debate about software that is resistant to surveillance. It seems that some participants in that debate are trying to use these attacks as an excuse to drum up fear around end-to-end encryption. They argue that these events tell us that the general citizenry shouldn't have access to strong privacy-preserving tools. A lot of people are saying a lot of smart things on the subject, but I want to briefly outline a couple of ways in which this call for limiting encryption is problematic.

Victims of Success: Dealing With Divergent Feature Requests

Rather than view feature requests as a set of highly-divergent signals, it can help to try and group requests based on the underlying need that they speak to.

Usability and Security: Not Binary Properties

People who think about computer security for a living sometimes cringe when they read about the subject in the popular press. Security is a complex and nuanced topic, and it’s easy to make assertions that don’t hold up to careful scrutiny. One basic-but-unintuitive principle is that security is not a binary property: in the absence of other context, it’s hard to definitively say that a particular system or piece of software is “secure” or “insecure”.