We are an international team with expertise in research, design, software development, and product management, supported by a diverse set of advisors and partners.
Project: Care & IoT
Sarah extended her research program on infrastructures of care by examining how Internet of Things (IoT) technologies — that is, ubiquitous networked objects imbued with computational capacity — are newly shaping intimate bodily experiences. Computing technologies for tracking the body are on the verge of being introduced broadly across public and private spheres and are poised to have unique and lasting effects on the privacy and autonomy of those being sensed, as well as on the responsibility of workers entrusted with handling such data. For example, hygiene product manufacturers boast that new models of soap dispensers can track the number of times one washes one’s hands during the day, correlating this information with employee break records. By investigating such technologies as they are being newly deployed, Sarah seeks to engage what computing researchers Paul Dourish and Genevieve Bell (2011) call the “mess” of ubiquitous computing as it unfolds — focusing on how these systems shape experiences on the ground.
Sarah Fox is a Postdoctoral Scholar at the University of California, San Diego in the Department of Communication and The Design Lab. Her research focuses on how technological artifacts challenge or propagate social exclusions, by examining existing systems and building alternatives.
Dan Hassan & Mu
Project: Dark Crystal
Dark Crystal is a secret sharing protocol and prototype application. It takes a novel approach to Shamir's Secret Sharing algorithm, combining narrative, human relationships, and ritual with new peer-to-peer technologies to achieve distributed secret persistence. A living prototype has been built on top of the Secure Scuttlebutt stack, a peer-to-peer offline-first gossip protocol, database, and mesh network. Dark Crystal posits that this persistence model is transformative when applied to the wicked problem of effective and secure key management. In its prototype form, Dark Crystal extends the Scuttlebutt ecosystem, enabling peers to securely back up secrets (private keys, seeds, passwords, and other sensitive information) using the trust in their social network. It is a resilient system of collective data ownership. Through the residency, Dan and Mu focused on the usability of Dark Crystal and on how to regularly build human-centered design into their team’s development practice.
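The k-of-n threshold idea behind Shamir's scheme — any k shares recover the secret, fewer reveal nothing — can be illustrated in a few lines. This is a minimal sketch of the algorithm itself, not Dark Crystal's actual implementation, and the prime and parameters here are illustrative only:

```python
import random

PRIME = 2**127 - 1  # a prime large enough for small demo secrets

def make_shares(secret, k, n, prime=PRIME):
    """Split `secret` into n shares such that any k of them recover it.

    The secret is the constant term of a random degree-(k-1) polynomial;
    each share is a point (x, f(x)) on that polynomial.
    """
    coeffs = [secret] + [random.randrange(prime) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, prime) for i, c in enumerate(coeffs)) % prime
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares, prime=PRIME):
    """Lagrange interpolation at x=0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % prime
                den = (den * (xi - xj)) % prime
        secret = (secret + yi * num * pow(den, -1, prime)) % prime
    return secret
```

For example, `make_shares(12345, k=3, n=5)` yields five shares to distribute among friends; any three of them passed to `recover` reproduce 12345, while any two are statistically useless — the social-backup property Dark Crystal builds on.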
dan hassan (he/him) is a queer white-passing descendant of indo-caribbean indenture and founder of blockades.org [UK]. he’s an open-source hacker with solarpunk tendencies, active in autonomous co-operatives, blockchain research/development, and big (enough) data analytics. currently building dark crystal, a fun decent(ralised) peer-to-peer utility for securing secrets with friends in cypherspace and beyond. he is a member of dyne.org think & do tank [NL], a non-profit foundation with more than 15 years of expertise in social and technical innovation.
Project: Feed.me

Increasingly, the applications we use in our day-to-day lives are driven by algorithms that are hard to understand and are optimised for the end goals of the developers (e.g. advertising money, time-on-site) rather than those of the users (e.g. keeping in touch with family, finding a new job). Feed.me is an experiment to see what happens when we give users more control over the algorithms that control their feeds. Can we allow a grandparent to look at photos of their kids without being shown political news, allow a trauma victim to filter out things that are specifically triggering for them, or allow an electrician who was recently made redundant to use their feed to find leads on new jobs rather than waste time they could spend pursuing their online degree? Technically, all of these things are well within reach, but they don’t line up well with the goals of the organisations that provide our social media services. Feed.me hopes to leverage the power of the browser to put the user in control, with easy-to-understand “plug and play” algorithms that work across all their social media. Guided by user research completed during the residency, the first concrete goal is to implement a browser extension that allows the re-ordering of posts viewed through the browser, without directly interacting with the APIs of the social media services.
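The kind of user-controlled "plug and play" rule Feed.me imagines might look like the toy re-ranker below. Everything here — the field names, the rule parameters — is illustrative, not Feed.me's actual API; a real version would operate on the rendered page from a browser extension:

```python
# A toy user-controlled feed re-ranker: hide blocked topics, float
# boosted authors to the top, keep newest-first order within each group.
# All names are hypothetical, for illustration only.

def rerank(posts, blocked_topics=(), boosted_authors=()):
    kept = [p for p in posts if p["topic"] not in blocked_topics]
    # Sort key: boosted authors first (False sorts before True),
    # then newest first within each group.
    return sorted(kept, key=lambda p: (p["author"] not in boosted_authors,
                                       -p["timestamp"]))

feed = [
    {"author": "news-bot",  "topic": "politics", "timestamp": 3},
    {"author": "grandkid",  "topic": "photos",   "timestamp": 1},
    {"author": "recruiter", "topic": "jobs",     "timestamp": 2},
]

# The grandparent scenario: no political news, family first.
reordered = rerank(feed, blocked_topics={"politics"},
                   boosted_authors={"grandkid"})
```

Here `reordered` contains the grandkid's post first and the job lead second, with the political post filtered out entirely — the point being that the rule is a few lines the user controls, not an opaque engagement-maximising model.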
Alex is a med student turned neuroscientist, now turning into a developer. He is currently a Mozilla fellow working on building tools and communities to facilitate the sharing of research.
Project: Unpredictable Things
Iohanna built on a project she started in 2017: Unpredictable Things. The project explored the boundaries of computer recognition in order to find new, resilient strategies for citizens to keep their privacy. The next step explores visibility: What does it mean to be recognised by AI systems? How do we change our lives to comply with algorithmic standards? These topics are addressed and discussed through a critical design installation: the Unrecognized Collection, a museum-type collection of Things (including people, patterns, and voices) that are not yet recognised by algorithms — machine vision, voice recognition, and so on. The particularity of this collection is that it will shrink every year, as AI learns to recognise more and more. During the residency, Iohanna prototyped a speculative future experience involving a smart toilet.
Iohanna Nicenboim is a designer and researcher focusing on artificial intelligence and IoT in everyday life. Through design fictions she highlights social and ethical issues we might encounter in the future. She recently worked for the Connected Everyday Lab at TU Delft and is a ThingsCon fellow.
Project: HTTPS Everywhere
With the widespread adoption of free HTTPS certificates, the majority of websites are now encrypted. This means someone visiting an HTTPS site should have privacy from interception of their connection to the site, protection from the site being tampered with by malicious actors, and assurance that the site they are on is the site they intended to connect to. Yet there’s a lot to be done to ensure that people are protected as they browse, and that they are always using the protection afforded by HTTPS. The Electronic Frontier Foundation’s HTTPS Everywhere browser extension makes sure that you are always using HTTPS, and that your encrypted connection is intact and not undermined. HTTPS Everywhere is a prominent tool, pre-bundled in Tor Browser, with over two million users at last count. However, there are areas of growth for its usability that could make it easier for more people to adopt the tool. For the residency, Soraya focused on improving user flow, paying particular mind to international, non-technical, and beginner users, and factoring in accessible design practices to include people with disabilities.
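The core rewrite HTTPS Everywhere performs — upgrading `http://` requests to `https://` for sites known to support it — can be sketched in a few lines. The real extension uses per-site rulesets with regex rewrites and exclusions; the host list here is purely illustrative:

```python
from urllib.parse import urlsplit, urlunsplit

# Toy stand-in for HTTPS Everywhere's rulesets: a set of hosts known
# to serve the same content over HTTPS. Illustrative only.
HTTPS_CAPABLE = {"example.com", "www.eff.org"}

def upgrade(url):
    """Rewrite an http:// URL to https:// if the host is known-capable;
    leave all other URLs untouched."""
    parts = urlsplit(url)
    if parts.scheme == "http" and parts.hostname in HTTPS_CAPABLE:
        return urlunsplit(("https",) + tuple(parts[1:]))
    return url
```

So `upgrade("http://example.com/login")` yields `"https://example.com/login"`, while URLs for unknown hosts pass through unchanged — a blunt illustration of why usable defaults matter: the user never has to think about the rewrite.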
Soraya Okuda is a designer passionate about educational access. Soraya is excited to support efforts in conveying technical concepts to beginners, creating accessible materials for at-risk and under-resourced groups, and building free security and privacy tools.
Project: Least Authority GridSync
Tiffany worked to overhaul and improve the UI and UX of the “Recovery Key” subsystem of Gridsync, a Tahoe-LAFS-based secure cloud storage client and GUI for desktop operating systems, currently used by Least Authority’s “S4” service. At its core, Gridsync is a Dropbox-like application that enables users to securely store and share folders across distributed storage servers; it employs robust client-side encryption and erasure-coding techniques to preserve the confidentiality, integrity, and availability of users’ data against malicious server-side actors. One of the biggest ongoing user-facing problems for Gridsync (identified via past usability testing and user feedback) is one shared by other applications that make heavy use of client-side encryption: because service operators are unable to read or modify user data by design, end users alone become responsible for preserving and safeguarding the cryptographic keys necessary to access their personal data. During the residency, Tiffany used paper prototyping to rethink the user experience of Gridsync, simplifying the key creation process and improving the overall usability of the application.
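The availability guarantee mentioned above comes from erasure coding: data is split so that a subset of stored chunks suffices to rebuild the whole. Tahoe-LAFS uses Reed-Solomon coding (k-of-n); the sketch below illustrates the idea with the simplest possible code, a single XOR parity chunk that survives the loss of any one chunk. This is a teaching toy, not Tahoe's actual coding:

```python
from functools import reduce

def xor(a, b):
    """Bytewise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def split_with_parity(data, k):
    """Split data into k equal chunks plus one XOR parity chunk.

    Any single missing chunk (data or parity) can be rebuilt from the
    remaining k — a 1-loss-tolerant stand-in for real k-of-n coding.
    """
    padded = data.ljust(-(-len(data) // k) * k, b"\0")  # pad to multiple of k
    size = len(padded) // k
    chunks = [padded[i * size:(i + 1) * size] for i in range(k)]
    return chunks + [reduce(xor, chunks)]

def recover_chunk(chunks, missing_index):
    """Rebuild the chunk at missing_index by XOR-ing the others together."""
    present = [c for i, c in enumerate(chunks)
               if i != missing_index and c is not None]
    return reduce(xor, present)
```

Because the parity chunk is the XOR of all data chunks, XOR-ing everything except the lost chunk yields the lost chunk back; real erasure codes generalise this to tolerate many simultaneous losses across untrusted servers.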
Tiffany is a freelance designer, currently residing in Los Angeles, California, and specializing in information architecture, brand identities, and multi-platform design and production. Aside from a couple of corporate clients, her current work mostly revolves around the blockchain and distributed technology space. She’s a huge proponent of open-source technology and design and is passionate about supporting projects that promote internet freedom, security, and education.
Project: Privacy & GDPR Tools
Web and mobile tracking has gone off the rails, and there are limited tools for telling companies that we don't want to be profiled, monitored, or exploited. The web is currently dominated by impractical, annoying, borderline-legal cookie banners, while mobile apps siphon data to big companies without us even knowing. Privacy rules are poorly implemented, and a few big players exploit vast amounts of personal data. Users are seen as an economic asset of a company, and the imbalance of power has dramatically increased. Because of poor implementation, the General Data Protection Regulation (GDPR) is building up a bad reputation when, in fact, the root of the problem lies with companies incorrectly addressing privacy rules, especially when it comes to online opt-out mechanisms. The fundamental protection that users should benefit from online (the ability to have a choice and explicitly voice that they do not want to be tracked, profiled, or monitored) is shielded behind an artificial barrier, contrary to the original purpose of the law. Valentina worked on digging through the complexity of web and mobile third-party tracking, to design and prototype a tool that could be the first building block towards a user privacy control panel.
Valentina Pavel is a legal adviser and digital rights advocate focusing on privacy, freedom of speech, and open culture. Currently she is a Mozilla Fellow, hosted by Privacy International, exploring the concentration of corporate power, or “digital feudalism,” and its implications for privacy and data protection. Her most recent work focuses on third-party tracking.
Project: Dice Secrets
This project gives users the ability to generate and store strong secrets (~200 bits) used for authentication and encryption. Users roll 22-25 dice, lock them into a case to preserve their configuration, and then read them into an app via the camera, which can then perform cryptographic operations. The secret can be used as a last-resort authentication credential for recovering online accounts, or to back up the master encryption keys used by password managers, encrypted file systems, and electronic wallets. Unlike most trusted cryptographic hardware, these dice-based secrets are transparent to users: the users themselves manually generate their keys' randomness by throwing the dice, and, since all operations are deterministic, use open protocols, and can generate human-readable output, users can have independent implementations replicate the same operation to verify correctness. Also unlike hardware-based key stores that rely on circuits and batteries, dice last for decades, are resistant to many environmental factors that destroy circuits (radiation, magnetism, and moisture), require only a camera to read, and can be read by humans if slightly degraded.
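The ~200-bit figure can be sanity-checked with a back-of-the-envelope entropy calculation. The model below is an assumption about one plausible design, not the project's specification: 25 distinct six-sided dice, where which die lands in which position of the case carries information (25! arrangements), and each die additionally shows one of 6 faces in one of 4 rotations (24 states per die):

```python
from math import log2, factorial

# Assumed (illustrative) design: 25 distinct dice in a 5x5 case.
n = 25

# Entropy from which die sits in which position of the case.
arrangement_bits = log2(factorial(n))

# Entropy from each die's upward face (6) and rotation (4).
per_die_bits = n * log2(6 * 4)

total = arrangement_bits + per_die_bits
print(round(total))  # prints 198 — consistent with the ~200-bit claim
```

The two terms contribute roughly 84 and 115 bits respectively; even a more conservative model (fewer dice, or ignoring arrangement) stays far beyond the strength of a human-memorable password, which is the point of physical key material.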
Stuart Schechter has researched computer security and human behavior, and has occasionally gotten lost in such distant topics as computer architecture and research ethics. He is currently leading a small start-up, under the assumption that it’s hard to cause much harm to a venture that is already statistically expected to fail.
Project: Ethical Mechanical Turk
For Underexposed, Caroline explored machine learning through the lenses of transparency and ethics. This work aligns with an essay Caroline is writing that defines a methodology for designing for transparency in machine learning. For the residency, she started prototyping an ethical mechanical turk system: ethical in the sense that it will allow for more transparency in who trains and labels a data set, by allowing the trainers to be credited as authors, and by saving data about the data set. This 'data' about the data set includes who labeled or trained it, where they are from, when the data set was 'made' or finished, and what is in the data set itself (is it images of certain kinds of faces, etc.). The project explores how design can function as an aid towards transparency in product design and critical design for machine learning.
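The "data about the data set" Caroline describes could be captured as a small structured record travelling with each data set. This is only a sketch of that idea — the field names are illustrative, not a schema from the project:

```python
from dataclasses import dataclass, field

# Hypothetical provenance record for a labeled data set, mirroring the
# fields described in the text. Field names are illustrative only.
@dataclass
class DatasetRecord:
    name: str
    labelers: list = field(default_factory=list)   # who labeled/trained it (credited as authors)
    labeler_locations: list = field(default_factory=list)  # where the labelers are from
    completed: str = ""                            # when the data set was 'made' or finished
    contents: str = ""                             # what is in the data set itself

record = DatasetRecord(
    name="faces-v1",
    labelers=["A. Worker", "B. Worker"],
    labeler_locations=["Nairobi", "Manila"],
    completed="2019-01",
    contents="portrait photos of consenting adults",
)
```

Attaching such a record to a data set makes the labelers visible as authors rather than anonymous piecework — the transparency move the project is prototyping.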
Project: SecureDrop
Nina has been working with the Freedom of the Press Foundation (FPF) since August 2018 on both user research and design for their new, Qubes OS-based Journalist Workstation. The new Workstation project is the most significant endeavor to date for SecureDrop under FPF’s stewardship. It is also the first time the FPF team has involved UX contributors from early discovery through launch. As noted above, Nina wasn’t able to participate in the residency in January, but we hope to have her in Berlin soon to support her work on SecureDrop. Nina is focusing on improving the UX of the Qubes Journalist Workstation, designing post-submission journalist tools, and creating generalized support materials to guide volunteer contributors and FPF staff developers in shaping SecureDrop to be the most user-centric resource possible for whistleblowers (“Sources” in journalism nomenclature) and journalists worldwide.