Superbloom

Would you like to hear how the move from deceptive design to trusted patterns is critical to our shared future?  

Would you like to learn more about how to make encrypted apps more inclusive and accessible?  

Do you need on-the-spot design and UX advice to improve your tools?

Simply Secure team members will be joining six sessions during RightsCon 2022: June 6-10. We’d love to see you, learn about your work, and collaborate. Come join us!

You can register for RightsCon here, and each session we are participating in is linked below.

How ‘doing design’ is colonial within humanitarian and human rights work

  • Add to my sessions
  • When: June 7 @ 8.15 AM EDT | 2.15 PM CEST
  • Who: Eriol Fox, Ngọc Triệu
  • What: Community Lab
  • The way design is practiced in both professional and academic spheres has a history of colonial power. Designers, often from western countries, visit underrepresented majority communities to ‘do’ co-design, participatory design, and other methods with people, and in doing so further embed these colonial power dynamics: one party assumes the position of ‘researcher’ or ‘designer’ while the others assume the position of ‘subject of research’. Though there are growing methodologies, resources, and organizations aimed at balancing the power in human-centered design and research, the scales still feel tipped in favor of colonial powers. How can we come together as a humanitarian, human rights, technology, and design community at RightsCon to better understand how we and our organizations participate in this imbalance, and work together to understand how future approaches to human-centered design and research can be practiced with equity as the priority?

Our social network: intersectional feminist tech advocacy for journalist safety

  • Add to my sessions
  • When: June 7 @ 9.30 AM EDT | 3.30 PM CEST
  • Who: Ame Elliott
  • What: Community Lab
  • Violence experienced by journalists online disproportionately impacts women, POC and LGBTQIA+ journalists. Existing efforts by tech platforms to keep journalists safe do not address this reality. In this community lab, led by The Coalition Against Online Violence, we want to hear from journalists, digital safety experts, human rights activists, and even the platforms themselves, about how we can co-design an intersectional approach to research and combat online violence against journalists by leveraging our collective experiences. We hope to collect regional case studies and examples on online abuse, finding solidarity in shared challenges and identifying successes in one region that could be ported to other regions. Ultimately our aim is to connect people from different regions around these shared issues to find solutions to combat online violence. We will facilitate breakout rooms in English, Spanish and French, and welcome everyone to participate in whichever language they are most comfortable.

    This community lab will be facilitated by The International Women’s Media Foundation, ARTICLE 19, Simply Secure, and Vita Activa, who are members of the Coalition Against Online Violence. The COAV is a working group of 60+ global organizations working to find better solutions for women journalists facing online abuse, harassment and other forms of digital attack. The Coalition is funded by Craig Newmark Philanthropies and founded by the International Women’s Media Foundation.  

A UX clinic: how to improve digital tools with human rights-centered design principles

  • Add to my sessions
  • When: June 7 @ 1.15 PM EDT | 7.15 PM CEST
  • Who: Human Rights Centered Design Community
  • What: Social Hour
  • Seeking and internalizing community feedback to improve digital tools can be expensive and challenging. This clinic-style workshop invites tool teams to join a dialogue with design and user experience (UX) experts who have rich experience working in civil society, to explore a community-centered, rights-protecting approach to gathering feedback.

    The workshop takes the format of a speed-dating-style focus-group discussion. Participants are encouraged to bring the projects or tools they are working on, along with the questions or challenges they are facing. Design and UX experts and Human Rights-Centered Design professionals will provide immediate design and UX advice to improve the tools and further protect the communities they serve. The workshop also welcomes designers and UX professionals to discuss and develop a holistic, human rights-driven design approach. We hope the clinic will spark ideas and open up opportunities for collaboration and long-term support.

    Human Rights-Centered Design is an emerging design approach and way of thinking. It could change how UX research and development are done for digital tools. Yet it is still at an embryonic stage, where it needs plenty of nourishment (exchanges of ideas, especially from communities who are impacted by digital tools) to grow. We hope participants come to this workshop with an interest in Human Rights-Centered Design (HRCD), and that through small-group discussions they leave the session connecting the HRCD approach and principles with their own experience of using and developing digital products, platforms, and services.

The web we want: taking action to stop deceptive designs and promote trusted design patterns

  • Add to my sessions
  • When: June 8 @ 12.30 PM EDT | 6.30 PM CEST
  • Who: Web Foundation, 3x3 and Simply Secure plus panellists: 
    Dries Cuijpers (Senior Enforcement Officer, The Netherlands Authority for Consumers and Markets)
    Estelle Hary (Designer, CNIL - Commission Nationale de l’Informatique et des Libertés)
    Dr Jennifer King (Privacy and Data Policy Fellow, Stanford)
    Nnenna Nwakanma (Chief Web Advocate, Web Foundation)
    Khoi Vinh (Senior Director of Product Design, Adobe)
  • What: Panel
  • Since the formal launch of the Web Foundation’s Tech Policy Design Lab in November 2021, we have gathered evidence of the potential harms of deceptive designs (often referred to as ‘dark patterns’), including who these practices impact the most, and how they affect the most marginalized communities in particular. Over the past months, we have brought together experts from companies, governments and civil society to collaborate constructively to develop policy and product solutions and co-create alternatives for more ethical, empathetic, trusted design that puts people and their needs first.

    Join us as we speak with experts from across the world who will discuss how the move from deceptive design to trusted patterns is critical to our shared future. Learn how to participate in the network of civil society, industry, and government practitioners who will carry the work forward from the Tech Policy Design Lab. 

Dismantling deceptive design: global policy design workshop for developing trusted design patterns

  • Add to my sessions
  • When: June 9 @ 9.30 AM EDT | 3.30 PM CEST
  • Who: Web Foundation, 3x3 and Simply Secure (Georgia Bullen, Ame Elliott)
  • What: Community Lab
  • Everything we do online is influenced by how the tools we use are built. Deceptive designs (known as ‘dark patterns’) built into user interfaces alter decision-making or trick users into taking actions they might not otherwise take. While some countries have been focusing on this issue, more needs to be done to ensure that everyone is protected. That is why our work takes a more global scope. Through a series of policy design workshops, we are bringing together stakeholders across sectors and learning directly from those affected by technology to collaboratively redesign our digital spaces. Our facilitators for the session are from India, Nigeria and the United States. We will be sharing evidence of the potential harms of deceptive designs, including who these practices impact the most and how they affect the most marginalized communities.

    We believe that people’s experiences must drive policy and product design, and solutions must take into account the full diversity of those who use digital tools. Let’s come together to share diverse experiences from around the world about the dangers of deceptive design, how communities counter them, and what we can do as individuals and policymakers to combat these practices globally.

    Let’s build a safe, empowering, and secure web. A web we all can trust.  

Encrypted apps that leave no one behind

  • Add to my sessions
  • When: June 9 @ 1.30 PM EDT | 7.30 PM CEST
  • Who: Georgia Bullen
  • What: Panel
  • Encrypted apps are a powerful tool for enabling secure communication amongst human rights defenders globally. Yet most existing technologies have challenges when it comes to inclusivity and accessibility, be that failing to meet the needs of people with disabilities, to support different languages, or to function in contexts of unstable internet access.

    This RightsCon panel session builds on the Encrypted Apps Action Coalition under Tech for Democracy, which aims to bring together relevant stakeholders to ensure that accessible encrypted communications technologies exist for all people who might wish to communicate privately. The panel will discuss the existing needs and the work of the Action Coalition thus far, and allow for questions and input from participants.