Developing tech for good
29 October 2025

In our everyday lives, technology plays a fundamental role in how we connect and collaborate. Unfortunately, news headlines often highlight unethical practices and standards set by major tech companies. These stories raise questions for those developing new technologies: what are the standards we should strive for? Where can we find inspiring examples? One promising answer lies in Public Interest Technology (PIT) – which we can define as “the study and application of technology expertise to advance the public interest, generate public benefits, or promote the public good.”
While we have previously highlighted organizations working in this field, this blog dives deeper into what practical lessons we can learn from PIT. Drawing from our own experiences and the wider ecosystem, we want to share design principles and inspiring real-life examples to help guide ethical and impactful tech development.
Public Interest Technology
Before diving into the design principles, we want to put forward a few “ground rules” that we see as essential guidelines for technology development.
First, while technology is often associated with computer science and engineering, PIT stresses the importance of an interdisciplinary approach. The technologists of the future should be able to draw on insights from different fields: ethics to address moral questions about technology use, social sciences to understand how technology affects people and communities, and environmental studies to consider the impact on the environment.
Secondly, while ‘traditional tech’ focuses on speed, embracing these design principles often requires a willingness to take the slower, more thoughtful path. The Silicon Valley approach is often described as “move fast and break things”, while public interest technology argues for “move carefully and fix things”. The focus shifts to listening, learning, and continuous improvement. Especially for new and emerging technologies, such as artificial intelligence, these principles should not be overshadowed by the promise of accelerated progress. In an earlier blog post we also shared our lessons learned on taking this route.
Lastly, technology encompasses more than the product itself; it includes the deeper design choices around it. A recent report by Rathenau describes how design choices made ‘upstream’ strongly influence your choices ‘downstream’: factors such as your governance, business model, and vision shape your user interface and your data collection and processing, and eventually the outcomes for the user. For example, a business model that prioritizes personalized advertising may lead to design decisions focused on extensive data collection. A similar approach can be found in the Responsible Technology by Design Framework by IF, which identifies five layers in a digital product: UX/UI, technology and data, policy, organization, and society.
With these ground rules in mind, we can turn our attention to practical design principles that guide technology development from a user’s perspective.
Principles for responsible tech development
Although a definitive set may not exist, several principles come up again and again in discussions. For smaller companies, adopting public interest technology means balancing continuous, adaptive learning against limited resources. Still, by integrating responsible practices early on, you can lead by example and drive positive change within the industry.
Agency
Users want control over their experiences and choices within a technology so they can make informed decisions that best serve their needs. This includes real control over how they shape their interactions with the technology, ensuring that they are active participants rather than passive receivers.
A court case in early October emphasized this principle. Bits of Freedom, an organization that defends privacy and digital rights, sued Meta (the company behind Instagram and Facebook) because users had limited control over the order of the posts they saw. Instead of a simple, chronological timeline of posts from friends and followed accounts, Meta’s platforms automatically displayed recommended posts based on user profiles. Marked as a “dark pattern” by the court, Meta must now make it easier for users to choose and keep a timeline that doesn’t rely solely on personalized recommendations.
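To make the design choice concrete, here is a minimal sketch, in TypeScript with hypothetical types and field names (not Meta’s actual implementation), of what a user-controlled feed could look like: the ordering mode is an explicit, persistent preference set by the user, not a default silently imposed by the platform.

```typescript
// Minimal sketch of user-controlled feed ordering (hypothetical types; not
// any platform's actual implementation).
interface Post {
  id: string;
  author: string;
  createdAt: Date;
  relevanceScore: number; // assumed output of some recommender system
}

type FeedMode = "chronological" | "recommended";

// The agency principle in code: the user picks the mode, and that choice is
// respected instead of resetting to the platform's preferred default.
function orderFeed(posts: Post[], mode: FeedMode): Post[] {
  const sorted = [...posts];
  if (mode === "chronological") {
    sorted.sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime());
  } else {
    sorted.sort((a, b) => b.relevanceScore - a.relevanceScore);
  }
  return sorted;
}
```

The point is not the sorting itself but where the decision lives: `mode` comes from a setting the user controls, one that should persist between sessions.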
Privacy
Users expect their personal information to be protected and handled carefully, giving them a sense of safety in their interactions with the technology. Think of what personal data a company holds and why, but also whether it sells or shares user data. The privacy product reviews by Nothing Personal, the editorial platform by the Mozilla Foundation, for example, dive into a set of privacy design choices.
Mozilla, developer of the Firefox browser, exemplifies this commitment by sharing its approach to integrating AI without sacrificing user privacy. It explains how tools like text generation, translation, and AI chatbots operate directly on users’ devices instead of relying on cloud processing. This means that sensitive data stays local and is not sent to external servers, helping to keep user information secure. Tip! Also have a look at their podcast, where they dive into how AI is reshaping our future and the choices we have to shape it back.
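As an illustration of this on-device pattern (a sketch under our own assumptions, not Mozilla’s actual code), a library such as Transformers.js can run a translation model entirely on the user’s machine. We assume the `@xenova/transformers` package and the publicly available `Xenova/opus-mt-en-fr` model:

```typescript
// Minimal sketch of on-device inference with Transformers.js (assumed
// package: @xenova/transformers). The model weights are downloaded once;
// after that, all inference runs locally and the text never reaches a server.
import { pipeline } from "@xenova/transformers";

async function translateLocally(text: string): Promise<string> {
  // Load an English-to-French translation model into the local runtime.
  const translator = await pipeline("translation", "Xenova/opus-mt-en-fr");
  // The translation pipeline returns an array of { translation_text } objects.
  const output = (await translator(text)) as { translation_text: string }[];
  return output[0].translation_text;
}

translateLocally("This sentence never leaves my device.").then(console.log);
```

The privacy property here comes from the architecture, not from a policy promise: because inference happens locally, there is simply no server-side copy of the user’s text to protect, sell, or leak.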
Interoperability
For users, the freedom to easily integrate with and switch between different platforms without losing access to their data or functionality is important. Equally important is the ability to communicate across similar services (imagine if you could only email people who use the same email provider as you!). You can also look at interoperability through the entire stack: instead of locking data within a single platform, a practice commonly associated with big tech, interoperability allows using technology in a way that better meets the user’s needs and preferences. In many European alternatives, this principle is embedded from the start.
An example is ActivityPub, the protocol used by Mastodon, which allows different online platforms to communicate with one another, enabling users to share their work and engage with content across various sites. By adopting ActivityPub, scholars and educators, for instance, can collaborate without being limited to a single platform. This approach promotes greater access to academic resources and fosters a richer, more inclusive digital scholarly community.
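To show how little “speaking the same protocol” requires, here is a minimal sketch of reading a public ActivityPub actor profile. The request works identically against any compliant server (Mastodon, PeerTube, WriteFreely, and others); the account URL below is hypothetical.

```typescript
// Minimal sketch: fetching a public ActivityPub actor document. The shared
// vocabulary (actor, inbox, outbox) is what lets independent platforms
// interoperate.
interface Actor {
  id: string;
  type: string;
  preferredUsername: string;
  inbox: string;  // where other servers deliver activities for this actor
  outbox: string; // the actor's public activity stream
}

async function fetchActor(actorUrl: string): Promise<Actor> {
  const response = await fetch(actorUrl, {
    // ActivityPub objects are served as JSON-LD under this media type.
    headers: { Accept: "application/activity+json" },
  });
  if (!response.ok) throw new Error(`Failed to fetch actor: ${response.status}`);
  return (await response.json()) as Actor;
}

// Hypothetical account; any server that implements the protocol responds
// with the same structure.
fetchActor("https://mastodon.social/users/someuser").then((actor) =>
  console.log(actor.preferredUsername, actor.inbox)
);
```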
Transparency
Users want to understand how the technology they use works and how their data is handled in order to trust both the technology itself and its impacts on their lives. Transparency is often linked to open source; for example, Ghost is an inspiring non-profit organization that shares all its code publicly, enabling smaller publishers and independent creators to easily develop their own publishing platforms. However, transparency encompasses more than just code-sharing. It includes sharing data practices, decision-making processes, and potential effects on users. This openness helps build trust and understanding while encouraging collaboration and allowing others to build on shared ideas.
Signal shows its commitment to transparency by openly discussing the costs of its messaging app in the blog post “Signal is Expensive.” It explains what it takes to run a secure messaging service, covering expenses related to infrastructure, development, and privacy. This openness helps users understand how the technology works and why it costs money to maintain. By sharing these details, Signal lets users see the real financial challenges the organization faces and the decisions that prioritize user privacy and security. This approach goes beyond sharing code; it includes insight into how the organization operates and how it cares for its users.
Equity and Inclusion
Users want to engage with technology that promotes fairness and equal opportunities, so that everyone, regardless of their background or abilities, can access and benefit from it. This includes recognizing and addressing systemic obstacles that different communities may encounter. Actively involving perspectives of diverse communities in the design process is essential to create solutions that are truly inclusive and effective.
A practical design approach is Design from the Margins. This philosophy emphasizes that by prioritizing the needs of the most marginalized communities, we create solutions that ultimately benefit everyone. A case study illustrating this approach involved researchers enhancing safety features for apps like Signal and Grindr, whose users are often targeted in countries with restrictive laws against LGBTQ+ individuals and other marginalized groups. They created a feature that allows users to hide the app’s logo and replace it with a neutral icon, reducing the risk of users being identified by law enforcement and protecting their privacy and safety.
What values or best practices inspire you?
These are some of the principles we see and discuss in our day-to-day product development, but there is more! Do you miss any principles here? Or do you have best practices or inspiring examples to share? Please do! We are gathering our own platform promises, where you can learn more about how we strive to create a secure, open, and accessible collaboration environment. We invite you to share your thoughts, concerns, or ideas for improvement with us.
