
Our data ethics: Measure what matters and don't be creepy


At Multitudes we know that having good intentions doesn't guarantee positive impact. (If only that were the case!)

We also recognize the particular risks of working with data about people. Many articles and books have documented how algorithms and machine learning have perpetuated systems of oppression, including racism, sexism, and poverty. (If you're curious to learn more, see our resources list at the bottom.)

For that reason, we knew that data ethics was especially important to consider from the very beginning. Our team has been developing data principles to guide us in acting ethically with data and to keep ourselves accountable for our actions.

We’re now ready to start sharing these principles publicly. I shared them for the first time during a mini-keynote at the Mining Software Repositories Conference, and we’re now sharing them here. Read on for a detailed description, and if you have feedback for us, please get in touch.

You can check out the 8-minute talk here, or view the PDF of the slides.

*Since we first shared this video/post, we’ve updated one of the principles. You’ll notice in the post below that “Collective Benefit” has been replaced with “Transparency”. We explain why we made this change here.

Multitudes Data Principles:

1. Autonomy: We give people control over how their data is used.

This principle is about giving people information about what data we collect and how we use it, and about making sure we have their consent to use it. For example, we get consent from the individuals on a team before we start collecting their data, and people can ask us to delete their data at any time.
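To make this concrete, here's a minimal sketch of how a consent gate might work, assuming a simple in-memory store. The names (ConsentLedger, grant, revoke_and_delete, collect) are hypothetical illustrations, not our actual implementation: collection fails closed without an opt-in, and revoking consent also deletes what was collected.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentLedger:
    opted_in: set = field(default_factory=set)
    records: dict = field(default_factory=dict)  # member_id -> list of collected data points

    def grant(self, member_id: str) -> None:
        """Record an explicit opt-in from a team member."""
        self.opted_in.add(member_id)

    def revoke_and_delete(self, member_id: str) -> None:
        """Revoking consent doubles as a deletion request."""
        self.opted_in.discard(member_id)
        self.records.pop(member_id, None)

    def collect(self, member_id: str, data_point: str) -> bool:
        """Store a data point only if the member has opted in."""
        if member_id not in self.opted_in:
            return False  # no consent, no collection -- fail closed
        self.records.setdefault(member_id, []).append(data_point)
        return True

ledger = ConsentLedger()
ledger.grant("alice")
assert ledger.collect("alice", "pr_review_turnaround: 4h")
assert not ledger.collect("bob", "pr_review_turnaround: 2h")  # never opted in
ledger.revoke_and_delete("alice")
assert "alice" not in ledger.records  # her data is gone, too
```

The design choice worth noting is that deletion is bundled with revocation, so a person never has to make two separate requests to fully withdraw.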

2. Reciprocity: We only collect people’s data if we can give them value from it.

We won't collect data from people unless we believe we can create positive impact for them with it. In practice, this means we won't take data from individual team members without giving them insights that could help them in return.

We also want to have a positive impact in how we go about collecting, processing, and storing the data – we care about the whole process, not just the end result. On a direct level, we need to use data and machine learning practices that reduce harm and mitigate things like algorithmic bias (read more about our approach to that here: Mitigating Algorithmic Bias at Multitudes). On an even broader level, we also think about the energy we consume for cloud hosting of our data and pipelines; we’ve chosen a data center powered fully by renewable energy in order to reduce our carbon footprint.

3. Transparency: We are clear about how we use people’s data.

Note: This principle was added after we originally shared this post. You can read about why we changed our data ethics principles here.

We commit to being open and honest about what data we collect and how we use it. In practice, this means that if your manager can see data about you, you should see that data about yourself too. This transparency means that team members are better able to add context to their data (so others don’t make reductive decisions based on it). It also makes it easier for us to follow our accountability principle, since team members can see the analysis we provide and give feedback to us.
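As an illustration of that symmetry, here's a toy access rule, hypothetical rather than our production model: a viewer can see a subject's data only if they are the subject or the subject's manager, which guarantees that anything visible to a manager is also visible to the person it describes.

```python
def can_view(viewer: str, subject: str, managers: dict) -> bool:
    """managers maps each member's id to their manager's id."""
    if viewer == subject:
        return True  # you can always see your own data
    return managers.get(subject) == viewer  # managers see their reports' data

managers = {"alice": "carol", "bob": "carol"}

# Carol (the manager) can see Alice's data -- and so can Alice herself,
# which is the symmetry the principle asks for.
assert can_view("carol", "alice", managers)
assert can_view("alice", "alice", managers)
assert not can_view("bob", "alice", managers)
```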

4. Context: We support people in putting data into perspective.

Data always comes within a certain context. That's why we shed light on the context around the data; for example, when we share insights in our product, we also provide conversation starters, so that team members remember to include the people these data-derived insights are about.

We also care about the context of the bigger human systems that we live in since our product exists in those systems too. These systems shape what data is available, the questions we ask of the data, and how we interpret it. By and large, the systems we all operate in are still inequitable and continue to produce outcomes where people are marginalized based on the color of their skin, their gender, their disability status, and more. To recognize this context, our team is doing a series of trainings on Te Tiriti o Waitangi (the Treaty of Waitangi) to learn about the history of colonization in New Zealand, where most of us live. We’re committed to similar learning about the systems in the US, UK, Australia, and other countries where we have operations and customers.

5. Accountability: We encourage people to tell us when we make mistakes.

This principle is foundational for all of the others. We’ve put these principles into writing so that we as a team can hold each other accountable, and we’re sharing these principles publicly because we want others to let us know how we can improve. If you have any feedback or thoughts about our data principles so far, you can let us know at hello@multitudes.co.

We see data ethics as a key part of building our product. We're still at the beginning of this journey, but we're committed to growing and evolving this part of our work as we go.

Resources to learn more

Articles & Research
Books

Contributor
Lauren Peate
Founder, CEO