Important reminder for anyone working in software development, IT, data analysis, or any other field where you’re looking at data about humans or creating computer/AI tools:
If someone asks you to create a tool or report that groups people by category, you NEED to ask yourself: "What harm could be done if this information got out, or if my employer or funding source doesn't use it ethically?"
Before you even ask yourself “how can I protect this data,” you need to think “what could go wrong if I do collect this data, or create this tool?”
Think of tools designed to identify who's most at risk of contracting a contagious disease, or which neighborhoods need more policing, or facial recognition software to help catch criminals. These can be used to discriminate against people in marginalized groups, to increase police presence in poor Black neighborhoods, or to arrest people who happen to share facial features with someone in a database.
And if someone takes a tool you have made and wants to use it unethically? You don't let them. You destroy the data. You claim proprietary ownership of the software and you keep it.
It’s not just the people who unethically use data who are at fault. If you create a tool, you’re responsible for it.