
When you start using a new piece of technology, how much time do you spend thinking about the work that went into making that technology safe, secure, and ethical?
When an effective product that solves a problem in our lives is handed to us, we generally assume it was developed fairly. Often, the ethics of the technology never even cross our minds.
That's because, as members of the public, we like to believe that ethics are a given in technology; we don't think about it because we assume that someone already has. Ignorance is bliss, as they say.
While most tech developers are not intentionally creating biased, unsafe, or insecure algorithms, there is a huge potential for natural human bias or error to unexpectedly impact the development of new technology in powerful ways, ways that we'd never realize unless someone were to point them out.
Luckily, those very people exist.
Gemma Galdon Clavell is an algorithm auditor for Eticas, a company that helps create responsible data solutions. Her job is to look into the data systems that make decisions about our lives and ensure that they work as they are supposed to, meaning that they do not reproduce bias or inaccuracies but instead improve systems and outcomes in accountable ways.
She recently spoke at one of allWomen's Friday Night talks to present our community with a comprehensive view of why ethical practices in data are so important.
If you weren't able to attend the talk, here's an overview of all of the wisdom that she shared on ethics in data.
How algorithms become active agents in discrimination
When Infojobs first created the algorithm that helps potential candidates find open job positions, the team wasn't aware of the human bias that had automatically gone into the technology.
At least not until a user, sick of seeing secretary and administrative positions, changed their gender from female to male within the system. All of a sudden, their list of potential job opportunities included many more positions with higher responsibilities and pay.
Unfortunately, this is a symptom of our biased world.
This wasn't the Infojobs team intentionally or maliciously providing women with fewer high-power opportunities. Rather, it was the algorithm itself that exposed the unconscious human bias that lives within us.
Because algorithms learn from reality, they simply reflect back what they see. And something that we certainly know here at allWomen is that our reality has discrimination based on gender, sexuality, and race sewn directly into its seams.
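The gender swap that the Infojobs user performed by hand is essentially what auditors call a counterfactual or "flip" test: keep a profile identical, change only the protected attribute, and compare the system's output. As a rough illustration only, here is a minimal Python sketch of that idea; the `model` object, its `recommend()` method, and the field names are hypothetical stand-ins, not part of Infojobs' actual system.

```python
# Minimal sketch of a counterfactual "flip" test for a recommendation model.
# The model and its recommend() method are hypothetical, for illustration only.
import pandas as pd


def flip_test(model, profile: dict, attribute: str, values: list) -> dict:
    """Return the model's recommendations for each value of one protected attribute."""
    results = {}
    for value in values:
        counterfactual = {**profile, attribute: value}  # same person, one field changed
        results[value] = model.recommend(pd.DataFrame([counterfactual]))
    return results


# Hypothetical usage: large differences between the two result lists suggest the
# system is treating gender as a signal and needs review.
# recommendations = flip_test(job_model, candidate_profile, "gender", ["female", "male"])
```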
The same has been demonstrated by medical algorithms meant to predict which patients would benefit from extra medical care. Because the black community has historically lacked access to effective health care, a gap rooted in unconscious bias, the technology was found to drastically underestimate the health needs of the sickest black patients.
In both of these examples, the technology was an active player in the discrimination of minority groups based on the prejudices that live within our culture.
If the powerful algorithms that we develop are going to continue reflecting back our reality, then we need measures in place to ensure that the reality they're reflecting is fair.
Why human bias needs fair technology
In her talk, Gemma Galdon Clavell told us a story about when she joined a radio program in Barcelona. The show always featured a group of people speaking, and she was usually the only woman present, or one of very few.
She felt that she was speaking a reasonable amount within the group, sometimes even wondering if she should speak more. However, when comments started to come in from listeners saying that she was speaking way too much, she started to rethink her contributions.
It wasn't until later, when the radio show used technology to measure the amount of time that each guest spoke, that Clavell discovered she was nowhere near being the guest who spoke the most.
In fact, it was just that pesky human bias shaping the listeners' perception. And, unlike the previous examples that we've covered, in this case the technology actually helped prove the point.
Human bias needs fair technology in order to create a more just future: both to audit the way that we as humans interact, as in the case of the radio show, and to make fairer algorithms that account for the inequalities that live within us, as in the Infojobs example.
As Clavell put it, "Nobody is out there developing evil technologies. Everyone wants to do a good job. But intentions are not enough. Data systems are impacting people's lives in very meaningful ways but also creating risks that need to be addressed and mitigated."
Quite often, developers have no structured way to consider the possible threats that their technology creates. For this reason, we need teams that are dedicated not only to anticipating possible threats but also to auditing for inequalities.
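What does "auditing for inequalities" look like in practice? One common starting point is simply comparing how often an automated decision favours each group. As a rough sketch under assumed names, here is what that check might look like in Python; the column names, the made-up data, and the 0.8 "four-fifths" threshold are illustrative assumptions, not a description of any specific system Clavell audits.

```python
# Minimal sketch of a selection-rate (demographic parity) audit.
# Data, column names, and threshold are illustrative assumptions.
import pandas as pd


def disparate_impact(df: pd.DataFrame, group_col: str, decision_col: str) -> pd.Series:
    """Selection rate per group, divided by the best-treated group's rate."""
    rates = df.groupby(group_col)[decision_col].mean()
    return rates / rates.max()


# Made-up example data: 1 = shortlisted for interview, 0 = rejected.
decisions = pd.DataFrame({
    "gender": ["female", "female", "female", "male", "male", "male"],
    "shortlisted": [0, 1, 0, 1, 1, 0],
})

print(disparate_impact(decisions, "gender", "shortlisted"))
# Ratios well below ~0.8 (the "four-fifths" rule of thumb) are a red flag worth investigating.
```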
She continued, "Just this week Zoom's CTO apologized for not having taken privacy, misinformation, and misuse seriously when developing the service. This has meant pedophiles accessing classrooms, for instance. We can't afford to only act afterwards. We need better technologies that put their good intentions into transparent practices."
We can't limit everything, but we should limit some things
According to Clavell, before building technology, developers should take a step back and ask: Do we want to develop this?
Of course, we can create incredible technologies that solve the world's problems, but we need to consider the long-term impact of that innovation. This is where Clavell and her team of researchers come in.
Her team is there to make sure that technology supports a better future, one that doesn't regurgitate human bias or inaccuracies. We can't limit everything, or there would be no innovation, but we certainly can limit the things that discriminate, oppress, or make the world less safe for any group of people.
As tech-makers ourselves, the allWomen team pledges to follow through on this mission, making more ethical and equitable products with our Data Science, UX Design, and Product Management students.
If you'd like to do the same, you too can incorporate Clavell's better data principles into your work:
- Incorporating diverse teams that understand how technology impacts existing lives, processes, and organizations, and that can boost the positive impacts while addressing and mitigating the negative ones
- Committing to holding ourselves accountable and understanding our legal obligations
- Respecting our users and building trustworthy relationships through responsible data strategies
With these points in mind, we can make better tech: the kind that supports our mission of making the world a more balanced place.
As Gemma puts it: "Data ethics and algorithmic accountability do not limit innovation, they allow it to thrive in ways that are impactful but also respectful of societal expectations. Non-ethical tech is just lazy tech."
If you found this interesting, join us at future allWomen events for more talks just like this one.