As technology evolves, it always creates new moral and ethical problems for us.
That means we’ll always need people who’ve been trained to think empathetically, laterally, ethically and historically, to civilise our technology.
Take facial recognition systems.
Should we keep rolling out CCTV cameras with facial recognition technology across Australia’s cities and towns?
On the one hand, it helps to catch criminals and prevent crime; there’s no doubt about that. On the other, it represents a huge invasion of privacy and carries obvious human rights implications.
Where’s the data stored? For how long is it stored? Who has access to it?
What are the chances police will end up using the technology to surveil entire crowds at public protests and create a database of protestors?
And consider the technology behind the technology.
In the United States, MIT researcher Joy Buolamwini recently found that most facial recognition technology misidentifies darker-skinned faces and is far better at correctly identifying the faces of white men.
Such research has prompted Amazon and Microsoft to bar police departments in the US from using their facial recognition technology (for 12 months at least) until the industry is better regulated.
IBM has decided to pull out of the facial recognition business altogether.
Twenty years ago, we didn’t have to think about these issues. But now that the technology exists, we’re forced to.
We’re also beginning to understand how the algorithms that underpin technology like artificial intelligence and facial recognition can reflect the biases of the people who do the coding.
For example, if a tech team whose members share similar backgrounds creates a recruitment program designed to remove human bias from hiring, the result can be an algorithm that unwittingly disadvantages women.
Or a team of designers that has never had to think about living with a darker skin tone may develop a soap dispenser that doesn’t recognise black hands.
Or, if you train a self-driving car to identify pedestrians by showing it overwhelmingly images of people of one race, you might end up with a car that doesn’t recognise pedestrians with other skin tones.
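The training-data problem behind these examples can be sketched in a few lines of code. Everything here is hypothetical and deliberately simplified: the “features” are made-up numbers, not real face or pedestrian data, and the “model” is just a single learned cut-off. But it shows the mechanism: a model fit mostly on one group’s data can score perfectly on that group while failing the under-represented one.

```python
# Hypothetical illustration of training-data bias, not a real vision model.
# "Group A" dominates the training set; "group B" is under-represented and
# its two classes sit in a different feature range, so a single global
# decision threshold learned from the data fails group B.

# Synthetic 1-D data: (feature, true_label, group)
group_a = [(x / 50, 0, "A") for x in range(1, 21)] + \
          [(0.6 + x / 50, 1, "A") for x in range(1, 21)]
group_b = [(2.0, 0, "B"), (2.1, 0, "B"), (2.8, 1, "B"), (2.9, 1, "B")]
train = group_a + group_b          # group B is only ~9% of the data

def fit_threshold(data):
    """Pick the threshold with the fewest training errors (grid search)."""
    best_t, best_err = None, len(data) + 1
    for t in sorted(x for x, _, _ in data):
        err = sum((x > t) != bool(y) for x, y, _ in data)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def accuracy(data, t, group):
    pts = [(x, y) for x, y, g in data if g == group]
    return sum((x > t) == bool(y) for x, y in pts) / len(pts)

t = fit_threshold(train)
print("group A accuracy:", accuracy(train, t, "A"))  # perfect for the majority
print("group B accuracy:", accuracy(train, t, "B"))  # coin-flip for the minority
```

The overall error rate looks excellent, which is exactly why these failures go unnoticed until someone thinks to measure performance group by group, as Buolamwini’s research did.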
To solve these new problems, you don’t just need computer coders, mathematicians and engineers.
You need people who’ve studied ethics, history, politics, and philosophy, and who’ve trained in critical thinking, from across the human spectrum.
Study price increases could jeopardise future workforce
But if the Morrison Government’s proposed changes to Australia’s humanities degrees play out as critics fear, it may become harder to find such people for our future workforce.