
Are robots out to get us?

Friday, 21 July, 2023

Originally published 2 April, 2019

Noel Sharkey is Emeritus Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield. He is Chair of the International Committee for Robot Arms Control and Co-director of the Foundation for Responsible Robotics. The head judge on the popular BBC series Robot Wars, he gave a Discovery Talk entitled “Algorithmic Injustice and Artificial Intelligence in Peace and War” in the UCD O’Brien Centre for Science on 2 April 2019.

Prof Noel Sharkey’s opening gambit set the tone for his unsettling talk. 

“I hope you’re all feeling very cheerful tonight; it’s my job to make sure you’re all miserable by the time I’m finished,” he quipped, before painting a disquieting picture about the impact of racist and sexist algorithms and the use of robotics in warfare. 

“Algorithms are neutral, right? They’re not human so not biased or prejudicial; algorithms are fair? In your dreams,” he remarked before explaining how algorithms use big data that has already been tainted by societal prejudices and inequalities. 

The 70-year-old computer scientist from Belfast recalled that when he first moved to London in the 1950s there were signs up saying “No Blacks, No Dogs, No Irish”. He told the audience that while our societal values may be ever-evolving, querying Google in 2018 with the word “men” still turned up associated words like boss, leader and director. Doing the same for “women” prompted helper, assistant, employee, coworker and subordinate.

“Historical data is locked in the internet. Bias is already in the data... This is where big data is coming from,” he explained. “Algorithms are being trained by this kind of data and then used to make decisions that impact on people’s lives.”
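
To make the mechanism concrete, here is a minimal sketch (not from the talk; the data and figures are invented) of how a model trained on biased historical decisions learns to reproduce that bias:

```python
# Illustrative sketch: a model trained on biased historical decisions
# learns to reproduce the bias. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

skill = rng.normal(size=n)           # what *should* drive the decision
group = rng.integers(0, 2, size=n)   # a protected attribute
# Past human decisions also penalised group == 1:
hired = (skill - 0.8 * group + rng.normal(scale=0.5, size=n)) > 0

# Train on the historical outcomes, protected attribute included.
X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, hired)

# Two equally skilled candidates now get very different predictions:
print(model.predict_proba([[0.0, 0], [0.0, 1]])[:, 1])
# prints something like [0.5, 0.07] -- the historical bias has been learned
```

Nothing in the code says “discriminate”; the model simply extracts the pattern already present in the historical outcomes.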

Companies use AI for filtering everything from mortgage applications and insurance quotes to job applicants. When submitting CVs to companies using AI to select candidates, Prof Sharkey, in typically disruptive style, advised jobseekers “to write in white writing all over it - write ‘Oxford’, ‘Cambridge’ - and you’re more likely to be selected.”
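
The trick works because a keyword-based screener reads the raw text of a CV, including text a human reviewer cannot see. A toy sketch, with a filter invented purely for illustration rather than any real vendor’s system:

```python
# Toy keyword screener (hypothetical): it scores the raw text of a CV,
# so words rendered in white are still "visible" to it.
def keyword_score(cv_text, keywords):
    text = cv_text.lower()
    return sum(kw.lower() in text for kw in keywords)

visible = "Jane Doe. BSc, five years of software experience."
hidden = " Oxford Cambridge"   # typed white-on-white, invisible in print

print(keyword_score(visible, ["Oxford", "Cambridge"]))           # 0
print(keyword_score(visible + hidden, ["Oxford", "Cambridge"]))  # 2
```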

AI biases are particularly insidious when it comes to race. Prof Sharkey played a video of “a racist soap dispenser” in a New York hotel which “refused” to pump soap into an African American man’s hand, while immediately obliging an outstretched Caucasian hand.

“This gives you an example of why diversity in the tech workplace is so important... The ethnicity and gender bias of the programmer is crucial.”

He remarked that research is urgently needed on how algorithms affect LGBTQ people and those with non-binary gender identities.

Face recognition software, he pointed out, is already flawed. 

“In the lab you will get 98% recognition provided you are a clean-shaven white male. If you are a woman they will often mistake you for a man. And as your shade of colour gets darker they get worse and worse and worse. And this is not good.”
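
Such disparities only surface when accuracy is measured per demographic subgroup rather than as a single headline figure. A hedged sketch, with invented numbers echoing the pattern in the quote above:

```python
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, predicted_id, true_id) tuples."""
    hits, totals = defaultdict(int), defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        hits[group] += predicted == actual
    return {g: hits[g] / totals[g] for g in totals}

# Hypothetical test results, echoing the pattern Prof Sharkey describes:
results = ([("white male", 1, 1)] * 98 + [("white male", 2, 1)] * 2
           + [("darker-skinned woman", 1, 1)] * 65
           + [("darker-skinned woman", 2, 1)] * 35)
print(accuracy_by_group(results))
# {'white male': 0.98, 'darker-skinned woman': 0.65}
```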

Prof Sharkey noted that just 20% of people working in Silicon Valley are female, a figure which plummets to 11% in the UK tech industry. This lack of representation of women and people of colour has affected the reliability of software. He illustrated the point with face recognition software, currently used by police forces in the UK and the US.

He said British NGO Big Brother Watch used a Freedom of Information request to find out how effective this software is when police train it on crowds.

“The best they did was 5% accuracy. 95% of people they’re pulling out of crowds are innocent.”

This figure dropped to 2% accuracy in the case of London’s Afro-Caribbean Notting Hill Carnival. 
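
Read as the precision of an alert, which is one assumed interpretation of the figure, the arithmetic is stark:

```python
# Hypothetical counts matching the quoted rate: of every 100 people
# the software flags in a crowd, only 5 are genuine matches.
flagged = 100
true_matches = 5
precision = true_matches / flagged
print(f"precision: {precision:.0%}, innocent among the flagged: {1 - precision:.0%}")
# precision: 5%, innocent among the flagged: 95%
```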

Prof Sharkey was clearly amused when American civil liberties groups turned the face recognition software on every member of Congress “and twenty-eight came out as [matching incorrectly with] dangerous criminals. Now the s*** has hit the fan and [the software is] being looked into in a lot more detail.”

But it was his involvement in the international Campaign to Stop Killer Robots that set the alarm bells ringing loudest, particularly as countries such as the US, the UK, Israel, Russia and Australia have refused to rule out developing such unsupervised killing machines.

“All weapons should be controlled by humans, not machines,” Prof Sharkey said, condemning the use or intended use of lethal autonomous weapons.