Recently, a new type of robot was created to analyze a person’s face and then sort them into blocks based on an assumed career. When it came to the faces of Black men, the robot consistently sorted them into the “criminal” block.
For a while, there were more theories than evidence that robots can become racist and sexist. That changed when a robot repeatedly sorted women and people of color into boxes labeled “homemaker” and “janitor”. Since then, many similar incidents have led people to believe that robots can indeed become racist and sexist.
To find out why, researchers at Johns Hopkins University and the Georgia Institute of Technology teamed up to study bias in AI. It turns out that racist and sexist stereotypes were built into the artificial intelligence systems that control how the robots behave.
Many companies have been building robots to take over tasks that humans do. With the pandemic worker shortage, some companies have had little choice but to enlist robots to do the work their employees once did, lest they go bankrupt. But many tech researchers warn that if robots are adopted this quickly, the biases built into them could cause serious problems in the future.
However, this isn’t the first time robots have made racist or sexist errors. Some systems that use algorithms to predict crime have wrongly targeted Black and Latino people for crimes they did not commit!
Although such behavior is not yet common and many robots remain neutral, the researchers posed questions to robots to see how they would respond. When asked to pick out the “homemaker” block, the robots selected Black and Latina women more often than white men.
In another case, Andrew Hundt of the Georgia Institute of Technology led a study to see whether robots could interact with the real world by asking them to fetch objects. When a child asked the robot for a “beautiful doll”, the robot fetched her a white doll.
When they saw the results, even Hundt had to admit, “That’s really problematic.”
As technology becomes more and more advanced, it risks absorbing the same sexism and racism. The best fix is to find and remove the biased parts of these systems’ training data and code before the problem gets out of control. In the end, seeing robots act this way reflects the racism and sexism in our own society.