In some ways, it’s an odd question. Many of us would think of a robot as the opposite of bias. But the reality is that, because everything the robot is and does is a consequence of human actions, a robot could in fact be very biased. How will we know?
Some AI developers are attempting to deal with this question:
Last summer, hundreds of A.I. and robotics researchers signed statements committing themselves to changing the way their fields work. One statement, from the organization Black in Computing, sounded an alarm that “the technologies we help create to benefit society are also disrupting Black communities through the proliferation of racial profiling.” Another manifesto, “No Justice, No Robots,” commits its signers to refusing to work with or for law enforcement agencies. …
There are A.I. systems enabling self-driving cars to detect pedestrians — last year Benjamin Wilson of Georgia Tech and his colleagues found that eight such systems were worse at recognizing people with darker skin tones than paler ones. Joy Buolamwini, the founder of the Algorithmic Justice League and a graduate researcher at the M.I.T. Media Lab, has encountered interactive robots at two different laboratories that failed to detect her. (For her work with such a robot at M.I.T., she wore a white mask in order to be seen.)
David Berreby, "Can We Make Our Robots Less Biased Than We Are?" at New York Times
It would perhaps be better if such detection systems picked up human body heat, for example, rather than physical appearance. Endothermic metabolism is harder to fool than a fleeting image on a screen.
In any case, the robot has the biases of its creators, whatever those biases are, as Buolamwini discovered.
Questions dog the future of police robots. Robots will have all the human judgment flaws but none of the capacity to change. Here's an issue to consider: in some places in the world, everything is policed except crime. Discussions of robotics in policing must take that into account.
How toxic bias infiltrates computer code. Jonathan Bartlett takes a look at the dark underbelly of modern algorithms. A new film makes the point that algorithms cannot achieve justice; they can only automate bias.
Robot police dogs spark civil rights questions. Boston Dynamics says that its lease agreements require that the robots not be used to "physically harm or intimidate people."