Guest post: RIRO

Originally a comment by latsot on To be seen.

It’s not difficult to train facial recognition to work with black people. If you trained the machine learning systems on plenty of both black and white faces, they would work fine with both.

However, most facial recognition software has historically been trained mostly on white faces, so it has trouble recognising darker-skinned ones. It presumably didn’t occur to the people who trained the systems that some faces are not white.
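
A minimal sketch makes the point concrete. The numbers below are invented for illustration, not data from any real system: a model evaluated on a sample skewed toward one group can post a respectable overall accuracy while failing badly on the under-represented group, which is exactly why the imbalance goes unnoticed if nobody breaks the results down.

```python
# Hypothetical validation results as (group, correctly_recognised) pairs.
# 1000 white faces, 100 black faces: a deliberately skewed sample.
results = (
    [("white", True)] * 950 + [("white", False)] * 50
    + [("black", True)] * 30 + [("black", False)] * 70
)

# The headline number hides the problem.
overall = sum(ok for _, ok in results) / len(results)
print(f"overall accuracy: {overall:.1%}")  # 89.1%, looks fine at a glance

# Per-group accuracy exposes what the overall figure conceals.
by_group = {}
for group, ok in results:
    correct, seen = by_group.get(group, (0, 0))
    by_group[group] = (correct + ok, seen + 1)

for group, (correct, seen) in by_group.items():
    print(f"{group}: {correct / seen:.1%} accuracy on {seen} faces")
    # white: 95.0% on 1000 faces; black: 30.0% on 100 faces
```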

There are lots of other examples of software that has taken on the racism of its trainers. For example, many police forces in the US (and some in the UK) use software to predict where crime is likely to happen, for the purposes of resource planning and management. It’s trained on historical data, and since the police’s arrest records contain a racial bias toward arresting people who aren’t white, the software predicts that future crimes will occur in areas where the population is mostly non-white. Sending more officers to those areas then generates more recorded arrests there, which feed back into the training data and reinforce the original bias. And since the police are institutionally racist, they see this as ‘working’.
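
A toy simulation shows the feedback loop at work. Everything here is made up for illustration (two districts with identical real crime rates, a deliberately skewed starting record, a greedy hotspot policy); no real policing product is being modelled, only the dynamic:

```python
import random

random.seed(0)
TRUE_CRIME_RATE = {"A": 0.5, "B": 0.5}  # the districts are actually identical
arrests = {"A": 60, "B": 40}            # historical record, biased against A

for year in range(1, 6):
    # Greedy "predictive" allocation: send all 100 patrols to the hotspot,
    # i.e. whichever district has the most recorded arrests so far.
    hotspot = max(arrests, key=arrests.get)
    new = sum(random.random() < TRUE_CRIME_RATE[hotspot] for _ in range(100))
    arrests[hotspot] += new  # crime is only recorded where police look
    share = arrests["A"] / sum(arrests.values())
    print(f"year {year}: district A holds {share:.0%} of recorded arrests")
```

District A’s share of recorded arrests climbs year after year even though the two districts are identical by construction; district B’s crime never enters the record because nobody is looking for it, so the predictions keep ‘confirming’ themselves.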

It could be classified as a GIGO (garbage in, garbage out) problem, yes, but it’s better classified as Racism In Racism Out.
