Published: February 13, 2018 6:28 pm
Commercial facial-analysis artificial intelligence programmes tend to exhibit skin-type and gender biases, a study has found. In experiments, three commercial programmes' error rates in determining the gender of light-skinned men never exceeded 0.8 per cent.
For darker-skinned women, however, the error rates ballooned to more than 20 per cent in one case and more than 34 per cent in the other two. The findings raise questions about how today’s neural networks, which learn to perform computational tasks by looking for patterns in huge data sets, are trained and evaluated…
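The evaluation approach the study highlights can be sketched in a few lines: rather than reporting one aggregate accuracy figure, the classifier's error rate is computed separately for each demographic subgroup, which is what exposes gaps like those above. The function name and the toy data below are invented for illustration and are not from the study.

```python
# Illustrative sketch of disaggregated evaluation: per-subgroup error rates.
# The records here are hypothetical (group, true label, predicted label) tuples.

def error_rate_by_group(records):
    """Return {group: error rate}, computed independently for each subgroup."""
    totals, errors = {}, {}
    for group, truth, pred in records:
        totals[group] = totals.get(group, 0) + 1
        if truth != pred:
            errors[group] = errors.get(group, 0) + 1
    return {g: errors.get(g, 0) / n for g, n in totals.items()}

# Toy data: an overall error rate of 17.5% would hide the gap between groups.
records = (
    [("lighter_male", "M", "M")] * 99 + [("lighter_male", "M", "F")] * 1 +
    [("darker_female", "F", "F")] * 66 + [("darker_female", "F", "M")] * 34
)
rates = error_rate_by_group(records)
print(rates)  # {'lighter_male': 0.01, 'darker_female': 0.34}
```

Reporting the per-group breakdown alongside the aggregate number is what makes this kind of bias visible in the first place.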