Possibly the worst Machine Learning story I’ve read in the press

I cannot believe the column inches this article got. The title is:

Even algorithms are biased against black men

whereas it should have read:

Poorly designed algorithm incorrectly predicts bias, but rather than getting a slap on the wrist and bringing in some machine learning experts to do the job properly, we’ll blame the problem on the software and create confusion and mass hysteria by publishing it in the national press

Just because the authors of this algorithm were from ProPublica does not make the algorithm correct. The only sentence with a modicum of merit in the entire piece is the first sentence of the last paragraph, which reads:

The big puzzle is how the bias creeps into the algorithm.

However, it’s not a big puzzle. It’s simply a bad machine learning algorithm.

We might be able to understand how if we could examine it. But most of these algorithms are proprietary and secret, so they are effectively “black boxes” – virtual machines whose workings are opaque.

And this is just scare-mongering. The solution is called validation data: score the model on labelled cases it has never seen and measure its error rates directly. You do not need to look inside the box to do that.
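To make that concrete, here is a minimal sketch of what black-box validation looks like: score a model on a labelled holdout set and compare false positive rates across groups. Everything in it is synthetic and hypothetical (the data, the `black_box_score` stand-in, the group labels); it is not the actual model or ProPublica’s analysis, just an illustration that measuring bias needs only inputs, outputs, and true outcomes.

```python
# A minimal sketch of what "validation data" buys you: even with a
# proprietary black-box model, you can measure its error rates on
# labelled data it has never seen. All data and names below are
# synthetic and hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Synthetic holdout set: a group label plus the true outcome.
n = 10_000
holdout = pd.DataFrame({
    "group": rng.choice(["A", "B"], size=n),
    "reoffended": rng.integers(0, 2, size=n),
})

def black_box_score(df: pd.DataFrame) -> np.ndarray:
    """Stand-in for an opaque vendor risk score (0 = low, 1 = high)."""
    return rng.integers(0, 2, size=len(df))

# Query the black box on the holdout set only -- no internals needed.
holdout["pred"] = black_box_score(holdout)

# False positive rate per group: among people who did NOT reoffend,
# how often does the model flag them as high risk?
for group, rows in holdout.groupby("group"):
    did_not = rows[rows["reoffended"] == 0]
    fpr = (did_not["pred"] == 1).mean()
    print(f"group {group}: false positive rate = {fpr:.3f}")
```

On real data, a persistently higher false positive rate for one group on held-out cases is exactly the kind of evidence proper validation would surface, with no access to the model’s internals required.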

For those wanting to read this piece of claptrap, go here: https://www.theguardian.com/commentisfree/2016/jun/26/algorithms-racial-bias-offenders-florida

John Naughton and the Guardian should be ashamed of themselves. This type of rubbish belongs in The Sun.

 
