"… but the data used to train the algorithm comes from the outcomes of the biased police activity. If the police are stop-and-frisking brown people, then all the weapons and drugs they find will come from brown people. Feed that to an algorithm and ask it where the police should concentrate their energies, and it will dispatch those cops to the same neighborhoods where they've always focused their energy, but this time with a computer-generated racist facewash that lets them argue that they're free from bias."Source: "Racist algorithms: how Big Data makes bias seem objective" Boing Boing
3 December 2015
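The feedback loop the quote describes can be made concrete with a small simulation. The sketch below is not from the article; the neighbourhood labels, stop counts, and hit rates are all invented for illustration. It assumes two areas with identical true rates of contraband per stop, seeds the history with biased stop counts, and then lets a naive "predictive" allocator send the next round of stops wherever contraband was found before. Because finds can only occur where stops occur, the skew in the historical data reproduces itself generation after generation.

```python
import random

random.seed(0)

# Assumption for illustration: both neighbourhoods have the SAME
# true rate of contraband per stop; only police attention differs.
TRUE_HIT_RATE = {"A": 0.05, "B": 0.05}

# Historical bias: most past stops happened in neighbourhood A,
# so most recorded finds come from A.
stops = {"A": 900, "B": 100}
finds = {n: sum(random.random() < TRUE_HIT_RATE[n] for _ in range(s))
         for n, s in stops.items()}

for generation in range(5):
    # The "algorithm": allocate the next 1000 stops in proportion
    # to where contraband was found before (raw counts, not rates).
    total_finds = sum(finds.values()) or 1
    allocation = {n: round(1000 * finds[n] / total_finds) for n in finds}

    # New finds follow the allocation, because you only find
    # contraband where you actually look.
    new_finds = {n: sum(random.random() < TRUE_HIT_RATE[n] for _ in range(a))
                 for n, a in allocation.items()}
    finds = {n: finds[n] + new_finds[n] for n in finds}

    print(f"gen {generation}: stops {allocation}, cumulative finds {finds}")
```

Run it and the allocation stays locked around 9-to-1 in favour of neighbourhood A, even though the underlying rates are equal: the model's output looks like an objective prediction but is just the input bias echoed back.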
Comments by Josh Kinal