Was this AI trained on an unbalanced dataset (only black folks?)
It’s probably the opposite. The AI was likely trained on a dataset of mostly white people, and is thus better at distinguishing between white faces.
It’s a known problem in ML, especially for companies based in the US, where it is simply easier to collect a large number of images of white people than of people with other skin colors.
It’s really not dissimilar to how people work, either: humans are generally better at distinguishing between two people of races they grew up around, and make more mistakes when trying to identify people of races they’re less familiar with.
The problem is when the police use these tools as an authoritative matching algorithm.
Here in Canada at least, I was taught in elementary school to capitalize all important words (i.e. other than and, or, at, in, etc.) in a title. Is it taught differently in other places?
Hunting and killing methods have been improved to ensure as little harm to the whales as possible.
Apart from the … hunting and the killing.
Using mugshots to train AI without consent feels illegal. Plus, it wouldn’t even make a very good training set, as the AI would only learn to identify faces in perfectly straight-on images shot in ideal lighting conditions.