“Ridiculously optimistic” machine learning algorithm is “completely bullshit,” says expert.
Thousands of innocent people in Pakistan may have been mislabeled as terrorists by that “scientifically unsound” algorithm, possibly resulting in their untimely demise.
This is REALLY scary: it means a computer program can read your phone metadata, apply “artificial intelligence” to it, label you a terrorist, and mark you to be killed.
Unfortunately, it also has a habit of falsely labeling innocent people.
So, you get killed by our government with no due process, but hey, that’s only bound to happen 0.18 percent of the time. With a population of 55M in Pakistan, that’s only around 99,000 people falsely targeted.
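The arithmetic above is easy to sanity-check. A minimal sketch, assuming the quoted 0.18 percent false-positive rate applies uniformly across the whole 55M population (the U.S. figure of roughly 320M is my own illustrative assumption, not from the text):

```python
def false_positives(population: int, fp_rate: float) -> int:
    """People falsely flagged, given a population and a false-positive rate."""
    return round(population * fp_rate)

# Pakistan: the 55M population and 0.18% rate quoted in the text.
print(false_positives(55_000_000, 0.0018))   # ~99,000, matching the text

# Same rate pointed at a U.S.-sized population (~320M, assumed).
print(false_positives(320_000_000, 0.0018))
```

Even a tiny error rate, multiplied by a large enough population, produces a very large absolute number of false positives.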
Aim the same system back at the United States, and well, that number goes up quite a bit. Hmm…
All in the name of “keeping us safe.” Wait a minute, at those numbers, safe from whom?