AI and the "Elephant in the Room"


Will future activists (and criminals) don masks to fool the coming Surveillance State?

---

I ran across an article this week on how a popular photo sharing site was secretly using the hundreds of thousands of photos posted to it to train its AIs in facial recognition, for sale to the military and law enforcement.

Millions of people uploaded photos to the Ever app. Then the company used them to develop facial recognition tools

Not sure why people are surprised. The holy grail of products now is artificial-intelligence software, whose uses are questionable but very profitable. The only way to train these algorithms is with massive databases of photos, often built with unwitting help from the very people who post them. Tag your friends in photos and the AI can learn to identify a small group of faces from many angles.
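For anyone curious what that actually looks like, here's a rough sketch using the open-source face_recognition library. The filenames and names are made up, and a real system would be far bigger, but the basic idea is the same: every tagged upload becomes a labeled face vector, and any new face can be compared against the whole pile.

```python
# Minimal sketch: tagged uploads become a face-recognition lookup table.
# Assumes the third-party "face_recognition" library; filenames/names are hypothetical.
import face_recognition

# Each uploaded photo arrives with the name the uploader tagged it with.
tagged_photos = [
    ("alice_beach.jpg", "Alice"),
    ("alice_party.jpg", "Alice"),
    ("bob_hiking.jpg", "Bob"),
]

known_encodings, known_names = [], []
for path, name in tagged_photos:
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)  # one 128-d vector per face found
    if encodings:
        known_encodings.append(encodings[0])
        known_names.append(name)

# A new, untagged image (say, a frame from a camera) can now be matched
# against every tag anyone ever uploaded.
unknown = face_recognition.face_encodings(
    face_recognition.load_image_file("camera_frame.jpg"))
if unknown:
    matches = face_recognition.compare_faces(known_encodings, unknown[0])
    hits = [name for name, match in zip(known_names, matches) if match]
    print("Possible identities:", hits)
```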

There is a big problem with such computer-based algorithms, though: they don't handle odd things well.

Machine Learning Confronts the Elephant in the Room - A visual prank exposes an Achilles’ heel of computer vision systems: Unlike humans, they can’t do a double take.

"The result takes place in the field of computer vision, where artificial intelligence systems attempt to detect and categorize objects. They might try to find all the pedestrians in a street scene, or just distinguish a bird from a bicycle (which is a notoriously difficult task). The stakes are high: As computers take over critical tasks like automated surveillance and autonomous driving, we’ll want their visual processing to be at least as good as the human eyes they’re replacing.

It won’t be easy. The new work accentuates the sophistication of human vision — and the challenge of building systems that mimic it. In the study, the researchers presented a computer vision system with a living room scene. The system processed it well. It correctly identified a chair, a person, books on a shelf. Then the researchers introduced an anomalous object into the scene — an image of an elephant. The elephant’s mere presence caused the system to forget itself: Suddenly it started calling a chair a couch and the elephant a chair, while turning completely blind to other objects it had previously seen.

“There are all sorts of weird things happening that show how brittle current object detection systems are,” said Amir Rosenfeld, a researcher at York University in Toronto and co-author of the study along with his York colleague John Tsotsos and Richard Zemel of the University of Toronto."
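If you want to poke at this yourself, here's a rough sketch of the kind of test the researchers describe, using an off-the-shelf detector from torchvision rather than their exact setup. The image files and the 0.5 confidence cutoff are just placeholders: run the detector on a scene, paste in an elephant, run it again, and compare what it finds (and what it suddenly misses).

```python
# Rough sketch of the "elephant in the room" test with a stock detector.
# Assumes torchvision's pretrained Faster R-CNN; image paths are hypothetical.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

def detect(path, model, threshold=0.5):
    image = to_tensor(Image.open(path).convert("RGB"))
    with torch.no_grad():
        out = model([image])[0]
    # Keep only confident detections; 'labels' are COCO class indices.
    keep = out["scores"] > threshold
    return list(zip(out["labels"][keep].tolist(), out["scores"][keep].tolist()))

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True).eval()

# Same scene, with and without an anomalous object pasted in.
print(detect("living_room.jpg", model))
print(detect("living_room_with_elephant.jpg", model))
```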

---

This makes me wonder if future activists (and criminals) fighting a repressive government won't don costumes to fool the expanded surveillance system of cameras and drones. A man with a gun might trigger alarms; an elephant with a gun might take down the system.


(From "Suicide Squad")

Might make a good addition to a story.