Neural network takes a stroll, describes what it sees in real time

An American artist and coder decided to test an image-captioning system built on a neural network by having it annotate a walk through Amsterdam in real time. Naturally, he posted the results online.

Using a tweaked program built by researchers from Stanford and Google, Kyle McDonald set out for a meander, which he filmed live from his laptop's webcam. His computer analyzed the footage as it was captured, displaying its descriptions as a text scroll in the upper left corner of the screen.

McDonald used an open source program called NeuralTalk, which was introduced last year.
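The basic loop behind a setup like this is simple: grab a frame, run it through a captioning model, and append the result to an on-screen scroll. The sketch below illustrates that loop in Python; `caption_frame` is a hypothetical stand-in for a real model like NeuralTalk, which would actually run the pixels through a neural network.

```python
from collections import deque

def caption_frame(frame):
    # Hypothetical stand-in for a captioning model such as NeuralTalk.
    # A real system would feed the frame's pixels through a trained
    # network; here we map labeled "frames" to canned descriptions.
    canned = {
        "dock": "a boat is sitting on the water near a dock",
        "street": "a group of people walking down a street",
    }
    return canned.get(frame, "a picture of something")

def caption_stream(frames, history=5):
    # Keep a short scroll of the most recent captions, mimicking
    # the text overlay in the corner of the screen.
    scroll = deque(maxlen=history)
    for frame in frames:
        scroll.append(caption_frame(frame))
        yield list(scroll)

for scroll in caption_stream(["dock", "street"]):
    print(scroll[-1])
```

In a live version, the frame source would be the webcam and the model inference would dominate the loop time, which is why early real-time demos like this ran at only a few captions per second.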

While much of the captioning is accurate (a boat is sitting on the water near a dock), the system often makes mistakes, though it quickly corrects itself as the scene changes.

Neural networks are a young technology and their use is rapidly expanding, with many companies and institutions investing in them. Facebook recently released a prototype neural network meant to help blind people by annotating pictures.

“I consider the pixel data in images and video to be the dark matter of the Internet,” Fei-Fei Li, the lead researcher behind NeuralTalk, told The New York Times last year. “We are now starting to illuminate it.”

Neural networks aren’t considered true artificial intelligence, as the programs have no real understanding of what’s in the images they describe. As the supercut below illustrates, the technology is still very prone to oversights and inaccuracies.