Google has created a machine that learned to master 49 Atari video games to the point where it was able to best professional human players.
According to BBC News, the artificial intelligence (AI) bot, developed at Google's DeepMind lab, was designed to emulate the human brain. Like a person, the AI learned its task and worked to get better at it.
Google bought DeepMind Technologies in 2014 for $400 million. Researchers published a study on their AI machine in the journal Nature.
"Up until now, self-learning systems have only been used for relatively simple problems," Demis Hassabis, DeepMind's vice president of engineering, told BBC News. "For the first time, we have used it in a perceptually rich environment to complete tasks that are very challenging to humans.
"The only information we gave the system was the raw pixels on the screen and the idea that it had to get a high score. And everything else it had to figure out by itself."
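The setup Hassabis describes, an agent that receives only an observation and a score signal and must work out a good strategy on its own, is the core idea of reinforcement learning. The following is a minimal illustrative sketch of that idea using plain tabular Q-learning on a made-up five-cell corridor; it is not DeepMind's actual deep Q-network, and the environment and all parameters here are invented for illustration.

```python
import random

N_STATES = 5          # cells 0..4; reaching cell 4 ends the episode
ACTIONS = [-1, +1]    # move left or right
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1

# Value estimates for every (state, action) pair, all starting at zero:
# the agent is told nothing about the task except the reward it receives.
q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    """Apply an action; reward 1.0 only when the goal cell is reached."""
    nxt = max(0, min(N_STATES - 1, state + action))
    reward = 1.0 if nxt == N_STATES - 1 else 0.0
    return nxt, reward, nxt == N_STATES - 1

random.seed(0)
for _ in range(500):                      # training episodes
    state, done = 0, False
    while not done:
        if random.random() < EPSILON:     # explore occasionally
            action = random.choice(ACTIONS)
        else:                             # otherwise act on current estimates
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        nxt, reward, done = step(state, action)
        best_next = max(q[(nxt, a)] for a in ACTIONS)
        # Q-learning update: nudge the estimate toward
        # (immediate reward + discounted best future value).
        q[(state, action)] += ALPHA * (reward + GAMMA * best_next - q[(state, action)])
        state = nxt

# After training, the greedy policy in every non-goal cell is "move right".
policy = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(N_STATES - 1)}
print(policy)
```

DeepMind's system applies the same trial-and-error principle at far greater scale, using a deep neural network instead of a lookup table so that raw screen pixels can serve as the state.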
The games the AI played included all-time classics such as Space Invaders, Pong, Pac-Man and Private Eye. In 29 games, the machine was as good as or better than its human competitor.
"On the face of it, it looks trivial in the sense that these are games from the 80s and you can write solutions to these games quite easily," Hassabis said. "What is not trivial is to have one single system that can learn from the pixels, as perceptual inputs, what to do.
"The same system can play 49 different games from the box without any pre-programming. You literally give it a new game, a new screen and it figures out after a few hours of game play what to do."