Google and Blizzard are going to use the legendary video game StarCraft to teach artificial intelligence. The DeepMind division has been working with the game’s developer to create application programming interfaces (APIs), which A.I. developers can use to test their projects in a complex real-time strategy game. The news hit during BlizzCon last week, and the idea taps into a new way to teach A.I.: video games.
“DeepMind is on a scientific mission to push the boundaries of A.I., developing programs that can learn to solve any complex problem without needing to be told how,” said Oriol Vinyals, research scientist at DeepMind. “Games are the perfect environment in which to do this, allowing us to develop and test smarter, more flexible A.I. algorithms quickly and efficiently, and also providing instant feedback on how we’re doing through scores.”
The game is a good fit, as it pushes A.I. to its limits in a number of ways. Players need to plan several moves ahead, adapting their end goals as the game progresses, and even simple repetitive tasks vary slightly each time. For example, building a base for an attack requires the A.I. to interact with the interface in varying ways while also coordinating resources.
Video games are a useful way of training A.I. in simulated environments. In September, Intel revealed it was using Grand Theft Auto to train self-driving car A.I., as the city of Los Santos provided an accurate depiction of real world road layouts. In more traditional games, Google was able to create a system capable of beating a human at the game Go a full decade sooner than expected.
It’s unlikely that a computer will be able to beat a human at StarCraft II any time soon, but the newly developed “curriculums” will be useful for measuring how A.I. progresses in its training. Researchers will be able to use pre-provided tools to put their projects through their paces by setting challenges and benchmarks.
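The specifics of the API haven’t been published, but training setups like this generally follow a standard agent-environment loop: the agent observes the game state, picks an action, and receives a score as feedback. The minimal Python sketch below illustrates that loop with a toy stand-in environment; every name here is hypothetical and not part of Blizzard’s or DeepMind’s actual interface.

```python
# Hypothetical, simplified stand-in for a StarCraft II training environment.
# None of these names come from Blizzard's API; they only illustrate the
# general observe -> act -> reward loop such curriculums would expose.
class MockSC2Environment:
    def __init__(self, max_steps=10):
        self.max_steps = max_steps
        self.steps = 0

    def reset(self):
        """Start a new episode and return the initial observation."""
        self.steps = 0
        return {"minerals": 50, "units": 6}

    def step(self, action):
        """Apply an action; return (observation, reward, done)."""
        self.steps += 1
        reward = 1.0 if action == "gather" else 0.0  # toy reward signal
        done = self.steps >= self.max_steps
        observation = {"minerals": 50 + 8 * self.steps, "units": 6}
        return observation, reward, done


def run_episode(env):
    """Run one episode with a simple scripted policy; return total reward."""
    obs = env.reset()
    total = 0.0
    done = False
    while not done:
        # Scripted policy: gather resources until a stockpile is reached.
        action = "gather" if obs["minerals"] < 100 else "build"
        obs, reward, done = env.step(action)
        total += reward
    return total


score = run_episode(MockSC2Environment())
print(score)  # the cumulative score is the "instant feedback" Vinyals describes
```

In a real benchmark, the scripted policy would be replaced by a learning algorithm, and the per-episode score would serve as the progress measurement the curriculums are designed to provide.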
“We’re really excited to see where our collaboration with Blizzard will take us,” Vinyals said.