Published in Gaming

DeepMind to learn how to play StarCraft

07 November 2016


What could possibly go wrong?

Some bright spark thought it would be a good idea to teach Google’s DeepMind AI how to play StarCraft.

OK, the game is 18 years old, but it is still played by those whose minds can cope with its minute decisions: everything from resource-spending sequences to the exact placement of buildings is studied, scrutinized, and adapted by players for new advantage, creating a thriving, adaptive theatre of war. Now Google has announced a collaboration with Blizzard Entertainment to open up StarCraft II to AI and machine learning researchers around the world.

StarCraft always shipped with the option to play against the computer, but that AI was basic, and handing the game to Google’s DeepMind puts it in a different league. For those who came in late, DeepMind learns how to play over time and has done rather well at games like Go, which are a little more static.

StarCraft has players managing resources, unit production, exploration, and research all at the same time, over maps with terrain that can change.

It is not done yet. For DeepMind to learn the game, it will get an API that lets it see the game’s pixels. And, like any new player, it will have tutorial sessions as it figures out the basics.

But once it masters that, it could move on to even more complex situations, like predicting whether Russia is going to launch a nuclear attack and launching a pre-emptive strike. Although that was just the film WarGames, and this is a computer which could actually do it. Why, oh why, don’t they just teach it noughts and crosses?

Last modified on 07 November 2016