Last year the Alphabet-owned AI company DeepMind revealed to the world a system called AlphaZero that taught itself to outwit DeepMind's own specialised Go-playing system. While the specialised Go software had been trained by researchers over months, AlphaZero, given only the rules of the game, trained itself from scratch through self-play and went on to beat it within three days.
Another breakthrough means it can now trounce anyone at other games in the same class as Go, including chess and shogi (also known as Japanese chess). Researchers predict it won't be long before AI can learn to expertly play games that aren't so strictly rule-based, including video games like Minecraft. In fact, the research company OpenAI has already created an algorithm that beat professional players at an albeit restricted version of the online multiplayer battle strategy game Dota 2 (Defence of the Ancients).
"Those multiplayer games are harder than Go, but not that much harder. A group has already beaten the best players at Dota 2, though it was a restricted version of the game; Starcraft may be a little harder. I think both games are within two to three years of solution," Murray Campbell told IEEE Spectrum.
Campbell was on the research team that created IBM's Deep Blue, the system that went on to beat world chess champion Garry Kasparov.