A new kind of learning artificial intelligence has emerged in Minecraft: AIRIS (Autonomous Intelligent Reinforcement Inferred Symbolism). This AI learns to play through practice, starting from essentially "nothing" in Minecraft and teaching itself using only the game's feedback loop.

Early versions of AIRIS were tested in simple 2D grid-world puzzle environments. To push the system further, however, its developers needed a more complex and open 3D environment. Minecraft fits this description perfectly: it is hugely popular, and it provides all the technical hooks needed to integrate an AI agent.


AIRIS operates on two types of input from its environment, along with a list of actions it can perform. The first input is a 5x5x5 grid of block names surrounding the agent; this is how the agent "sees" the world. The second is the agent's current coordinates in the world, which lets us specify a location we want the agent to reach.
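To make that interface concrete, here is a minimal sketch in Python of what such an observation might look like. The `Observation` and `Action` names, their fields, and the specific action list are illustrative assumptions, not AIRIS's published API.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Action(Enum):
    # A hypothetical action list; AIRIS's real action set may differ.
    MOVE_FORWARD = auto()
    TURN_LEFT = auto()
    TURN_RIGHT = auto()
    JUMP = auto()
    MINE = auto()

@dataclass
class Observation:
    # 5x5x5 grid of block names centered on the agent, indexed [x][y][z].
    blocks: list[list[list[str]]]   # e.g. "air", "dirt", "oak_log"
    # The agent's absolute (x, y, z) coordinates in the world.
    position: tuple[int, int, int]

def toy_observation() -> Observation:
    # A toy view: the agent stands in open air above a flat dirt floor.
    grid = [[["air"] * 5 for _ in range(5)] for _ in range(5)]
    for x in range(5):
        for z in range(5):
            grid[x][0][z] = "dirt"  # bottom layer of the 5x5x5 view
    return Observation(blocks=grid, position=(120, 64, -35))
```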

AIRIS begins in a "free roam" mode and explores the surrounding world. It builds an internal map of the places it has visited, which can be viewed with a companion visualization tool, and it learns to navigate the world, adapting when it encounters obstacles such as trees, mountains, and caves.
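Here is a minimal sketch of how an internal map like that could record visited places and flag unexplored frontier cells. It illustrates the general idea only; the class and method names are hypothetical, and this is not AIRIS's actual algorithm.

```python
Coord = tuple[int, int, int]

class ExplorerMap:
    """Hypothetical internal map of everywhere the agent has been and seen."""

    def __init__(self) -> None:
        self.visited: set[Coord] = set()          # coordinates stood on
        self.known_blocks: dict[Coord, str] = {}  # observed block names

    def record(self, position: Coord, blocks: list[list[list[str]]]) -> None:
        """Record the agent's position and its 5x5x5 view of the world."""
        self.visited.add(position)
        px, py, pz = position
        for x in range(5):
            for y in range(5):
                for z in range(5):
                    # Translate grid-local indices to world coordinates
                    # (the agent sits at the grid's center, index 2).
                    world = (px + x - 2, py + y - 2, pz + z - 2)
                    self.known_blocks[world] = blocks[x][y][z]

    def frontier(self) -> set[Coord]:
        """Open cells seen but not yet visited: candidate places to explore."""
        return {
            c for c, name in self.known_blocks.items()
            if name == "air" and c not in self.visited
        }
```

A free-roam loop could then repeatedly call `record()` on each new observation and steer toward whatever `frontier()` returns, which is one simple way "explore what you haven't seen yet" behavior can emerge from a map like this.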

Practical use cases for AIRIS could include automated error and stress testing of software. For instance, if AIRIS were let loose in Fallout 4, it could generate error reports whenever its interactions with NPCs or enemies misbehave. Quality assurance testers would still need to review what the AI records, but it would speed up one of the most tedious and frustrating parts of development.

The emergence of AIRIS marks a first step toward AI that autonomously learns in complex, full-scale virtual worlds. That should be exciting for every AI enthusiast.