
This environment is part of the magent environments. Please read that page first for general information.

| Name | Value |
|------|-------|
| Actions | Discrete |
| Agents | 121 |
| Parallel API | True |
| Manual Control | No |
| Action Shape | (5), (9) |
| Action Values | Discrete(5), Discrete(9) |
| Observation Shape | (3,3,21), (9,9,25) |
| Observation Values | [0,2] |
| Import | `from pettingzoo.magent import tiger_deer_v1` |
| Agents | `agents= [deer_[0-100], tiger_[0-19]]` |

Agent Environment Cycle



In tiger-deer, a group of tigers must team up to take down the deer: two tigers must attack the same deer in the same step for either to receive a reward. Tigers that do not eat deer will not survive. Meanwhile, the deer try to avoid being attacked.

Tiger action options: [do_nothing, move_4, attack_4]

Tiger’s reward is given as:

Deer action options: [do_nothing, move_4]

Deer’s reward is given as:

Observation space: [empty, obstacle, deer, tigers, binary_agent_id(10), one_hot_action, last_reward]

Map size: 45x45



max_frames: number of frames (a step for each agent) before the game terminates