This project is an AI simulation where programmers pit their AIs against each other in an AI Arena application. To run a simulation in the Arena, a programmer creates an AI and builds it into a Dynamic Link Library (DLL); the Arena then loads each competitor's DLL in order to run the simulation.
Project details
- Role: AI Programmer
- Development Time: 3 Months
- Language/Tools: C++
- Deployment: DLL file used in an AI Arena application
Rules of the Arena
Colony Units
The Ant AI consists of 4 different unit types:
The Queen: This ant is the heart of the colony. The Queen can spawn more units and consumes nutrients; for the colony to survive, she must receive the required nutrients from her gatherers. The queen is represented by a circle icon in the arena.
The Gatherer: This ant collects food on the map to feed the queen, and deals 1 unit of damage to enemy units on the same tile. The gatherer is represented by a diamond icon in the arena.
The Scout: This ant cannot collect food like the gatherer, but it has a much larger vision radius, which lets it inform the colony about available food, pathable tiles, and enemy units. Scouts are represented by a plus sign in the arena.
The Soldier: The last ant type is the soldier. This ant has the highest offensive ability but lacks the mobility of the other ants: while the other ants deal 1 unit of damage, the soldier deals 2, making it offensively superior. Unlike scouts and gatherers, however, soldiers cannot walk over dirt tiles. Soldiers are represented by an X in the arena.
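The unit rules above can be condensed into a small lookup table. This is only an illustrative sketch: the struct, the enum names, and the queen's stats are my assumptions for this write-up, not the Arena's actual API.

```cpp
// Hypothetical summary of the unit rules described above.
// The values come from the text; the types are assumptions.
struct UnitStats
{
    int attackDamage;   // damage dealt to enemies on the same tile
    bool canGatherFood; // only gatherers feed the queen
    bool canCrossDirt;  // dirt blocks the queen and soldiers
};

enum eUnitType { UNIT_QUEEN, UNIT_GATHERER, UNIT_SCOUT, UNIT_SOLDIER };

UnitStats GetUnitStats(eUnitType type)
{
    switch (type)
    {
        case UNIT_QUEEN:    return { 1, false, false }; // spawns units, consumes nutrients
        case UNIT_GATHERER: return { 1, true,  true  }; // feeds the queen, can dig dirt
        case UNIT_SCOUT:    return { 1, false, true  }; // wide vision, no gathering
        case UNIT_SOLDIER:  return { 2, false, false }; // hits hardest, blocked by dirt
    }
    return { 0, false, false };
}
```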
Map information
The AI units need to survive on an NxN tilemap; the map can contain a variety of tile types and properties that affect movement and nutrient consumption, making the arena an interesting challenge for the colony. For example, soldiers cannot path through dirt, whereas other units can do so at an added movement cost. Gatherers can dig dirt to create empty tiles, speeding up movement for all units and ensuring the map can be pathed. Here is an example of what a simple dig strategy can do on a more complex map:
Tile Types:
- Stone Tiles: These tiles are obstacles for all ant types. No unit can path over them, and no unit can destroy them.
- Water Tiles: Ants that walk onto a water tile drown, and their corpses form a bridge; once a water tile becomes a corpse bridge, other units may walk over it.
- Dirt Tiles: These tiles are obstacles only to the Queen and Soldier unit types. Scouts and Gatherers may walk over dirt tiles, although doing so reduces their movement speed. Dirt can also be dug up or carried away by Gatherers, in which case the tile becomes an empty tile.
- Empty Tiles: Empty tiles have no properties, they do not obstruct any units nor affect their mobility or speed in any way.
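These rules translate naturally into per-unit movement costs for pathfinding. Below is a hedged sketch of that translation; the enum names, the `BLOCKED` sentinel, and the dirt penalty value are assumptions for illustration, and water is marked impassable here even though an ant can technically walk in and drown to form a bridge.

```cpp
#include <climits>

// Illustrative tile-cost rules implied by the tile descriptions above.
enum eTileType  { TILE_EMPTY, TILE_DIRT, TILE_STONE, TILE_WATER };
enum eAgentType { AGENT_QUEEN, AGENT_WORKER, AGENT_SCOUT, AGENT_SOLDIER };

constexpr int BLOCKED = INT_MAX; // sentinel: unit should not enter this tile

int GetTileCost(eAgentType agent, eTileType tile)
{
    switch (tile)
    {
        case TILE_EMPTY: return 1;       // no obstruction for anyone
        case TILE_STONE: return BLOCKED; // impassable and indestructible
        case TILE_WATER: return BLOCKED; // entering means drowning
        case TILE_DIRT:
            // Dirt blocks the queen and soldiers; scouts and gatherers
            // may cross it at a movement penalty (penalty value assumed).
            if (agent == AGENT_QUEEN || agent == AGENT_SOLDIER)
                return BLOCKED;
            return 3;
    }
    return BLOCKED;
}
```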
Nutrients
Nutrients appear on empty, dirt, or water tiles. They are represented on the map as green diamond icons.
A* Path Finding
To navigate the map effectively and beat the enemy AI colonies at resource retrieval and combat, I needed some form of pathfinding. Although I could have implemented Dijkstra's algorithm with a simple flood fill, I chose A*, which is much faster, especially when computing paths for multiple agents in the scene.
Below is the interface I implemented for A* pathfinding, along with the core of the implementation:
//------------------------------------------------------------------------------------------------------------------------------
Path AStarPather::CreatePathAStar(int startTileIndex, int endTileIndex, IntVec2 mapDimensions, const std::vector<int>& tileCosts, int limit)
{
    // m_pathInfo lives on the Pather; it is the current std::vector<PathInfo>
    m_pathInfo.clear();
    m_pathInfo.resize(mapDimensions.x * mapDimensions.y);

    CalculateCostsForTileIndex(startTileIndex, endTileIndex, mapDimensions, tileCosts, true);

    // Begin the A*!
    int iterations = 0;
    while (m_openTileIndexList.size() > 0 && iterations < limit)
    {
        int currentIndex = SelectFromAndUpdateOpenIndexList();
        m_pathInfo[currentIndex].pathState = PATH_STATE_FINISHED;

        // If we have reached our termination index, the shortest path is already set
        if (currentIndex == endTileIndex)
        {
            break;
        }

        int neighbors[4] = { 0, 0, 0, 0 };
        int validNeighbors = PopulateBoundedNeighbors(currentIndex, mapDimensions, neighbors);

        for (int i = 0; i < validNeighbors; i++)
        {
            bool canPath = CalculateCostsForTileIndex(neighbors[i], endTileIndex, mapDimensions, tileCosts);
            if (canPath)
            {
                m_pathInfo[neighbors[i]].parentIndex = currentIndex;
                if (m_pathInfo[neighbors[i]].pathState == PATH_STATE_UNVISITED)
                {
                    m_pathInfo[neighbors[i]].pathState = PATH_STATE_VISITED;
                    m_openTileIndexList.push_back(neighbors[i]);
                }
            }
        }

        iterations++;
    }

    if (m_openTileIndexList.size() > m_largestOpenList)
    {
        m_largestOpenList = m_openTileIndexList.size();
    }

    // Work backwards from our termination point to the starting point
    Path path;
    IntVec2 pathCoord = GetTileCoordinatesFromIndex(endTileIndex, mapDimensions);
    path.push_back(pathCoord);

    int nextIndex = m_pathInfo[endTileIndex].parentIndex;
    bool workingBackwards = true;
    while (workingBackwards)
    {
        if (nextIndex == -1 || nextIndex == startTileIndex)
        {
            workingBackwards = false;
        }
        else
        {
            pathCoord = GetTileCoordinatesFromIndex(nextIndex, mapDimensions);
            path.push_back(pathCoord);
            nextIndex = m_pathInfo[nextIndex].parentIndex;
        }
    }

    m_openTileIndexList.clear();
    return path;
}

//------------------------------------------------------------------------------------------------------------------------------
int AStarPather::SelectFromAndUpdateOpenIndexList()
{
    // Get the best index from the open list, then remove it from the open list
    int counter = 0;
    int eraseSlot = 0;
    int smallestCostIndex = m_openTileIndexList.front();
    int smallestCost = m_pathInfo[smallestCostIndex].fCost;

    for (int tileIndex : m_openTileIndexList)
    {
        if (tileIndex < 0)
            continue;

        if (m_pathInfo[tileIndex].fCost < smallestCost)
        {
            smallestCostIndex = tileIndex;
            smallestCost = m_pathInfo[tileIndex].fCost;
            eraseSlot = counter;
        }
        counter++;
    }

    m_openTileIndexList.erase(m_openTileIndexList.begin() + eraseSlot);

    // Mark this index finished; we are going to process it now
    m_pathInfo[smallestCostIndex].pathState = PATH_STATE_FINISHED;
    return smallestCostIndex;
}

//------------------------------------------------------------------------------------------------------------------------------
bool IsContained(const IntVec2 tile, const IntVec2& dimensions)
{
    return (tile.x >= 0 && tile.x < dimensions.x && tile.y >= 0 && tile.y < dimensions.y);
}

//------------------------------------------------------------------------------------------------------------------------------
int AStarPather::PopulateBoundedNeighbors(const int currentTileIndex, const IntVec2& tileDimensions, int* outNeighbors)
{
    IntVec2 currentTile = GetTileCoordinatesFromIndex(currentTileIndex, tileDimensions);
    IntVec2 northTile = IntVec2(currentTile.x, currentTile.y + 1);
    IntVec2 southTile = IntVec2(currentTile.x, currentTile.y - 1);
    IntVec2 eastTile  = IntVec2(currentTile.x + 1, currentTile.y);
    IntVec2 westTile  = IntVec2(currentTile.x - 1, currentTile.y);

    int north = currentTileIndex + tileDimensions.x;
    int south = currentTileIndex - tileDimensions.x;
    int east  = currentTileIndex + 1;
    int west  = currentTileIndex - 1;

    int i = 0;
    if (IsContained(northTile, tileDimensions) && m_pathInfo[north].pathState == PATH_STATE_UNVISITED) { outNeighbors[i] = north; i++; }
    if (IsContained(southTile, tileDimensions) && m_pathInfo[south].pathState == PATH_STATE_UNVISITED) { outNeighbors[i] = south; i++; }
    if (IsContained(eastTile, tileDimensions) && m_pathInfo[east].pathState == PATH_STATE_UNVISITED) { outNeighbors[i] = east; i++; }
    if (IsContained(westTile, tileDimensions) && m_pathInfo[west].pathState == PATH_STATE_UNVISITED) { outNeighbors[i] = west; i++; }

    return i;
}

//------------------------------------------------------------------------------------------------------------------------------
IntVec2 AStarPather::GetTileCoordinatesFromIndex(const short tileIndex, const IntVec2 mapDims)
{
    IntVec2 tileCoords;
    tileCoords.x = tileIndex % mapDims.x;
    tileCoords.y = tileIndex / mapDims.x;
    return tileCoords;
}

//------------------------------------------------------------------------------------------------------------------------------
bool AStarPather::CalculateCostsForTileIndex(const int currentTileIndex, const int terminationPointIndex, const IntVec2& mapDimensions, const std::vector<int>& tileCosts_, bool isStart /*= false*/)
{
    if (!isStart)
    {
        m_pathInfo[currentTileIndex].gCost = tileCosts_[currentTileIndex];
        m_pathInfo[currentTileIndex].hCost = GetManhattanDistance(GetTileCoordinatesFromIndex(currentTileIndex, mapDimensions), GetTileCoordinatesFromIndex(terminationPointIndex, mapDimensions));
        m_pathInfo[currentTileIndex].fCost = m_pathInfo[currentTileIndex].gCost + m_pathInfo[currentTileIndex].hCost;

        // Treat huge costs as impassable tiles
        return (m_pathInfo[currentTileIndex].fCost < 99999);
    }
    else
    {
        // The start tile is free and immediately finished; seed the open list with its neighbors
        m_pathInfo[currentTileIndex].gCost = tileCosts_[currentTileIndex];
        m_pathInfo[currentTileIndex].hCost = GetManhattanDistance(GetTileCoordinatesFromIndex(currentTileIndex, mapDimensions), GetTileCoordinatesFromIndex(terminationPointIndex, mapDimensions));
        m_pathInfo[currentTileIndex].fCost = 0;
        m_pathInfo[currentTileIndex].pathState = PATH_STATE_FINISHED;

        int neighbors[4] = { 0, 0, 0, 0 };
        int validNeighbors = PopulateBoundedNeighbors(currentTileIndex, mapDimensions, neighbors);
        for (int i = 0; i < validNeighbors; i++)
        {
            CalculateCostsForTileIndex(neighbors[i], terminationPointIndex, mapDimensions, tileCosts_);
            m_pathInfo[neighbors[i]].parentIndex = currentTileIndex;
            if (m_pathInfo[neighbors[i]].pathState == PATH_STATE_UNVISITED)
            {
                m_pathInfo[neighbors[i]].pathState = PATH_STATE_VISITED;
                m_openTileIndexList.push_back(neighbors[i]);
            }
        }
        return true;
    }
}
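The pather above calls GetManhattanDistance as its A* heuristic, but the helper itself is not shown. A minimal version might look like the following sketch; IntVec2 is redefined here so the snippet is self-contained.

```cpp
#include <cstdlib>

// Minimal stand-in for the project's 2D integer vector type.
struct IntVec2 { int x; int y; };

// Manhattan distance: sum of axis-aligned distances. It is an admissible
// A* heuristic for 4-directional movement with a minimum step cost of 1.
int GetManhattanDistance(const IntVec2& a, const IntVec2& b)
{
    return std::abs(a.x - b.x) + std::abs(a.y - b.y);
}
```

Since ants only move north, south, east, or west, Manhattan distance never overestimates the true remaining cost, which is what keeps A* returning shortest paths.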
What is an Agent?
To simplify the usage of my Ant AI system, I created a simple Agent class capable of updating an agent's state and continuing along a path if the path is valid. Agent inherits from AgentReport, the structure received from the Arena server describing each of the ants in my colony.
Below is an overview of the AgentReport struct as well as the Agent class:
//-----------------------------------------------------------------------------------------------
// Structure given for each of your agents (and/or each of your orders just previously issued)
//
struct AgentReport
{
    AgentID agentID;                 // your agent's unique ID #
    short tileX;                     // current tile coordinates in map; (0,0) is bottom-left
    short tileY;
    short exhaustion;                // number of turns inactive; non-HOLD orders fail if > 0
    short receivedCombatDamage;      // amount of damage received this turn
    short receivedSuffocationDamage; // suffocation damage received this turn (1 is you suffocated)
    eAgentType type;                 // type of agent (permanent/unique per agent)
    eAgentState state;               // special status of agent (carrying something, etc.)
    eAgentOrderResult result;        // result of agent's previously issued order
};

typedef std::vector<IntVec2> Path;

//------------------------------------------------------------------------------------------------------------------------------
class Agent : public AgentReport
{
public:
    Agent(const AgentReport& base);

    void UpdateAgentData(const AgentReport& agentReport);
    bool ContinuePathIfValid();

public:
    Path m_currentPath;
    int m_assignedTileIndex = -1; // Assigned tile index
};
//------------------------------------------------------------------------------------------------------------------------------
Agent::Agent(const AgentReport& base)
{
    UpdateAgentData(base);
    agentID = base.agentID;
    m_currentPath.clear();
}

//------------------------------------------------------------------------------------------------------------------------------
void Agent::UpdateAgentData(const AgentReport& agentReport)
{
    tileX = agentReport.tileX;
    tileY = agentReport.tileY;
    agentID = agentReport.agentID;
    exhaustion = agentReport.exhaustion;
    receivedCombatDamage = agentReport.receivedCombatDamage;
    receivedSuffocationDamage = agentReport.receivedSuffocationDamage;
    type = agentReport.type;
    state = agentReport.state;
    result = agentReport.result;
}

//------------------------------------------------------------------------------------------------------------------------------
bool Agent::ContinuePathIfValid()
{
    if (m_currentPath.size() > 0)
    {
        IntVec2 destination = m_currentPath.front();
        AIPlayerController* playerController = AIPlayerController::GetInstance();
        int destIndex = playerController->GetTileIndex(destination.x, destination.y);

        if (playerController->m_foodVisionHeatMap[destIndex] && type == AGENT_TYPE_WORKER)
        {
            // Workers keep following their path while the food at the destination is still visible
            eOrderCode order = playerController->GetMoveOrderToTile(*this, m_currentPath.back().x, m_currentPath.back().y);
            playerController->AddOrder(agentID, order);
            m_currentPath.pop_back();
            return true;
        }
        else if (type != AGENT_TYPE_WORKER)
        {
            eOrderCode order = playerController->GetMoveOrderToTile(*this, m_currentPath.back().x, m_currentPath.back().y);
            playerController->AddOrder(agentID, order);
            m_currentPath.pop_back();
            return true;
        }
        else
        {
            // The food is gone; abandon the path
            m_currentPath.clear();
            return false;
        }
    }
    else
    {
        return false;
    }
}
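Each turn the Arena sends one AgentReport per living ant, so the controller has to match reports to its stored Agents by ID, updating known ants and constructing new ones. The following is a hedged sketch of that sync step with trimmed stand-in types, not the real Arena structs.

```cpp
#include <vector>

// Simplified stand-ins for the Arena types (only the fields needed here).
struct AgentReport { int agentID; short tileX; short tileY; };

struct Agent : AgentReport
{
    explicit Agent(const AgentReport& base) { UpdateAgentData(base); }
    void UpdateAgentData(const AgentReport& r)
    {
        agentID = r.agentID;
        tileX = r.tileX;
        tileY = r.tileY;
    }
};

// Match each incoming report to a stored agent by ID; unknown IDs are
// newly spawned ants and get a fresh Agent.
void SyncAgents(std::vector<Agent>& agents, const std::vector<AgentReport>& reports)
{
    for (const AgentReport& report : reports)
    {
        bool known = false;
        for (Agent& agent : agents)
        {
            if (agent.agentID == report.agentID)
            {
                agent.UpdateAgentData(report); // existing ant: refresh state
                known = true;
                break;
            }
        }
        if (!known)
            agents.emplace_back(report);       // newly spawned ant
    }
}
```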
The Agent Manager
The agent manager, the AIPlayerController class, is responsible for managing each agent in the ant colony and issuing instructions to them each turn: movement, spawning, finding the closest agents or resources, and so on.
Below is the AIPlayerController interface:
enum eTileNeighborhood
{
    DIRECTION_EAST = 0,
    DIRECTION_WEST,
    DIRECTION_NORTH,
    DIRECTION_SOUTH,
    DIRECTION_NOT_NEIGHBOR,
    DIRECTION_THIS_TILE
};

//------------------------------------------------------------------------------------------------------------------------------
class AIPlayerController
{
public:
    // Static singleton methods
    static AIPlayerController* GetInstance();
    static AIPlayerController* CreateInstance();
    static void DestroyInstance();

    void Startup(const StartupInfo& info);
    void Shutdown(const MatchResults& results);
    void MainThreadEntry(int threadIdx);
    void WorkerThreadEntry(int threadIdx);
    void ReceiveTurnState(const ArenaTurnStateForPlayer& state);
    bool TurnOrderRequest(PlayerTurnOrders* orders);

    void SetMapCostBasedOnAntVision(eAgentType agentType, std::vector<int>& costMap);
    void SetVisionHeatMapForFood(std::vector<bool>& visionMap);
    void AddOrder(AgentID agent, eOrderCode order);
    void ReturnClosestAmong(Agent& currentAgent, short& returnX, short& returnY, short tile1X, short tile1Y, short tile2X, short tile2Y);
    bool CheckTileSafetyForMove(Agent& currentAgent, eOrderCode order);
    eOrderCode GetMoveOrderToTile(Agent& currentAgent, short destPosX, short destPosY);
    short GetTileIndex(short x, short y) const;
    void GetTileXYFromIndex(const short tileIndex, short& x, short& y);
    IntVec2 GetTileCoordinatesFromIndex(const short tileIndex);
    bool IsAgentOnQueen(Agent& report);

private:
    void ProcessTurn(ArenaTurnStateForPlayer& turnState);
    void DebugDrawVisibleFood();
    void UpdateAllAgentsFromTurnState(ArenaTurnStateForPlayer& turnState);
    void CreateAgentFromReport(const AgentReport& agentReport);
    void CheckAndAddAgentsToList(const AgentReport& agentReports);
    void RemoveAnyDeadAgentsFromList();

    // Helpers
    void MoveRandom(Agent& currentAgent, int recursiveCount = 0);
    void MoveToQueen(Agent& currentAgent, int recursiveCount = 0);
    void MoveToClosestFood(Agent& currentAgent, int recursiveCount = 0);
    void PathToClosestFood(Agent& currentAgent);
    void PathToClosestEnemy(Agent& currentAgent);
    void PathToFarthestVisible(Agent& currentAgent);
    void PathToQueen(Agent& currentAgent, bool shouldResetPath = false);
    void PathToClosestDirt(Agent& currentAgent);
    IntVec2 GetFarthestObservedTile(const Agent& currentAgent);
    IntVec2 GetFarthestUnObservedTile(const Agent& currentAgent);
    AgentReport* FindFirstAgentOfType(eAgentType type);
    int FindClosestEnemy(Agent& currentAgent);
    IntVec2 FindClosestTile(Agent& currentAgent, eTileType tileType);
    eTileNeighborhood IsPositionInNeighborhood(Agent& currentAgent, IntVec2 position);

    // Spawning
    bool CanSpawnSoldier();
    bool CanSpawnScout();
    bool CanSpawnQueen();
    bool CanSpawnWorker(short exhaustion);

    // Pathing
    int GetTileCostForAgentType(eAgentType agentType, eTileType tileType);
    bool IsTileSafeForAgentType(eTileType tileType, eAgentType agentType);
    bool IsTileSafeForQueen(eTileType tileType);
    bool IsTileSafeForScout(eTileType tileType);
    bool IsTileSafeForSoldier(eTileType tileType);
    bool IsTileSafeForWorker(eTileType tileType);
    bool IsThisAgentQueen(Agent& report);
    int GetClosestQueenTileIndex(Agent& report);
    int IsEnemyInNeighborhood(int closestEnemy, Agent& report);
    bool IsObservedAgentInAssignedTargets(ObservedAgent observedAgents);

private:
    MatchInfo m_matchInfo;
    DebugInterface* m_debugInterface;
    int m_lastTurnProcessed;
    volatile bool m_running;

    std::mutex m_turnLock;
    std::condition_variable m_turnCV;
    ArenaTurnStateForPlayer m_currentTurnInfo;
    PlayerTurnOrders m_turnOrders;

    std::vector<AgentReport> m_queenReports;
    AStarPather m_pather;
    std::vector<Agent> m_agentList;
    std::vector<ObservedAgent> m_assignedTargets;
    int lastAgent = 6;
    std::vector<int> m_scoutDestinations;
    bool m_repathOnQueenMove = false;
    int m_moveDelay = 100;

public:
    bool m_workerCostMapInitialized = false;
    bool m_soldierCostMapInitialized = false;
    bool m_scoutCostMapInitialized = false;
    int m_agentIterator = 0;
    std::vector<int> m_costMapWorkers;
    std::vector<int> m_costMapSoldiers;
    std::vector<int> m_costMapScouts;
    std::vector<bool> m_foodVisionHeatMap;
    std::map<int, Agent*> m_scoutPositionMap;
    int m_numWorkers = 0;
    int m_numSoldiers = 0;
    int m_numScouts = 0;
    int m_numQueens = 1;
};
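The CreateInstance/GetInstance/DestroyInstance trio above implies a manually managed singleton. A minimal sketch of that pattern follows; the real class of course carries all the state shown above, so this is only an illustration of the lifetime management.

```cpp
// Minimal manually-managed singleton, mirroring the static methods
// declared on AIPlayerController above (state omitted for brevity).
class Controller
{
public:
    static Controller* GetInstance() { return s_instance; }

    static Controller* CreateInstance()
    {
        // Create only once; repeated calls return the existing instance
        if (s_instance == nullptr)
            s_instance = new Controller();
        return s_instance;
    }

    static void DestroyInstance()
    {
        delete s_instance;
        s_instance = nullptr;
    }

private:
    Controller() = default; // force access through CreateInstance
    static Controller* s_instance;
};

Controller* Controller::s_instance = nullptr;
```

Explicit Create/Destroy calls fit the Arena's DLL lifecycle well, since the host controls exactly when the player is started up and shut down.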
AI Implementation
Here’s what the AI I implemented looked like on the map:
On reaching the “Sudden Death” condition, each unit's nutrient consumption increases by 1 every turn, so unit upkeep grows over time. For this case I implemented a defensive strategy to conserve my resources: non-essential ants in the colony are ordered to commit suicide.
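The arithmetic behind culling can be sketched as follows. The helper names and the cull rule are my own illustration, assuming each unit's consumption grows by 1 nutrient per turn after sudden death begins.

```cpp
// Total nutrients drained on a given turn after sudden death starts,
// assuming every unit's upkeep equals the number of elapsed turns.
int TotalUpkeep(int unitCount, int turnsSinceSuddenDeath)
{
    return unitCount * turnsSinceSuddenDeath;
}

// How many units the food stock can sustain for the next t turns:
// keeping n units alive costs n * (1 + 2 + ... + t) = n * t * (t + 1) / 2,
// so any units beyond the affordable count are ordered to die.
int UnitsToKeep(int unitCount, int foodStock, int turnsToSurvive)
{
    if (turnsToSurvive <= 0)
        return unitCount; // nothing to plan for yet

    int costPerUnit = turnsToSurvive * (turnsToSurvive + 1) / 2;
    int affordable = foodStock / costPerUnit;
    return affordable < unitCount ? affordable : unitCount;
}
```

For example, with 10 units, 30 nutrients stored, and 3 turns to survive, each unit costs 1 + 2 + 3 = 6 nutrients, so only 5 units can be kept.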
Beating a benchmark AI
To ensure I could compete in the AI tournament, I needed to make enough progress in time to beat a benchmark AI that was provided to me. This is what it looks like when my AI is pitted against that benchmark:
Beating 4 Benchmark Colonies
After beating 1 AI benchmark, my next task was to beat 4 of them on a single map. The “United Ants of India” emerged victorious yet again. This is what that looks like:
Interesting Strategies
Soldier rush:
A simple and effective strategy for all cases is “eat everything, spawn infinite soldiers, obliterate the opponent”, which is exactly what one of my professors did 🙂
His ants consumed resources aggressively early on, and once their stockpile reached a threshold, the queens spawned as many soldiers as possible and attacked. His soldiers would either obliterate the closest enemy unit or path to unvisited tiles on the map.
Defensive Strategy:
One of my friends had a more defensive strategy than the others I encountered. His gatherers used their ability to dig up dirt and carry it with them while following a path to food. On encountering an enemy unit, a gatherer would drop the dirt on it, killing it. That meant I lost soldiers very quickly against him.
Optimal gather and attack:
One of my classmates' ants were simply the most optimal they could be at every task. He used A* pathing, k-means clustering, and utility-based AI logic to create the most aggressive and also most consistent AI across all maps.
Retrospectives
What went well 🙂
- I was able to successfully compete in the final tournament by satisfying benchmark requirements.
- Implemented a balanced strategy that works on almost all maps.
- Implemented a clever dig strategy that boosted colony pathing and movement.
What went wrong 🙁
- Poor project planning in the starting phase.
- Losing early in the class tournaments affected morale which affected performance.
- A* pathfinding took a lot longer than I had expected, and I needed help to get it working correctly.
What I will do better next time 🙂
- Prototype early and time-box all features.
- Implement an easily customizable strategy following a toolbox-type format.
- Implement some simple clustering algorithm for more optimal paths/results.