King Fish
Brian Wojcik, Nikita Kiriy, Juan Pablo Sarmiento
Overview: "It was the best of times, it was the 𝛽 of times"
Challenges:
• very large branching factor
• limited hardware
• tricky heuristics
Approach:
• Make a decent static player that wins against known opponents.
• Use machine learning to optimize mechanics and learn from opponents.
[Board diagram with Player 1 and Player 2 marbles omitted.]
The AI Progress
• Stage 1: beat a random player with heuristic moves.
  • Implemented and optimized.
  • Bonus: able to beat eigenbot frequently.
• Stage 2: adaptive AI / learning.
  • Implemented and optimized, using:
    • Alpha-beta search
    • Parametrized utility function
    • Random-restart hill climbing
    • K-nearest neighbors with pre-loaded tournament logs
Optimizations
• 16-byte board hash (see the hashing sketch below).
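The slides do not say how the 16-byte hash is computed; the following is a minimal sketch of one way to get a fixed 16-byte transposition-table key, assuming a 2-bits-per-cell board packing hashed with MD5. The board layout (a 2-D grid of 0 = empty, 1 = Player 1, 2 = Player 2) is an assumption, not taken from the King Fish code.

```python
import hashlib

def board_hash(board):
    """Pack the board at 2 bits per cell, then MD5 it down to a 16-byte key.

    Assumes `board` is a 2-D grid of cells valued 0 (empty), 1, or 2.
    """
    packed = bytearray()
    acc, nbits = 0, 0
    for row in board:
        for cell in row:
            acc = (acc << 2) | (cell & 0b11)
            nbits += 2
            if nbits == 8:          # a full byte has accumulated
                packed.append(acc)
                acc, nbits = 0, 0
    if nbits:                       # flush any trailing partial byte
        packed.append(acc << (8 - nbits))
    return hashlib.md5(bytes(packed)).digest()  # 16 bytes

# Usage sketch:
# transposition_table[board_hash(board)] = (depth, score, best_move)
```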
Game Stages Run-Through
Opening:
• Not interacting: use A* search.
• Saves time and computation.
Interacting:
• When within a certain distance of the opponent's pieces, switch to alpha-beta (a generic sketch follows this slide).
• Smart hashing to minimize memory usage.
• Depth 5 in alpha-beta search is easily achievable with good move ordering.
• Static evaluation function to evaluate non-terminal nodes.
Closing:
• Consider special marbles in alpha-beta search.
• Depth 7 in alpha-beta search after special marbles (depth 4 before).
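A minimal, generic sketch of the depth-limited alpha-beta search described above, with optional move ordering and the static evaluation applied at the frontier. The game interface (moves_fn, apply_fn, eval_fn, order_fn) is assumed for illustration and is not the actual King Fish code.

```python
def alphabeta(state, depth, alpha, beta, maximizing,
              moves_fn, apply_fn, eval_fn, order_fn=None):
    """Depth-limited alpha-beta; returns (value, best_move)."""
    moves = moves_fn(state, maximizing)
    if depth == 0 or not moves:
        return eval_fn(state), None          # static evaluation at the frontier
    if order_fn:
        moves = order_fn(state, moves)       # good ordering => earlier cutoffs
    best_move = None
    if maximizing:
        value = float("-inf")
        for m in moves:
            score, _ = alphabeta(apply_fn(state, m), depth - 1, alpha, beta,
                                 False, moves_fn, apply_fn, eval_fn, order_fn)
            if score > value:
                value, best_move = score, m
            alpha = max(alpha, value)
            if alpha >= beta:                # beta cutoff
                break
    else:
        value = float("inf")
        for m in moves:
            score, _ = alphabeta(apply_fn(state, m), depth - 1, alpha, beta,
                                 True, moves_fn, apply_fn, eval_fn, order_fn)
            if score < value:
                value, best_move = score, m
            beta = min(beta, value)
            if beta <= alpha:                # alpha cutoff
                break
    return value, best_move
```

The depth-5 and depth-7 figures on the slide correspond to calling this with the appropriate depth argument for the current game stage.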
Static Evaluation Function
• Alpha-beta search can't search to terminal depth, so a better static evaluation function makes a better bot.
• Break the function down into components:
  • Vertical distance
  • Horizontal distance
  • Straggler utility
  • Chain utility
• The components have different scales and different importance, so parameterize a weight for each component and find a good weight set (see the weighted-sum sketch below).
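A minimal sketch of the parameterized evaluation, assuming it is a weighted sum of the four components named on the slide. The feature functions and example weights in the comments are placeholders, not the real King Fish values.

```python
def evaluate(board, player, features, weights):
    """Static evaluation: weighted sum of feature scores, higher is better for `player`."""
    return sum(weights[name] * fn(board, player)
               for name, fn in features.items())

# Example wiring (the feature functions are assumed to exist elsewhere):
# features = {"vertical_dist":   vertical_distance,
#             "horizontal_dist": horizontal_distance,
#             "straggler":       straggler_utility,
#             "chain":           chain_utility}
# weights  = {"vertical_dist": 1.0, "horizontal_dist": 0.5,
#             "straggler": 2.0, "chain": 1.5}
# score = evaluate(board, my_player, features, weights)
```

Keeping the weights in a plain dictionary is what makes it easy for the multi-run setup on a later slide to alter them between games.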
Example: Straggler Utility
[Board diagrams omitted: the same bot playing itself, with Player 1 not considering straggler utility and Player 2 considering it.]
Multi-Run
• The parameterized static evaluation function makes it easy to run two bots with different evaluation functions against each other.
• Hill-climbing search to find the best static evaluation function.
Multi-Run Game Server
• Our own version of the GameServer.
• Runs multiple bot-vs-bot games, altering the evaluation variables each time.
• Random-restart hill climbing to find the best overall evaluation function (see the sketch below).
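A hedged sketch of the random hill climb over evaluation weights. Here `play_match(weights_a, weights_b)` is assumed to run bot-vs-bot games on the multi-run game server and return the first bot's win rate; the step size, iteration counts, and restart count are illustrative.

```python
import random

def neighbor(weights, step=0.1):
    """Return a copy of `weights` with one randomly chosen weight perturbed."""
    w = dict(weights)
    name = random.choice(list(w))
    w[name] += random.uniform(-step, step)
    return w

def hill_climb(play_match, start, iters=100):
    """Greedy local search: keep a candidate weight set only if it beats the incumbent."""
    best = start
    for _ in range(iters):
        cand = neighbor(best)
        if play_match(cand, best) > 0.5:
            best = cand
    return best

def random_restart_hill_climb(play_match, random_weights, restarts=10):
    """Hill-climb from several random starting weight sets and keep the overall winner."""
    champion = hill_climb(play_match, random_weights())
    for _ in range(restarts - 1):
        challenger = hill_climb(play_match, random_weights())
        if play_match(challenger, champion) > 0.5:
            champion = challenger
    return champion
```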
Learning
• Game parser and classification of moves.
• Used a database of recorded games to determine which moves were "safe" and which were "losing".
• "Losing" moves were usually those that sent our pieces to the corners of the board.
• Used k-nearest neighbors to classify losing moves (see the sketch below).
• Evaluated accuracy using cross-validation.
• Ignore losing moves when performing alpha-beta search.
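A minimal sketch of the k-nearest-neighbors classifier, assuming each recorded move is reduced to a numeric feature vector labeled "safe" or "losing". The Euclidean distance metric, k = 5, and the feature extraction are assumptions; the slides only name the algorithm.

```python
import math
from collections import Counter

def knn_classify(candidate, training, k=5):
    """Label `candidate` by majority vote among its k nearest training examples.

    `training` is a list of (feature_vector, label) pairs built from the
    parsed tournament logs; `candidate` is the feature vector of a proposed move.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    nearest = sorted(training, key=lambda ex: dist(ex[0], candidate))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# During alpha-beta, candidate moves the classifier marks as "losing" can be
# skipped (features() here is a hypothetical move-to-vector extractor):
# moves = [m for m in moves if knn_classify(features(m), training) != "losing"]
```

The accuracy of a given k can then be checked with cross-validation over the same labeled moves, as the slide notes.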
Statistics
• Random player: played 50 times, 100% win rate.
• Greedy player: played 50 times, 100% win rate.
• Eigen player: ~50% win rate.
• Tournament: 5th place or better.
• Uploaded a much better bot for the final tournament; awaiting results.
All Hail King Fish, the Fish King of AI