**He Who Goes First . . . ** Hex was first invented and named by Danish poet and inventor Piet Hein in 1942. Several years later Nash invented it independently. There are many Web sites devoted to Hex, as well as some interactive versions (try www.mazeworks.com/hex7).

**1.** Imagine one player's chain as a river crossing the board and the other's as a bridge blocking it. The only way to block a river is with a bridge that itself spans the board from one outer edge to the other—so one player must win. Mathematician David Gale devised a proof that uses the following reasoning: In the diagram at right, start at point A on the edge of the board, with water on the left and land on the right. Move forward one hexagon at a time, keeping blue hexes (water) to your left and orange hexes (shore) to your right. Your path cannot end in the middle of the board, so it must end somewhere along the edge—and that can only be at point B or point C. If you end at point B, there will be a land bridge to the right of your path; if you end at point C, there will be a river to the left. Note that in this diagram, if you change any orange hex to blue to break the land path, you'll create a water path that travels through that very hex.
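Gale's no-draw argument can also be checked by brute force. The sketch below (the board representation and the choice of which player connects which edges are my assumptions, not from the article) fills a small rhombus-shaped Hex board at random and verifies that exactly one player always has a connecting chain:

```python
import random
from collections import deque

def winner(board):
    """On a fully filled n-by-n Hex board, return 'B' if blue connects
    top to bottom or 'O' if orange connects left to right."""
    n = len(board)
    # The six neighbors of a hexagonal cell, in (row, col) offsets.
    nbrs = [(-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0)]

    def connected(color, starts, reached_goal):
        seen = {s for s in starts if board[s[0]][s[1]] == color}
        queue = deque(seen)
        while queue:                      # breadth-first flood fill
            r, c = queue.popleft()
            if reached_goal(r, c):
                return True
            for dr, dc in nbrs:
                nr, nc = r + dr, c + dc
                if (0 <= nr < n and 0 <= nc < n
                        and (nr, nc) not in seen and board[nr][nc] == color):
                    seen.add((nr, nc))
                    queue.append((nr, nc))
        return False

    blue = connected('B', [(0, c) for c in range(n)], lambda r, c: r == n - 1)
    orange = connected('O', [(r, 0) for r in range(n)], lambda r, c: c == n - 1)
    assert blue != orange, "a filled Hex board has exactly one winner"
    return 'B' if blue else 'O'

random.seed(0)
for _ in range(200):
    filled = [[random.choice('BO') for _ in range(5)] for _ in range(5)]
    winner(filled)  # the assert never fires: no draws, no double wins
```

The random trials are not a proof, of course—Gale's path-following argument is—but they illustrate the claim: break one player's chain anywhere and the flood fill finds the other player's chain instead.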

**2.** In any pure strategy game without draws—like Hex—one of the players must have a winning strategy. Imagine drawing a diagram of all possible games. The empty board is at the root. Branches growing from each board show all possible next moves. (Branches can grow apart, then join back together higher up.) Each branch ends in a leaf, a board on which one of the two players has won. Mark each leaf with B when blue wins and O when orange wins. Then work backward down the tree: if, for example, all the boards growing out of a particular node are wins for orange, mark that node with an O as well; in general, mark each node with the winner that the player moving at that node can force. Eventually the root itself gets marked B or O, and you will have determined which player has the winning strategy. In Hex, player 1 can always win: if player 2 had a winning strategy, player 1 could steal it by making an arbitrary first move and from then on playing player 2's strategy as if he were the second player. An extra stone on the board never hurts in Hex, so player 2 cannot have a winning strategy after all. Although this demonstrates that the first player can always win, it says nothing about which moves the players should actually make. A seven-by-seven Hex board is the largest for which a complete strategy is known.
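Neither the tree-marking argument nor strategy stealing tells you the winning moves, but on a tiny board you can find them by running the backward induction directly. A sketch, with the board size and string encoding as my assumptions: it searches every position of three-by-three Hex and confirms that the first player can force a win.

```python
from functools import lru_cache

N = 3                                    # 3-by-3 Hex: small enough to solve
NBRS = [(-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0)]

def connects(board, color):
    """True if `color` joins its edges: 'B' top-bottom, 'O' left-right."""
    if color == 'B':
        starts, done = [(0, c) for c in range(N)], lambda r, c: r == N - 1
    else:
        starts, done = [(r, 0) for r in range(N)], lambda r, c: c == N - 1
    stack = [p for p in starts if board[p[0] * N + p[1]] == color]
    seen = set(stack)
    while stack:                          # depth-first flood fill
        r, c = stack.pop()
        if done(r, c):
            return True
        for dr, dc in NBRS:
            nr, nc = r + dr, c + dc
            if (0 <= nr < N and 0 <= nc < N and (nr, nc) not in seen
                    and board[nr * N + nc] == color):
                seen.add((nr, nc))
                stack.append((nr, nc))
    return False

@lru_cache(maxsize=None)
def wins(board, player):
    """Backward induction: can `player`, to move, force a win on `board`?"""
    opponent = 'O' if player == 'B' else 'B'
    if connects(board, opponent):        # opponent's last move already won
        return False
    # Win if some move leads to a position the opponent cannot win from.
    return any(not wins(board[:i] + player + board[i + 1:], opponent)
               for i, cell in enumerate(board) if cell == '.')

print(wins('.' * N * N, 'B'))  # True: the first player can force a win
```

The same search blows up combinatorially as the board grows, which is why, as noted above, complete strategies are known only for small boards.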

**The Nash Equilibrium** **1.** The Nash equilibrium is the rightmost square in the middle row. In diagram A, red squares mark the optimal choice in each column for the red player, and blue circles highlight the optimal choice in each row for the blue player. Because the Nash equilibrium is by definition optimal for both players, it is the one square highlighted in both red and blue.

**2.** You can introduce a second Nash equilibrium by reducing the value of the blue number in the green square, as shown in diagram B. Just because a square is a Nash equilibrium doesn't mean it's the best solution. In this matrix, 9-8 is the best option for both players, but if they choose 7-6, they can get stuck—neither can improve her lot by switching her choice individually.
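Since the payoff diagrams aren't reproduced here, the matrix below is an illustrative stand-in of my own, built to mimic this answer: a minimal sketch that marks a cell as a pure Nash equilibrium when red's payoff is best in its column and blue's is best in its row, and that finds two equilibria, one better for both players than the other.

```python
def pure_nash_equilibria(red, blue):
    """Cells (i, j) where neither player gains by switching alone:
    red's payoff is maximal in column j, blue's is maximal in row i."""
    rows, cols = len(red), len(red[0])
    return [(i, j)
            for i in range(rows) for j in range(cols)
            if red[i][j] == max(red[k][j] for k in range(rows))
            and blue[i][j] == max(blue[i][k] for k in range(cols))]

# Illustrative payoffs (not the article's diagram B): the 9-8 cell and
# the 7-6 cell are both equilibria, but 9-8 is better for both players.
red  = [[9, 1, 2],
        [0, 7, 3],
        [1, 2, 5]]
blue = [[8, 0, 1],
        [2, 6, 0],
        [0, 5, 4]]
print(pure_nash_equilibria(red, blue))  # [(0, 0), (1, 1)]
```

If the players land on the 7-6 cell, the check shows why they're stuck: red's 7 is already the best in its column and blue's 6 the best in its row, so neither can improve by deviating alone.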

**3.** If all numbers in a three-by-three matrix are the same, the matrix has nine equilibriums. The maximum number of Nash equilibriums if all numbers in each row and column are different is three (diagram C), because each row (or column) can contain at most one equilibrium.

**4.** Diagram D is the three-by-three matrix for the game of Rock, Paper, Scissors, which has no equilibrium. The game cycles through the squares as indicated by the arrows.
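The same row-and-column check confirms this. The payoff encoding below (1 for a win, -1 for a loss, 0 for a tie) is my assumption, not the article's diagram, but any standard scoring gives the same result: no cell is stable, so play cycles.

```python
# Row player's payoff for rock/paper/scissors vs. rock/paper/scissors;
# the column player's payoff is the negation (zero-sum game).
red = [[0, -1, 1],
       [1, 0, -1],
       [-1, 1, 0]]
blue = [[-v for v in row] for row in red]

# A pure equilibrium needs red best-in-column and blue best-in-row at once.
equilibria = [(i, j)
              for i in range(3) for j in range(3)
              if red[i][j] == max(red[k][j] for k in range(3))
              and blue[i][j] == max(blue[i][k] for k in range(3))]
print(equilibria)  # []: every cell gives one player a reason to switch
```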

**Decisions, Decisions** **1.** B—No equilibrium.

**2.** A and C—Multiple equilibriums.

**3.** D—Prisoner's dilemma. This Nash equilibrium is arguably the worst choice for both players. If the players could agree to cooperate, they would certainly choose the upper left cell, in which neither player talks. But since they can't trust each other, they are both motivated to talk—a logical but self-defeating spiral that echoes the arms race and other real-world dilemmas. For more information about the prisoner's dilemma, see *Decisions and Elections* by Donald Saari (Cambridge University Press, 2001) and *Criminal Dilemmas* by Katri Sieberg (Springer-Verlag, 2001).

**4.** C—Chicken.


Got new solutions for the puzzle? Want to see other people's solutions? Talk to the puzzle master in his discussion forum at www.scottkim.com.