Belief in the gaming gospel is often an act of faith. For instance, jacks-or-better video poker pundits say to hold two but not three unpaired/unsuited pictures. Why? How do they know? Such pronouncements are predicated on the precise probabilities underlying games of chance. To most players, however, the formal math might as well be a language spoken on the planet Mars. So, some accept the dogma. Others don't.
The door is open for doubt because people think qualitatively, not quantitatively. Solid citizens grasp big or small, good or bad, more or less. Few shoppers compare 283.5 grams for $5.99 versus 1 lb for $8.99. Likewise, craps players rarely weigh $6 eights with 45.4 percent probability of returning $7 against $5 fours with 33.3 percent chance of paying $9. And, qualms over whether there's a right and wrong, let alone which is which, are deepened by the typical small differences among alternatives.
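The craps comparison above can be made concrete with a little arithmetic. The sketch below assumes standard place-bet odds: an 8 rolls before a 7 with probability 5/11 (five ways to make 8 against six ways to make 7), and a 4 rolls before a 7 with probability 3/9.

```python
# Expected profit per resolved bet on the two craps wagers mentioned
# above, assuming standard place-bet odds (a hypothetical helper, not
# anything from the column itself).
from fractions import Fraction

def expected_value(stake, payoff, p_win):
    """Expected profit per resolution: win the payoff or lose the stake."""
    return p_win * payoff - (1 - p_win) * stake

ev_eight = expected_value(6, 7, Fraction(5, 11))  # $6 place bet on 8
ev_four = expected_value(5, 9, Fraction(3, 9))    # $5 place bet on 4

print(float(ev_eight), float(ev_eight) / 6)  # about -$0.09, or -1.5% of the stake
print(float(ev_four), float(ev_four) / 5)    # about -$0.33, or -6.7% of the stake
```

Both bets lose on average, but the $6 eight gives up far less per dollar wagered -- the kind of quantitative distinction the paragraph says most players never draw.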
Many bettors, though unswayed by mathematical mumbo-jumbo, still gamble logically. For close calls, they frequently use an intuitive form of what's known in decision and game theory as a "minimax-maximin" rule. I'll illustrate with a common casino quandary, "should you insure a blackjack?" Quantitatively... the experts fiddle with their calculators then say "no," because it's a sucker bet with a theoretical $0.038 loss per dollar of insurance -- blackjack or not. Qualitatively... let's see.
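For the curious, here's roughly how the experts' figure can be reached. This is a simplified sketch, assuming any unseen card is a ten-value with probability 16/52 (an "infinite deck" approximation; exact values shift slightly with deck composition). Stakes are in units of the main bet.

```python
# Insuring a blackjack versus not, under the simplifying assumption
# that the dealer's hole card is a ten-value with probability 16/52.
from fractions import Fraction

p_dealer_blackjack = Fraction(16, 52)

# Don't insure: win 1.5 units unless the dealer also has blackjack (push).
ev_no_insurance = Fraction(3, 2) * (1 - p_dealer_blackjack)

# Insure: win exactly 1 unit either way (the "even money" outcome).
ev_insurance = Fraction(1)

print(float(ev_no_insurance))                 # about 1.038 units
print(float(ev_no_insurance - ev_insurance))  # about 0.038 units for not insuring
```

The roughly 0.038-unit gap is the theoretical edge the calculators turn up in favor of declining insurance.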
To use the minimax-maximin approach, start by listing all the possibilities. Here, there are four: a) don't insure, dealer doesn't have blackjack -- win 1.5 units; b) don't insure, dealer has blackjack -- push; c) insure, dealer doesn't have blackjack -- win 1 unit; d) insure, dealer has blackjack -- win 1 unit.
Next, rank these possibilities from best to worst. Most players would do so strictly in terms of payoff, figuring it's just luck if the dealer has a 10 in the hole. Allowing for tied ranks:
1) Don't insure, dealer doesn't have blackjack -- win 1.5 units
2) Insure, dealer has blackjack -- win 1 unit
2) Insure, dealer doesn't have blackjack -- win 1 unit
3) Don't insure, dealer has blackjack -- push
Now, put the ranks into a "conflict map" like that in the accompanying table. The player's controlled options are columns, labeled across the top. The dealer's uncontrolled results are rows, labeled down the side.
The highest value in each column is the worst outcome with that column's strategy (remember, rank 1 is best and rank 3 worst). The lowest of these highest values is the best of the possible bad results. It is the minimax; for this example, it's #2 -- which occurs in both rows of the "insure" column.
The lowest value in each row is the best outcome for that row's uncontrolled events. The highest of these lowest values is the worst of the possible good results. It is the maximin; for this example it's #2 -- in the top row of the "insure" column.
When the minimax and maximin coincide, as in this case, the column where they occur is the optimum strategy for the assigned rankings. This explains why most players insure their blackjacks, even though not doing so is numerically more advantageous.
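The search just described is mechanical enough to spell out. Here's a sketch of the conflict map and the minimax/maximin hunt, using the rankings assigned above (the variable names are mine, not the column's):

```python
# Conflict map: rows are the dealer's uncontrolled results, columns
# the player's options; entries are ranks, 1 best and 3 worst.
options = ["don't insure", "insure"]
rank = [
    [1, 2],   # dealer doesn't have blackjack
    [3, 2],   # dealer has blackjack
]

# Minimax: worst (highest) rank in each column, then the best of those.
column_worst = [max(row[c] for row in rank) for c in range(len(options))]
minimax = min(column_worst)
minimax_option = options[column_worst.index(minimax)]

# Maximin: best (lowest) rank in each row, then the worst of those.
row_best = [min(row) for row in rank]
maximin = max(row_best)

print(minimax, maximin, minimax_option)   # 2 2 insure
```

Minimax and maximin coincide at rank #2 in the "insure" column, which is exactly the conclusion the table yields by inspection.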
When the minimax and maximin don't coincide, even when they're in the same column, the approach gets more complex. For instance, a dilemma might be resolved by assigning probabilities to various uncontrolled results. Or a "least pain" choice might be made by ignoring the maximin and following a pure minimax strategy.
Minimax and maximin decisions are used formally in business and government, as well as intuitively in everyday life. This is one reason why a Nobel Prize was awarded last year for work in game theory.