Tuesday, June 05, 2012

The Developer's Dilemma

The police arrest you and your buddy. They place you in separate interrogation rooms. They suspect you both of a crime but don't have enough evidence to convict you. The cop questioning you offers you a deal: rat out your accomplice and you get a reduced sentence. Of course, you know another cop is making the same offer to your buddy. If only one of you confesses, the stool pigeon is charged with a misdemeanor and goes free while the other guy is sentenced to a full year in the big house. If you both confess, you'll each do six months. If neither of you confesses, the best they can do on the evidence they have is charge you both with a lesser crime for which you'll each do one month.

This scenario is familiar to anyone who watches crime shows on television, reads police procedurals, or sees pretty much any movie by Martin Scorsese. It is also an example of one of the fundamental games of game theory: the prisoner's dilemma. As the police well know, the dominant strategy is for you to confess: it's the strategy that has the best outcome for you regardless of what the other guy does or doesn't do. It allows you to be indifferent to his strategy.
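
To make the arithmetic concrete, here is a minimal sketch in Python of the payoff table described above, with sentences measured in months. The move names and the best_response helper are purely illustrative choices of mine, not from any game theory library:

# Sentences in months for (your move, buddy's move); lower is better for you.
SENTENCES = {
    ("confess", "silent"):  (0, 12),   # you go free, he does a year
    ("silent",  "confess"): (12, 0),   # the reverse
    ("confess", "confess"): (6, 6),    # you both do six months
    ("silent",  "silent"):  (1, 1),    # lesser charge, one month each
}

def best_response(buddy_move):
    """Return the move that minimizes your own sentence, given your buddy's move."""
    return min(("confess", "silent"),
               key=lambda my_move: SENTENCES[(my_move, buddy_move)][0])

for buddy_move in ("confess", "silent"):
    print(buddy_move, "->", best_response(buddy_move))   # prints "confess" both times

Whatever your buddy does, confessing leaves you with less time to serve, which is exactly what makes it dominant. But dominant is not the same as optimal: mutual silence at one month each beats mutual confession at six.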

But that doesn't mean it's the best outcome, period. It is also, unfortunately, the dominant strategy for your buddy. If you both confess, you'll both do six months. It would be far better if both of you kept your mouths shut. The dominant strategy is inefficient because it leads you both to a sub-optimal outcome. But to do better requires cooperation and trust in your accomplice. This is tough because, as they say, there is no honor among thieves. The prisoner's dilemma is one of the reasons for omertà, the Mafia's code of silence: it alters the payoffs for both parties to make it less likely that they will betray one another. The stool pigeon's life isn't worth a plugged nickel.

The prisoner's dilemma is widely discussed because it occurs so frequently in real life. The classic example is the nuclear arms race. The dominant strategy for each adversary is to launch a pre-emptive nuclear strike. But if both follow their dominant strategies, to say the outcome is sub-optimal is putting it mildly. Game theory played an important role in framing the strategy of nuclear deterrence for the United States and, one must assume, the Soviet Union. It is probably no coincidence that Merrill Flood and Melvin Dresher of the RAND Corporation first formally framed the prisoner's dilemma the year after the Soviets detonated their first atomic bomb. [1]

There are lots of other real-life examples. Like the one you may find yourself in right now as a software developer. Don't think so? Consider this: the company you work for probably has a fixed raise or bonus pool to distribute amongst you and your peers. Every dollar the person next to you gets is one dollar you won't get. There are only so many slots into which people may be promoted. Every slot filled with someone else is a slot that is no longer available to you. Many companies use a strategy of forced ranking, where a manager must rank a certain percentage of his employees into the lowest rating tier for use when the layoffs inevitably occur. Someone has to go in that tier no matter what their actual performance in absolute terms may be. This is called a zero-sum game: everyone's gain is balanced by someone else's loss. [2] Like it or not, your employer's human resources policies force you to compete with the developer sitting across from you. Your dominant strategy is to betray him.

But that's not the optimal strategy. Even though your company's incentive program is tragically designed to motivate you to do otherwise, both you and your fellow developer will do better if you cooperate and work together. Your joint efforts will produce a product that will be more successful, the customers will be happier, and your employer will make more money. All this because you chose to cooperate instead of compete, and despite the fact that your company is telling you to act otherwise.

I spent several years working in an organization in which our director thought "competition is good" (his exact words) and encouraged it among the various groups under his management. This, despite the fact that each group played a different role, and to succeed we all had to work together to support one another. Treating his groups like interchangeable commodities drove tremendous dysfunction into that organization and highly politicized the work environment. As a manager I spent most of my time trying to shield my troops from this while attempting to get my peer managers to actually fulfill their official responsibilities. I don't miss it. Later I worked in an organization that used forced ranking. It was interesting to watch developers attempt to game the system, occasionally by screwing one another. I don't miss that, either.

How does one escape from the prisoner's dilemma? Much of game theory is about techniques and strategies to encourage cooperation among the players. Robert Axelrod of the University of Michigan held a prisoner's dilemma competition: entries were computer programs that played against one another for multiple rounds, deciding in each round whether to defect (betray or confess), or cooperate (stay silent). The winning algorithm was tit-for-tat: the program always began by cooperating; if its opponent defected, the program defected in the next round, but immediately returned to cooperating in the following round. [3]
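
Here is a rough Python sketch of how tit-for-tat behaves in a repeated game, scored with the sentence lengths from the scenario at the top of this post. The strategy interface and the always_defect opponent are my own illustrative choices, not Axelrod's actual tournament setup:

COOPERATE, DEFECT = "cooperate", "defect"

# Months served for (my move, opponent's move); lower is better.
MONTHS = {
    (DEFECT, COOPERATE): 0, (COOPERATE, DEFECT): 12,
    (DEFECT, DEFECT): 6,    (COOPERATE, COOPERATE): 1,
}

def tit_for_tat(my_history, their_history):
    # Start nice; afterwards, simply repeat the opponent's previous move.
    return their_history[-1] if their_history else COOPERATE

def always_defect(my_history, their_history):
    return DEFECT

def play(strategy_a, strategy_b, rounds=10):
    hist_a, hist_b, months_a, months_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        hist_a.append(move_a)
        hist_b.append(move_b)
        months_a += MONTHS[(move_a, move_b)]
        months_b += MONTHS[(move_b, move_a)]
    return months_a, months_b

print(play(tit_for_tat, tit_for_tat))     # (10, 10): steady mutual cooperation
print(play(tit_for_tat, always_defect))   # (66, 54): punished from round two onward

Against another tit-for-tat player it settles into steady cooperation; against a chronic defector it absorbs one betrayal and then punishes every round after.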

Avinash Dixit and Barry Nalebuff illustrate the problem with this strategy: when it plays against itself and its opponent defects just once, it falls into a perpetual pattern of alternating defection with cooperation. How can its opponent defect just once? Because in real life, honest mistakes happen due to misunderstandings and miscommunication. A better strategy, according to Dixit and Nalebuff, is to base your decision on both short-term and long-term memory of your opponent's past decisions, a version of which Scott Stevens has called tit-for-two-tats.
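
Here is a small sketch of that failure mode, assuming one common reading of tit-for-two-tats (punish only after two consecutive betrayals). The lone "slip" is injected by the simulation to stand in for an honest mistake, and the basic definitions are repeated from the previous sketch so this runs on its own:

COOPERATE, DEFECT = "cooperate", "defect"

def tit_for_tat(my_history, their_history):
    return their_history[-1] if their_history else COOPERATE

def tit_for_two_tats(my_history, their_history):
    # Only punish after two consecutive betrayals; forgive a lone slip.
    if len(their_history) >= 2 and their_history[-1] == their_history[-2] == DEFECT:
        return DEFECT
    return COOPERATE

def play_with_mistake(strategy, rounds=8, slip_round=2):
    # Both players use the same strategy; player A "slips" once by accident.
    hist_a, hist_b = [], []
    for r in range(rounds):
        move_a = DEFECT if r == slip_round else strategy(hist_a, hist_b)
        move_b = strategy(hist_b, hist_a)
        hist_a.append(move_a)
        hist_b.append(move_b)
    return hist_a, hist_b

# Tit-for-tat: the lone slip echoes back and forth as alternating defections.
print(play_with_mistake(tit_for_tat))
# Tit-for-two-tats: the slip is forgiven and mutual cooperation resumes.
print(play_with_mistake(tit_for_two_tats))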

But regardless of the specifics, Stevens points out that tit-for-tat-like strategies that perform well in the repeated prisoner's dilemma game all have four characteristics:


  1. They are nice: they always begin by trusting.
  2. They are provokable: they always punish by betraying if they themselves are betrayed.
  3. They are forgiving: even though betrayed, they can eventually return to cooperating.
  4. They are straightforward: their strategy is rational, transparent, and easily understood.


I first became interested in game theory back in the 1990s when I was working on very large real-time distributed telecommunications systems. The game trees (for sequential games) and game tables (for simultaneous games) from game theory gave me a more structured way to think about error detection and recovery, particularly with regard to communication channels to other systems. I was surprised to find, when I started looking at incentive programs for software developers, that game theory played a key role there as well. Game theory is one of those topics that, once you begin to read about it, you see everywhere. It is a social science that studies strategic decision making, and the games it studies permeate our lives.

Once I realized that I was routinely caught in a developer's dilemma, the tit-for-tat strategies gave me a useful approach to deal with it. Studying game theory has led me to be nicer, more provokable, more forgiving (that was the hard one, for me), and more straightforward. I'm not exaggerating when I say it's made me a better person.

Footnotes

[1] The prisoner's dilemma also applies to the Fermi Paradox, which frames the contradiction between the statistical likelihood of technological extraterrestrial civilizations and the fact that so far there is no evidence of any. The dominant but sub-optimal strategy is for each civilization to try to wipe out all the others before someone does it to them. I'm voting for weaponized von Neumann machines. It's interesting (to me anyway) that John von Neumann, the inventor of game theory, also invented the idea of self-replicating automata. I'd like to think that weaponized von Neumann machines of extraterrestrial origin are the untold backstory to books and movies about the zombie apocalypse. Or of the Borg race from the Star Trek universe. The central theme, in my opinion, of the Borg story arc in Star Trek is that the Borg applied the dominant strategy of betrayal to the prisoner's dilemma while the members of the United Federation of Planets chose the cooperative optimal strategy.

[2] Strictly speaking, forced ranking isn't zero-sum. For a game to be zero-sum, not only does every win have to be offset by a loss of equal value, but there must be no way for all players to come out ahead. But if all employees who are ranked relative to one another could enter into an enforceable collective agreement in which they each agreed to work less hard, their forced ranking relative to one another would remain unchanged while they each applied the absolute minimum amount of effort necessary to keep their jobs. While it is hard to believe that this is what their employer intended, it is in fact the most efficient outcome for the forced ranking game.

[3] The WOPR U.S. automated defense computer learns the inefficiency of nuclear war by playing tic-tac-toe in the movie WarGames. But a much better game, and one that very likely did come from nuclear strategists, would be to have it play a repeated prisoner's dilemma, where it would learn the benefit of cooperation over betrayal. On the other hand, this is exactly the plot of the movie Colossus: The Forbin Project, and that didn't turn out so well due to what economists would call an externality.

Sources

Wikipedia, Game theory, 2012

Wikipedia, Prisoner's dilemma, 2012

Wikipedia, Fermi paradox, 2012

Avinash K. Dixit and Barry J. Nalebuff, The Art of Strategy, W. W. Norton, 2008

Scott P. Stevens, Games People Play, The Teaching Company, 2008

Robert D. Austin, Measuring and Managing Performance in Organizations, Dorset House, 1996

Avinash K. Dixit and Barry J. Nalebuff, Thinking Strategically, W. W. Norton, 1991

Lawrence Lasker et al., WarGames, MGM, 1983

James Bridges, D. F. Jones, Colossus: The Forbin Project, Universal Pictures, 1970
