Tit for tat
Tit for tat is an English saying meaning "equivalent retaliation". It is an alteration of tip for tap "blow for blow",[1] first recorded in 1558.[2]
It is also a highly effective strategy in game theory. An agent using this strategy will first cooperate, then subsequently replicate an opponent's previous action. If the opponent previously was cooperative, the agent is cooperative. If not, the agent is not. This is similar to reciprocal altruism in biology.
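As a minimal sketch of the rule (in Python, with hypothetical names; the actual tournament entries differed in their surrounding machinery), the strategy is: cooperate on the first move, then echo whatever the opponent did last.

```python
def tit_for_tat(opponent_history):
    """Return 'C' (cooperate) or 'D' (defect) for the next round.

    opponent_history is the list of the opponent's past moves,
    most recent last; an empty list means this is the first round.
    """
    if not opponent_history:
        return "C"               # always open with cooperation
    return opponent_history[-1]  # then mirror the opponent's previous move
```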
Game theory
Tit-for-tat has been very successfully used as a strategy for the iterated prisoner's dilemma. The strategy was first introduced by Anatol Rapoport in Robert Axelrod's two tournaments,[3] held around 1980. Notably, it was (on both occasions) both the simplest strategy and the most successful in direct competition. A few have extended the game-theoretical approach to other applications such as finance; in that context the tit for tat strategy was shown to be associated with the trend following strategy.[4]
Implications
The success of the tit-for-tat strategy, which is largely cooperative despite its name emphasizing an adversarial nature, took many by surprise. Arrayed against strategies produced by various teams, it won both competitions. After the first competition, new strategies formulated specifically to combat tit-for-tat failed due to their negative interactions with each other; a successful strategy other than tit-for-tat would have had to be formulated with both tit-for-tat and itself in mind.
This result may give insight into how groups of animals (and particularly human societies) have come to live in largely (or entirely) cooperative societies, rather than the individualistic "red in tooth and claw" way that might be expected from individuals engaged in a Hobbesian state of nature. This, and particularly its application to human society and politics, is the subject of Robert Axelrod's book The Evolution of Cooperation.
Moreover, the tit-for-tat strategy has been of beneficial use to social psychologists and sociologists in studying effective techniques to reduce conflict. Research has indicated that when individuals who have been in competition for a period of time no longer trust one another, the most effective competition reverser is the use of the tit-for-tat strategy. Individuals commonly engage in behavioral assimilation, a process in which they tend to match their own behaviors to those displayed by cooperating or competing group members. Therefore, if the tit-for-tat strategy begins with cooperation, then cooperation ensues. On the other hand, if the other party competes, then the tit-for-tat strategy will lead the alternate party to compete as well. Ultimately, each action by the other member is countered with a matching response, competition with competition and cooperation with cooperation.
In the case of conflict resolution, the tit-for-tat strategy is effective for several reasons: the technique is recognized as clear, nice, provocable, and forgiving. Firstly, it is a clear and recognizable strategy. Those using it quickly recognize its contingencies and adjust their behavior accordingly. Moreover, it is considered to be nice as it begins with cooperation and only defects in response to competition. The strategy is also provocable because it provides immediate retaliation for those who compete. Finally, it is forgiving as it immediately produces cooperation should the competitor make a cooperative move.
The implications of the tit-for-tat strategy have been of relevance to conflict research, resolution and many aspects of applied social science.[5]
Mathematics
Take, for example, the following infinitely repeated prisoner's dilemma game:
|   | C    | D    |
|---|------|------|
| C | 6, 6 | 2, 9 |
| D | 9, 2 | 3, 3 |
The tit-for-tat strategy copies what the other player previously chose. If both players cooperate by playing (C, C), they cooperate forever.
|    | 1 | 2 | 3 | 4 | ... |
|----|---|---|---|---|-----|
| p1 | C | C | C | C | ... |
| p2 | C | C | C | C | ... |
Cooperation gives the following payoff (where δ is the discount factor):
6 + 6δ + 6δ² + 6δ³ + ⋯,
a geometric series summing to 6/(1 − δ).
If a player deviates by defecting (D), then in the next round they are punished. Play then alternates between outcomes where p1 cooperates while p2 defects, and vice versa.
|    | 1 | 2 | 3 | 4 | ... |
|----|---|---|---|---|-----|
| p1 | C | D | C | D | ... |
| p2 | D | C | D | C | ... |
Deviation gives the deviating player the following payoff:
9 + 2δ + 9δ² + 2δ³ + 9δ⁴ + 2δ⁵ + ⋯,
a sum of two geometric series that comes to 9/(1 − δ²) + 2δ/(1 − δ²).
Cooperation can be expected to continue as long as the payoff from deviating is no better than the payoff from cooperating.
Continue cooperating if:
6/(1 − δ) ≥ 9/(1 − δ²) + 2δ/(1 − δ²), which simplifies to δ ≥ 3/4.
Once both players are defecting (earning 3 each round), a player who deviates back to cooperation earns 2 in that round and then alternates, for a payoff of 2/(1 − δ²) + 9δ/(1 − δ²).
Continue defecting if:
3/(1 − δ) ≥ 2/(1 − δ²) + 9δ/(1 − δ²), which simplifies to δ ≤ 1/6.
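The two thresholds can be checked numerically. The sketch below (hypothetical helper; the truncation horizon is an arbitrary approximation of the infinite sum) uses the payoffs 6, 9, 2 and 3 from the matrix above:

```python
def discounted_sum(cycle, delta, horizon=2000):
    """Approximate an infinite discounted payoff stream that repeats `cycle`."""
    return sum(cycle[t % len(cycle)] * delta**t for t in range(horizon))

# Continue cooperating if 6/(1 - d) >= (9 + 2d)/(1 - d^2); true once d >= 3/4.
for d in (0.70, 0.74, 0.76, 0.80):
    cooperate = discounted_sum([6], d)       # 6 + 6d + 6d^2 + ...
    deviate = discounted_sum([9, 2], d)      # 9 + 2d + 9d^2 + 2d^3 + ...
    print(d, cooperate >= deviate)           # False, False, True, True

# Continue defecting if 3/(1 - d) >= (2 + 9d)/(1 - d^2); true only while d <= 1/6.
for d in (0.10, 0.15, 0.20):
    keep_defecting = discounted_sum([3], d)  # 3 + 3d + 3d^2 + ...
    switch_back = discounted_sum([2, 9], d)  # 2 + 9d + 2d^2 + 9d^3 + ...
    print(d, keep_defecting >= switch_back)  # True, True, False
```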
Problems
While Axelrod has empirically shown that the strategy is optimal in some cases of direct competition, two agents playing tit for tat remain vulnerable. A one-time, single-bit error in either player's interpretation of events can lead to an unending "death spiral": if one agent defects and the opponent cooperates, then both agents will end up alternating cooperate and defect, yielding a lower payoff than if both agents were to continually cooperate. This situation frequently arises in real world conflicts, ranging from schoolyard fights to civil and regional wars. The reason for these issues is that tit for tat is not a subgame perfect equilibrium, except under knife-edge conditions on the discount rate.[6] While this subgame is not directly reachable by two agents playing tit-for-tat strategies, a strategy must be a Nash equilibrium in all subgames to be subgame perfect. Further, this subgame may be reached if any noise is allowed in the agents' signaling. A subgame perfect variant of tit for tat known as "contrite tit for tat" may be created by employing a basic reputation mechanism.[7]
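The death spiral is easy to reproduce in a small simulation. The sketch below (hypothetical names; payoffs as in the matrix above) flips a single move of one tit-for-tat player and shows the error echoing back and forth thereafter:

```python
PAYOFF = {("C", "C"): (6, 6), ("C", "D"): (2, 9),
          ("D", "C"): (9, 2), ("D", "D"): (3, 3)}

def play(rounds=10, error_round=3):
    """Two tit-for-tat players; player 1's move is flipped once at error_round."""
    hist1, hist2 = [], []
    for t in range(rounds):
        m1 = "C" if not hist2 else hist2[-1]
        m2 = "C" if not hist1 else hist1[-1]
        if t == error_round:
            m1 = "D"             # a one-time mistake (e.g. a misread signal)
        hist1.append(m1)
        hist2.append(m2)
    return list(zip(hist1, hist2))

print(play())
# [('C','C'), ('C','C'), ('C','C'), ('D','C'), ('C','D'), ('D','C'), ('C','D'), ...]
# after the single error the two players alternate defections indefinitely
```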
A knife-edge equilibrium is an "equilibrium that exists only for exact values of the exogenous variables. If you vary the variables in even the slightest way, knife-edge equilibrium disappear."[8] An equilibrium can be both a Nash equilibrium and a knife-edge equilibrium; it is known as knife-edge because the equilibrium "rests precariously on" the exact value.
Example:
|      | Left   | Right    |
|------|--------|----------|
| Up   | (X, X) | (0, 0)   |
| Down | (0, 0) | (−X, −X) |
Suppose X = 0. There is no profitable deviation from (Down, Left) or from (Up, Right). However, if the value of X deviates by any amount, no matter how small, then the equilibrium no longer stands. For example, it becomes profitable to deviate to Up if X is 0.000001 instead of 0. The equilibrium is thus very precarious. In this context, "knife-edge conditions" refers to the fact that an equilibrium exists only when a specific condition is met, for instance, when X equals one specific value.
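A brief sketch of that check (hypothetical code): verify whether (Down, Left) is still a Nash equilibrium when X is exactly 0 and when it is nudged slightly away from 0.

```python
def is_nash(profile, X):
    """Check a pure-strategy profile of the 2x2 game above for a given X."""
    payoff = {("Up", "Left"): (X, X), ("Up", "Right"): (0, 0),
              ("Down", "Left"): (0, 0), ("Down", "Right"): (-X, -X)}
    r, c = profile
    # row player: does switching rows pay?
    other_r = "Down" if r == "Up" else "Up"
    if payoff[(other_r, c)][0] > payoff[(r, c)][0]:
        return False
    # column player: does switching columns pay?
    other_c = "Right" if c == "Left" else "Left"
    if payoff[(r, other_c)][1] > payoff[(r, c)][1]:
        return False
    return True

print(is_nash(("Down", "Left"), X=0))         # True: no profitable deviation
print(is_nash(("Down", "Left"), X=0.000001))  # False: deviating to Up now pays X > 0
```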
Tit for two tats could be used to mitigate this problem; see the description below.[9] "Tit for tat with forgiveness" is a similar attempt to escape the death spiral. When the opponent defects, a player employing this strategy will occasionally cooperate on the next move anyway. The exact probability that a player will respond with cooperation depends on the line-up of opponents.
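A sketch of such a forgiving variant (hypothetical names; the forgiveness probability used here is purely illustrative, since, as noted, the best value depends on the line-up of opponents):

```python
import random

def tit_for_tat_with_forgiveness(opponent_history, forgive_prob=0.1):
    """Like tit for tat, but after an opponent defection sometimes cooperate anyway."""
    if not opponent_history:
        return "C"
    if opponent_history[-1] == "D" and random.random() < forgive_prob:
        return "C"               # occasional forgiveness breaks the death spiral
    return opponent_history[-1]
```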
Furthermore, the tit-for-tat strategy has not been proved optimal in situations short of total competition. For example, when the parties are friends it may be best for the friendship when a player cooperates at every step despite occasional deviations by the other player. Most real-world situations are less competitive than the total competition in which the tit-for-tat strategy won its tournaments.
Tit for tat is very different from grim trigger in that it is forgiving in nature, as it immediately returns to cooperation should the competitor choose to cooperate. Grim trigger, on the other hand, is the most unforgiving strategy, in the sense that even a single defection makes the player using grim trigger defect for the remainder of the game.[10]
Tit for two tats
Tit for two tats is similar to tit for tat, but allows the opponent to defect from the agreed-upon strategy twice before the player retaliates. This makes the player using the tit for two tats strategy appear more "forgiving" to the opponent.
In a tit for tat strategy, once an opponent defects, the tit for tat player immediately responds by defecting on the next move. This has the unfortunate consequence of causing two retaliatory strategies to continuously defect against each other resulting in a poor outcome for both players. A tit for two tats player will let the first defection go unchallenged as a means to avoid the "death spiral" of the previous example. If the opponent defects twice in a row, the tit for two tats player will respond by defecting.
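A minimal sketch of the rule (hypothetical names): retaliate only after two consecutive defections.

```python
def tit_for_two_tats(opponent_history):
    """Defect only if the opponent's last two moves were both defections."""
    if len(opponent_history) < 2:
        return "C"
    if opponent_history[-1] == "D" and opponent_history[-2] == "D":
        return "D"
    return "C"
```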
This strategy was put forward by Robert Axelrod during his second round of computer simulations at RAND. After analyzing the results of the first experiment, he determined that had a participant entered the tit for two tats strategy it would have emerged with a higher cumulative score than any other program. As a result, he himself entered it with high expectations in the second tournament. Unfortunately, owing to the more aggressive nature of the programs entered in the second round, which were able to take advantage of its highly forgiving nature, tit for two tats did significantly worse (in the game-theory sense) than tit for tat.[11]
Real-world use
Peer-to-peer file sharing
BitTorrent peers use a tit-for-tat strategy to optimize their download speed.[12] More specifically, most BitTorrent peers use a variant of tit for two tats which is called regular unchoking in BitTorrent terminology. BitTorrent peers have a limited number of upload slots to allocate to other peers. Consequently, when a peer's upload bandwidth is saturated, it will use a tit-for-tat strategy. Cooperation is achieved when upload bandwidth is exchanged for download bandwidth. Therefore, when a peer is not uploading in return for our own peer's uploading, the BitTorrent program will choke the connection with the uncooperative peer and allocate this upload slot to a hopefully more cooperative peer. Regular unchoking corresponds to always cooperating on the first move in the prisoner's dilemma. Periodically, a peer will allocate an upload slot to a randomly chosen uncooperative peer (unchoke). This is called optimistic unchoking. This behavior allows searching for more cooperative peers and gives a second chance to previously non-cooperating peers. The optimal threshold values of this strategy are still the subject of research.
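A heavily simplified sketch of the idea described above (this is not the actual BitTorrent client logic; the function name, slot count and selection details are assumptions for illustration): each interval, unchoke the peers that have recently uploaded the most to us, and occasionally optimistically unchoke one randomly chosen choked peer.

```python
import random

def choose_unchoked(peers, download_rate, upload_slots=4, optimistic=True):
    """Pick which peers to unchoke this interval.

    peers         -- list of peer ids
    download_rate -- dict: peer id -> bytes/s recently received from that peer
    """
    # reciprocate: give upload slots to the peers who upload the most to us
    by_rate = sorted(peers, key=lambda p: download_rate.get(p, 0), reverse=True)
    unchoked = set(by_rate[:upload_slots])

    # optimistic unchoke: occasionally give a random choked peer a chance,
    # letting new or previously uncooperative peers prove themselves
    if optimistic:
        choked = [p for p in peers if p not in unchoked]
        if choked:
            unchoked.add(random.choice(choked))
    return unchoked
```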
Explaining reciprocal altruism in animal communities
Studies in the prosocial behaviour of animals have led many ethologists and evolutionary psychologists to apply tit-for-tat strategies to explain why altruism evolves in many animal communities. Evolutionary game theory, derived from the mathematical theories formalised by von Neumann and Morgenstern (1953), was first devised by Maynard Smith (1972) and explored further in bird behaviour by Robert Hinde. Their application of game theory to the evolution of animal strategies launched an entirely new way of analysing animal behaviour.
Reciprocal altruism works in animal communities where the cost to the benefactor in any transaction of food, mating rights, nesting or territory is less than the gains to the beneficiary. The theory also holds that the act of altruism should be reciprocated if the balance of needs reverses. Mechanisms to identify and punish "cheaters" who fail to reciprocate, in effect a form of tit for tat, are important to regulate reciprocal altruism. For example, tit-for-tat is suggested to be the mechanism of cooperative predator inspection behavior in guppies.
War
The inability of either side in a tit-for-tat exchange to back away from conflict, for fear of being perceived as weak or as cooperating with the enemy, has been the cause of many prolonged conflicts throughout history.
However, the tit for tat strategy has also been detected by analysts in the spontaneous non-violent behaviour, called "live and let live", that arose during trench warfare in the First World War. Troops dug in only a few hundred feet from each other would evolve an unspoken understanding. If a sniper killed a soldier on one side, the other expected an equal retaliation. Conversely, if no one was killed for a time, the other side would acknowledge this implied "truce" and act accordingly. This created a "separate peace" between the trenches.[13]
The Troubles
During The Troubles, the term was used to describe escalating "eye for an eye" behaviour between Irish Republicans and Ulster Unionists.[14] This can be seen in the Red Lion Pub bombing by the IRA being followed by the McGurk's Bar bombing, both targeting civilians. These attacks were often structured around mutual killings within the Unionist and Republican communities, even though both communities were generally uninterested in the violence.[15] This sectarian mentality led to the term "tit for tat bombings" entering the common lexicon of Northern Irish society.[16][17]
See also
- Attitude polarization
- Chicken (game)
- Christmas truce
- Deterrence theory
- Eye for an eye
- Golden Rule
- Mutual assured destruction
- Nice Guys Finish First, a documentary by Richard Dawkins that discusses tit for tat.
- Peace war game
- Quid pro quo
- Trigger strategy, a set of strategies of which tit for tat is a member.
- Virtuous circle and vicious circle
- Zero-sum game
References
[edit]- ^ "tit for tat". Etymology Online. Archived from the original on 2023-07-26.
- ^ Heap, Shaun Hargreaves; Varoufakis, Yanis (2004). Game theory: a critical text. Routledge. p. 191. ISBN 978-0-415-25094-8.
- ^ "The Axelrod Tournaments". September 5, 2011.
- ^ Mahdavi-Damghani, Babak and Roberts, Stephen (2023). "Guidelines for Building a Realistic Algorithmic Trading Market Simulator for Backtesting While Incorporating Market Impact: Agent-Based Strategies in Neural Network Format, Ecosystem Dynamics & Detection". Algorithmic Finance. Pre–press (1): 1–25. doi:10.3233/AF-220356.
- ^ Forsyth, D.R. (2010). Group Dynamics.
- ^ Gintis, Herbert (2000). Game Theory Evolving. Princeton University Press. ISBN 978-0-691-00943-8.
- ^ Boyd, Robert (1989). "Mistakes Allow Evolutionary Stability in the Repeated Prisoner's Dilemma Game". Journal of Theoretical Biology. 136 (1): 47–56. Bibcode:1989JThBi.136...47B. CiteSeerX 10.1.1.405.507. doi:10.1016/S0022-5193(89)80188-2. PMID 2779259.
- ^ "Knife-Edge Equilibria – Game Theory 101". Retrieved 2018-12-10.
- ^ Dawkins, Richard (1989). The Selfish Gene. Oxford University Press. ISBN 978-0-19-929115-1.
- ^ Axelrod, Robert (2000-01-01). "On Six Advances in Cooperation Theory". Analyse & Kritik. 22 (1): 130–151. CiteSeerX 10.1.1.5.6149. doi:10.1515/auk-2000-0107. ISSN 2365-9858. S2CID 17399009.
- ^ Axelrod, Robert (1984). The Evolution of Cooperation. Basic Books. ISBN 978-0-465-02121-5.
- ^ Cohen, Bram (2003-05-22). "Incentives Build Robustness in BitTorrent" (PDF). BitTorrent.org. Retrieved 2011-02-05.
- ^ Nice Guys Finish First. Richard Dawkins. BBC. 1986.
- ^ Hume, John (1986). "A New Ireland: The Acceptance of Diversity". Studies: An Irish Quarterly Review. 75 (300): 378–383. JSTOR 30090790.
- ^ Savaric, Michel (October 11, 2014). Garbaye, Romain; Schnapper, Pauline (eds.). The Politics of Ethnic Diversity in the British Isles. Palgrave Macmillan UK. pp. 174–188. doi:10.1057/9781137351548_10 – via Springer Link.
- ^ Gill, Paul (University College London); Piazza, James (Pennsylvania State University); Horgan, John (Georgia State University). Counterterrorism Killings and Provisional IRA Bombings, 1970–1998.
- ^ Maney, Gregory, Michael McCarthy, and Grace Yukich. "Explaining political violence against civilians in Northern Ireland: A contention-oriented approach." Mobilization: An International Quarterly 17, no. 1 (2012): 27-48.