Evolution of cooperation through cumulative reciprocity

October 27, 2022

Understanding reciprocal cooperation is key to understanding how people work together. Whether it is friends who exchange favours, animals that exchange food or services, or nations that coordinate their policies, all of these are essentially reciprocal interactions. Stable reciprocity requires that people are prepared to help others, but also that they fight back when they are exploited. But what are the rules that allow cooperation to flourish while at the same time protecting individuals from being taken advantage of?

To explore this question, the MPI researchers Charlotte Rossetti and Christian Hilbe, together with collaborators from the Dalian University of Technology (China), study a simple game called the repeated prisoner's dilemma. In each round of this game, two players simultaneously face the same decision: they can pay a small cost to give a monetary benefit to the other player, or do nothing. Ideally, both players would "cooperate" and pay this cost so that both receive the benefit. However, each player is tempted to "defect": to pocket the benefit provided by the other player without paying the cost themselves. How can people play this game in such a way that cooperation is possible while keeping defectors at bay?
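To make the incentives concrete, here is a minimal sketch (written for this article in Python, not code from the study) of one round of the game described above; the benefit and cost values are illustrative assumptions.

    def round_payoffs(action1, action2, benefit=3, cost=1):
        """Return the payoffs of one round of the game.

        Each action is 'C' (pay the cost so the other player receives the
        benefit) or 'D' (do nothing). Benefit and cost values are illustrative.
        """
        payoff1 = (benefit if action2 == 'C' else 0) - (cost if action1 == 'C' else 0)
        payoff2 = (benefit if action1 == 'C' else 0) - (cost if action2 == 'C' else 0)
        return payoff1, payoff2

    # Mutual cooperation beats mutual defection, but a lone defector earns most:
    print(round_payoffs('C', 'C'))  # (2, 2)
    print(round_payoffs('D', 'C'))  # (3, -1)
    print(round_payoffs('D', 'D'))  # (0, 0)

This one-round temptation to defect is what makes strategies for the repeated game interesting.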

A typical approach to the repeated prisoner's dilemma is to play Tit-for-Tat: do unto others what they have done unto you. And indeed, cooperation can evolve and thrive in a society where people use Tit-for-Tat, but with a major drawback: when individuals make mistakes, cooperation can unravel into a state of mutual defection. "Tit-for-Tat is a nice rule of thumb that is easy to implement and feels very human. After all, it is based on the old saying 'an eye for an eye'," says Mrs Rossetti from the Max Planck Institute for Evolutionary Biology. "But it is not forgiving enough and does not account for errors, which we know are all too common in humans. If I accidentally defect when I meant to cooperate, and then come back to cooperating only if you do, then we will be out of sync, forever retaliating against each other's defections."
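A small illustration of the drawback described in the quote: two Tit-for-Tat players fall out of sync after a single accidental defection and keep retaliating against each other's last move. (Again, this is a sketch written for this article, not code from the study.)

    def tit_for_tat(opponent_history):
        """Cooperate in the first round, then copy the opponent's last move."""
        return 'C' if not opponent_history else opponent_history[-1]

    history1, history2 = [], []
    for t in range(8):
        move1 = tit_for_tat(history2)
        move2 = tit_for_tat(history1)
        if t == 2:          # player 1 defects by mistake in round 3
            move1 = 'D'
        history1.append(move1)
        history2.append(move2)

    print(''.join(history1))  # CCDCDCDC
    print(''.join(history2))  # CCCDCDCD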

Cumulative reciprocity

To address these weaknesses, the researchers analyse an alternative strategy for the repeated prisoner's dilemma, which they call "Cumulative Reciprocity", or CURE. Individuals who use CURE keep track of the imbalance of cooperative actions: on each round, they record whether the two players' numbers of cooperations and defections are equal, or out of balance in the other player's favour. If the imbalance is zero or small enough, they keep cooperating. However, if the imbalance gets too big, they are at risk of being exploited, so they defect.
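The rule can be stated in a few lines. The sketch below is a simplified reading of the description above (the study's full CURE strategy also involves a probabilistic choice at the threshold, which is omitted here), and the tolerance parameter is an illustrative assumption.

    def cure_move(my_history, their_history, tolerance=1):
        """Cooperate while the cooperation imbalance stays within tolerance.

        The imbalance counts how many more times I have cooperated than my
        co-player has; if it grows too large, I am being exploited and defect.
        """
        imbalance = my_history.count('C') - their_history.count('C')
        return 'C' if imbalance <= tolerance else 'D'

    # One accidental defection leaves an imbalance of only 1, so a CURE
    # player keeps cooperating; a persistent defector is eventually punished.
    print(cure_move(['C', 'C', 'C'], ['C', 'D', 'C']))  # 'C'
    print(cure_move(['C', 'C', 'C'], ['D', 'D', 'D']))  # 'D'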

The first advantage of CURE is practical. Tallying a single number (the imbalance) on each round allows individuals to take the full history of the interaction into account without having to store the detailed outcome of every round. This greatly simplifies the computations and allows researchers to analyse the model extensively. To do so, the team led by Dr Hilbe and Dr Xia explored the strategy's mathematical properties and ran extensive computer simulations to test how it evolves in different environments. These computational results demonstrate that CURE encourages fairness while allowing for errors. It is also able to evolve and thrive in a hostile environment, meaning that cooperation can emerge even from a population of defectors.

Prediction of human behaviour

The other strength of CURE is its intuitiveness and simplicity, making it a strong candidate for predicting real human behaviour. To explore this aspect in more detail, Mrs Rossetti ran an online experiment in which participants had the opportunity to play with another person for a small amount of money. The results show that in a repeated interaction with a choice between cooperating and defecting, CURE explained real behaviour more accurately than other rules, especially when errors were introduced. The fact that humans sometimes make errors when they interact with others can be detrimental to cooperation. As such, any model that aims to approximate human behaviour closely must take this into account.

Work in psychology finds that in friendships and other close relationships, most people do not keep a precise tally of who owes whom a favour, but rather a general sense of whether the relationship is fair. CURE embodies this behaviour perfectly. However, this approach does not presume that people choose this strategy consciously. Instead, strategies like CURE can emerge naturally over time as simple rules of thumb that allow reciprocal cooperation to evolve.
