Rationality and Corruption: What Behavioral Economics Knows and What Most of Us Miss


Do you make decisions with your brain or with your heart? Or neither? Or both? Do not rush to an answer; it is not as straightforward as it might seem at first glance. Read the post and find the answer for yourself!

In one of my previous posts, I speculated on how state anti-corruption programs and corporate ethics and compliance programs might benefit from the insights of social psychology. Now I am going to share a few valuable concepts from behavioral economics.

The cornerstone of economics is the assertion that people’s behavior is shaped by incentives. People are not good or bad - people are people, and they respond to incentives. Therefore, creating appropriate incentives is of paramount importance. There is often a vast gulf between how people say they behave and how they actually behave (in economics these two behaviors are known as stated preferences and revealed preferences). Incentives might be structured in a positive way as a reward (a carrot), or in a negative way as a punishment (a stick). In line with the “operant conditioning” principle, “behavior that is followed by pleasant consequences is likely to be repeated, and behavior followed by unpleasant consequences is less likely to be repeated”.

Another important concept that should not be underestimated is the way we frame an incentive, because people are more motivated to avoid losses than to acquire gains. This is known as “loss aversion”, which “refers to people's tendency to prefer avoiding losses to acquiring equivalent gains: it is better to not lose $5 than to find $5. Some studies have suggested that losses are twice as powerful, psychologically, as gains.”
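The claim that losses weigh roughly twice as heavily as gains is often modeled with the prospect-theory value function. Below is a minimal Python sketch; the parameters λ ≈ 2.25 and α ≈ 0.88 are the commonly cited estimates from Tversky and Kahneman’s work, and the function name is my own:

```python
def value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Prospect-theory value function: losses loom larger than gains.

    Gains are valued as x**alpha; losses are amplified by lam.
    """
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# Finding $5 versus losing $5: the loss hurts about twice as much.
print(round(value(5), 2))   # 4.12
print(round(value(-5), 2))  # -9.28
```

In other words, the pain of a $5 loss outweighs the pleasure of a $5 gain by a factor of about two, which is exactly the asymmetry loss aversion describes.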

Loss aversion is strongly correlated with the “framing effect”. To illustrate what it is, here are two dilemmas. In each there are two alternative programs of medical treatment, and you have to decide between them as if you were a doctor. Please make your choice in each before you proceed.

Dilemma 1. If Program A is adopted, 200 people out of 600 will be saved for certain; if Program B is adopted, there is a 1/3 probability that all 600 will be saved and a 2/3 probability that no one will be saved.

Dilemma 2. If Program C is adopted, 400 people out of 600 will die; if Program D is adopted, there is a 1/3 probability that no one will die and a 2/3 probability that all 600 will die.

Did you give different answers? If not, congratulations. If yes, you fell into the framing trap. Program A is the same as Program C; the difference is that Program A is a positively framed option, while Program C is a negatively framed one. The same holds for Programs B and D: they are identical, except that they are framed differently. The positive wording may have distracted you from the actual calculation of risks. The framing effect is “a cognitive bias where people decide on options based on whether the options are presented with positive or negative semantics; e.g. as a loss or as a gain. People tend to avoid risk when a positive frame is presented, but seek risks when a negative frame is presented”.
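The equivalence of the two framings is easy to verify with expected-value arithmetic. A minimal sketch, using the probabilities from the dilemmas above:

```python
TOTAL = 600

# Positive frame: counted in lives saved.
ev_a = 200                        # Program A: 200 saved for certain
ev_b = (1/3) * TOTAL + (2/3) * 0  # Program B: 1/3 chance all are saved

# Negative frame: counted in deaths, converted back to survivors.
ev_c = TOTAL - 400                # Program C: 400 die, so 200 survive
ev_d = (1/3) * TOTAL + (2/3) * 0  # Program D: 1/3 chance no one dies

# All four programs have the same expected number of survivors.
print(ev_a, ev_b, ev_c, ev_d)  # 200 200.0 200 200.0
```

Since every program yields 200 expected survivors, a purely “rational” decision-maker should be indifferent between the frames; only the wording differs.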

If you fell into this trap, don’t get upset. When the study above was presented to a large group of medical doctors, the absolute majority also fell into it: 71% of all participants chose A rather than B in the positive frame, while 72% chose D rather than C in the negative frame.

In another economic experiment, the Ultimatum Game, “one player, the proposer, is endowed with a sum of money. The proposer is tasked with splitting it with another player, the responder. Once the proposer communicates their decision, the responder may accept it or reject it. If the responder accepts, the money is split per the proposal; if the responder rejects, both players receive nothing. Both players know in advance the consequences of the responder accepting or rejecting the offer.” The experiment illustrates human unwillingness to accept injustice: when it is carried out between members of a shared social group (e.g., a village, a tribe, a nation), people tend to offer “fair” (i.e., 50:50) splits, and offers of less than 30% are often rejected.
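The payoff rule of the Ultimatum Game fits in a few lines of Python. This is a hypothetical sketch (the function names are mine); the 30% threshold encodes the empirical regularity, noted above, that low offers are often rejected:

```python
def ultimatum_payoffs(endowment: float, offer: float, accept: bool):
    """Return (proposer, responder) payoffs under the ultimatum rule."""
    if not 0 <= offer <= endowment:
        raise ValueError("offer must be between 0 and the endowment")
    if accept:
        return endowment - offer, offer
    return 0.0, 0.0  # rejection destroys the whole pie

def typical_responder(endowment: float, offer: float) -> bool:
    """Empirical pattern: reject offers below roughly 30% of the pie."""
    return offer >= 0.3 * endowment

print(ultimatum_payoffs(100, 50, typical_responder(100, 50)))  # (50, 50)
print(ultimatum_payoffs(100, 10, typical_responder(100, 10)))  # (0.0, 0.0)
```

Note that a narrowly self-interested responder would accept any positive offer, since something beats nothing; the fact that real responders reject low offers, at a cost to themselves, is what makes the result interesting.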

In a derivative of the Ultimatum Game, the Dictator Game, “the first player, ‘the dictator’, determines how to split cash between themselves and the second player.” The dictator’s action space is complete, so they split the endowment entirely at their own discretion, which means that the recipient has no influence over the outcome of the game. The results (most dictators choose to send some money) demonstrate the role of fairness and social norms in economic behavior and undermine the assumption of narrow self-interest.

The results of both games resonate strongly with the belief in an innate sense of justice held by Adam Smith, the pioneer of political economy, in his Theory of Moral Sentiments.

Do state anti-corruption programs or corporate ethics and compliance programs utilize these concepts from behavioral economics (appropriate incentives, leveraging human unwillingness to accept injustice, the preference for rewards over punishments, and proper framing of incentives)? Let me know what you think in the comments.

Volodymyr Grabchak is a legal counsel with an international FMCG company, responsible, inter alia, for anti-corruption compliance issues in Ukraine and Moldova. He received his law degree in Ukraine and an LL.M. degree in the Netherlands.

Volodymyr is an attorney-at-law admitted to the Ukrainian Bar and a member of the International Bar Association (IBA).