Avoiding Blind Spots
Motivational blindness is the tendency to overlook the unethical actions of others when it is against our own best interests to notice them. The "want" self, the part of us that acts out of self-interest and often without regard for moral principles, is silent during the planning stage of a decision but typically emerges and dominates at the moment of choice. Organizations can monitor whether they are creating institutions, structures, and incentives that increase the likelihood of unethical actions, while individuals can "precommit" to their intended ethical choices.
The difference in perspective between leadership teams and the rest of the organization can, in nearly half of the situations we have observed, be equivalent to the impact of a year's worth of working on health. A typical leadership team that sees the organization only through its own (biased) view will therefore feel less urgency and commitment to change, because it believes it is already a year ahead. Clearly, making decisions based on the view of top leadership alone is likely to result in focusing on the wrong things. There is no longer any reason for leaders to make decisions based solely on the part of the elephant that they can see. By expanding their view, they can drive value creation and resource allocation much more effectively across their organization. And there is one final benefit, based on an insight from behavioral science: the individuals you involve are more likely to be committed to the change, because they have helped to shape it.
In particular, temporal distance influences the extent to which people believe they will be able to follow their moral compass. When thinking about the future, we are more likely to think about how we want to behave and, as a result, to consider our ethical values. Individuals therefore overestimate the extent to which they will behave morally in the future. Bazerman, Tenbrunsel, and Wade-Benzoni attribute such forecasting errors to the tension between the side of us that wants immediate gratification, the 'want self,' and the side that wants to make responsible and ethical decisions, the 'should self.' Before making a decision, people predict they will behave in accordance with their 'should self,' a choice that supports their moral self-view (Chugh, Bazerman, & Banaji, 2005). When it is time to decide, however, the 'want self' becomes dominant: the immediate gains from the unethical act become much more salient in the present, while the ethical implications fade away. In the post-decision phase, the ethical implications of the decision arise again and the 'should self' reemerges. As they evaluate their own actions, people reduce the dissonance that results from behavior that contradicts their values. Through 'psychological cleansing,' a process of moral disengagement, individuals can activate or deactivate their ethical values. For instance, Shu, Gino, and Bazerman found that when individuals were given an opportunity to cheat and did so, they came to view cheating as more acceptable. Similarly, individuals engage in 'motivated forgetting' of moral rules after engaging in wrongdoing (Bertrand, Chugh, & Mullainathan, 2005). These distortions help individuals close the gap between their unethical behavior and their moral self-image. In sum, temporal inconsistencies prevent us from being as ethical as we desire to be.
Because of such behavioral forecasting errors, we systematically overestimate how ethically we will act in future situations. This tendency is compounded by biased implicit attitudes toward other people, which lead us to reconstruct information so that our actions appear ethical. Furthermore, self-interest can tilt the scales of fairness in our own favor: our egocentric nature leads us to overclaim resources and to overly discount the future of the environment and society. Only when people realize just how susceptible we all are to unethical behavior will we be able to bridge the gap between who we are and who we want to be.
Banaji MR, Bazerman MH, Chugh D: How (un)ethical are you? Harvard Business Rev 2003, 81:56-65.
Chugh D, Bazerman MH, Banaji MR: Bounded ethicality as a psychological barrier to recognizing conflicts of interest. In Conflict of Interest: Challenges and Solutions in Business, Law, Medicine, and Public Policy. Edited by Moore D, Cain D, Loewenstein G, Bazerman M. Cambridge University Press; 2005:74-95.
Banaji MR, Greenwald AG: Blindspot: Hidden Biases of Good People. New York: Delacorte Press; 2013.
Bertrand M, Chugh D, Mullainathan S: Implicit discrimination. Am Econ Rev 2005, 95:94-98.