Themes

Cognitive dissonance and moral distress

What is this about?

Because of structural imperatives that overemphasize the goods of efficiency (number of publications, h-index), researchers may feel it is impossible to do justice to principles and values related to research integrity (e.g. taking the time to improve the quality of one publication rather than publishing as much as possible). In such a situation, a researcher experiences cognitive dissonance and moral distress. The psychological notion of cognitive dissonance refers to the mental discomfort experienced by someone who holds two or more contradictory beliefs, ideas, or values. The ethical concept of moral distress denotes the experience of a person who knows what the right thing to do is, but is (or feels) unable to act accordingly.

Why is this important?

In participating in the communal practice of science, we have to accept certain standards of excellence (related to values, such as truth) and rules to follow (such as giving an accurate account of each author’s contribution). Thus, we are likely to experience cognitive dissonance or moral distress when confronted with conflicting imperatives (for instance, the expectation to grant authorship to one’s superior, even if she did not contribute to the specific paper). Cognitive dissonance theory holds that when we experience cognitive dissonance or moral distress, we tend to justify our behavior. The more often we engage in justifying our unethical behavior, the more we perceive this behavior as already justified, and the more likely we are to engage in it again.

Although we will always be blind to our own ignorance to a certain degree, we can learn to recognize our self-justification strategies as indicators of our (evolving) vices. By recognizing why we engage in self-justification strategies and how they influence our decision-making, we can foster the conditions for good research.

Virtue ethics emphasizes that we need to develop virtues in order to deal with imperatives that are detrimental to good research (1). According to MacIntyre, “virtues serve three functions: to enable individuals to achieve excellence in practice, to protect the practice from threat of corruption by goods of efficiency, and to be constitutive components of the good human life” (2, p. 226-8). So virtues can be seen as crucial to counter corruptive tendencies in the research system (3, 4).

Cultivating sensitivity to cognitive dissonance and moral distress is an important element of research integrity education (5, 6). It may support us in our attempts to find the right mean between being too lenient and being too harsh on ourselves. Where that mean lies depends on situational factors as well as the individual capabilities of the researcher. Finding it is not something we can learn solely by understanding the underlying dynamics; it has to be learned in practice, over and over again. If we keep in sight the goods of excellence to be achieved, we can be prepared not to be discouraged when we fail to assess a situation appropriately, but rather use any mistake we make as a means to fine-tune our cognitive strategies and moral behavior.

For whom is this important?

Students, Researchers, Supervisors, Research integrity trainers

What are the best practices?

In their virtue-based model of ethical decision-making, Crossan et al. outline how a virtue-based orientation may serve as a source of resilience for individuals navigating between high situational pressures and demands for ethical behavior (7).

Medeiros et al. give an overview of cognitive biases prevalent among university faculty (8). Mecca et al. provide valuable insights into the efficacy of a training intervention based on the findings of Medeiros et al. (9).

Cassam recently introduced an account of how epistemic vices may influence unethical decision-making (10). Moreover, he gives an overview of how these vices may be corrected (see chapter 8, “Self-improvement”, pp. 167–187).

References

(1) Hicks, D. J., & Stapleford, T. A. (2016). The Virtues of Scientific Practice: MacIntyre, Virtue Ethics, and the Historiography of Science. Isis, 107(3), 449–472. https://doi.org/10.1086/688346

(2) MacIntyre, A. C. (2014). After virtue. London: Bloomsbury.

(3) Davies, S. R. (2018). An Ethics of the System: Talking to Scientists About Research Integrity. Science and Engineering Ethics, 1-19. https://doi.org/10.1007/s11948-018-0064-y

(4) Nakamura, J., & Condren, M. (2018). A systems perspective on the role mentors play in the cultivation of virtue. Journal of Moral Education, 47(3), 316–332. https://doi.org/10.1080/03057240.2018.1444981

(5) Carr, D. (2017). Virtue and Character in Higher Education. British Journal of Educational Studies, 65(1), 109–124. https://doi.org/10.1080/00071005.2016.1224806

(6) Peters, R., & Filipova, A. (2009). Optimizing Cognitive-Dissonance Literacy in Ethics Education. Public Integrity, 11(3), 201–220. https://doi.org/10.2753/PIN1099-9922110301

(7) Crossan, M., Mazutis, D., & Seijts, G. (2013). In Search of Virtue: The Role of Virtues, Values and Character Strengths in Ethical Decision Making. Journal of Business Ethics, 113(4), 567–581. https://doi.org/10.1007/s10551-013-1680-8

(8) Medeiros, K. E., Mecca, J. T., Gibson, C., Giorgini, V. D., Mumford, M. D., Devenport, L., & Connelly, S. (2014). Biases in Ethical Decision Making among University Faculty. Accountability in Research, 21(4), 218–240. https://doi.org/10.1080/08989621.2014.847670

(9) Mecca, J. T., Medeiros, K. E., Giorgini, V., Gibson, C., Mumford, M. D., & Connelly, S. (2016). Biases and Compensatory Strategies: The Efficacy of a Training Intervention. Ethics & Behavior, 26(2), 128–143. https://doi.org/10.1080/10508422.2014.997878

(10) Cassam, Q. (2019). Vices of the Mind. Oxford: Oxford University Press.

Armin Scholmüller and Guy Widdershoven contributed to this theme.

Latest contribution was August 20, 2019