There is much evidence to suggest that humans everywhere recognize the virtues of kindness, fairness, loyalty, respect, sharing, courage, and obedience and abhor cruelty, cheating, betrayal, subversion, hoarding, cowardice, and disobedience2,5,10. But people are often obliged to prioritize one virtue over others or to condemn some vices more than others, depending on a wide range of contextual factors and goals. And this variability is apparent also at the level of entire cultural groups, some tending historically to prize certain virtues more highly or to punish particular vices more harshly than others. Social scientists have presented countless examples of moral values that serve to reinforce locally prevailing social structures – for example, that egalitarian hunter-gatherers value sharing14, armies demand loyalty and self-sacrifice4, chiefdoms emphasize respect for natural superiors11, and affluent liberal democracies value kindness9.


At an even cruder level, it is possible to distinguish two main kinds of societies from a moral perspective: those that privilege individual rights (even at the cost of collective safety and security) and those that prioritize devotion and conformity to the group (even at the cost of personal freedoms and privileges). Durkheim associated the first kind of society with a highly elaborated division of labor in which a great diversity of human skills and abilities needed to be integrated into an organic whole, whereas deference to the group was more prominent in simple societies in which individual qualities mattered less3. A modern variant of this argument is presented by Moral Foundations Theory, which associates the individualizing virtues of care and fairness with Western, educated, industrialized, rich, and democratic (aka WEIRD) societies8 and more groupish and authoritarian moral values with traditional societies6,7. It is possible also to characterize the whole of human history in terms of shifts of moral emphasis. For instance, despotism has been said to follow a U-shaped curve in cultural evolution: while our ancestors were egalitarian apes, valuing compassion and fairness, the rise of agriculture heralded increasingly cruel and repressive empires based on conquest, slavery, and the absolute power of rulers, but in the wake of the Axial Age and the rise of more ethical religions the tide turned again in the direction of increasingly liberal and democratic social formations1. While the details of such theories could be wrong, they all suggest that moral systems are variations on a set of universal themes.

To use a nautical analogy, the relationship between universal morality and its cultural expressions may be compared to the way in which invisible anchors and chains constrain the movements of visible buoys floating on the surface of the sea. Universal moral intuitions are like anchors, invisible from the surface but immovably secured to the seabed, whereas culturally prevalent moral norms are like buoys on the surface of the water, available to direct observation. The same analogy might apply to numerous other domains of culture. For example, there is much evidence that explicit religious beliefs, including so-called ‘theologically correct’ teachings of a given tradition12, are similarly analogous to visible buoys while more intuitive, or ‘cognitively optimal’ religious concepts13, are analogous to hidden anchor points. A key question would then become whether there is some kind of interaction between different kinds of anchors and buoys. At the risk of over-extending this metaphor, we might ask whether the lines linking religious buoys and their anchors somehow get tangled up with normative buoys and moral anchors. For example, do theologically correct religious representations somehow activate our foundational moral principles and thereby amplify or constrain their expression? Efforts to investigate questions of that kind would also need to take into account the effects of environmental factors on religion and morality, ranging from drought and pestilence to institutional innovation and warfare, analogous perhaps to the effects of wind and tides on the position of buoys. Research is only now beginning to explore the massive battery of empirically tractable questions such an approach inevitably generates.


  1. Bellah, Robert N., & Joas, Hans (2012). The Axial Age and Its Consequences. Cambridge, MA: Harvard University Press.
  2. Curry, Oliver S., Mullins, Daniel, & Whitehouse, Harvey (forthcoming). Is it good to cooperate? Testing the theory of morality-as-cooperation in 60 societies. Current Anthropology.
  3. Durkheim, Émile (1893) [1933]. The Division of Labour in Society. New York: Macmillan.
  4. Durkheim, Émile (1897) [1951]. Suicide: A Study in Sociology. New York: The Free Press.
  5. Graham, J., & Haidt, J. (2010). Beyond beliefs: Religions bind individuals into moral communities. Personality and Social Psychology Review, 14, 140–150.
  6. Graham, J., Haidt, J., & Nosek, B. A. (2009). Liberals and conservatives rely on different sets of moral foundations. Journal of Personality and Social Psychology, 96, 1029–1046.
  7. Graham, J., Nosek, B. A., & Haidt, J. (2012). The moral stereotypes of liberals and conservatives: Exaggeration of differences across the political spectrum. PLoS ONE, 7(12), e50092.
  8. Henrich, J., Heine, S. J., & Norenzayan, A. (2010). The weirdest people in the world? Behavioral and Brain Sciences, 33, 61–83.
  9. McCarty, Meladee, & McCarty, Hanoch (1996). Acts of Kindness: How to Make a Gentle Difference. Florida: Health Communications.
  10. McKay, Ryan, & Whitehouse, Harvey (2015). Religion and morality. Psychological Bulletin, 141(2), 447–473.
  11. Sahlins, Marshall D. (1963). Poor man, rich man, big-man, chief: Political types in Melanesia and Polynesia. Comparative Studies in Society and History, 5(3), 285–303.
  12. Slone, D. J. (2004). Theological Incorrectness: Why Religious People Believe What They Shouldn’t. Oxford: Oxford University Press.
  13. Whitehouse, Harvey (2004). Modes of Religiosity: A Cognitive Theory of Religious Transmission. AltaMira Press.
  14. Woodburn, James (1982). Egalitarian societies. Man (N.S.), 17(3), 431–451.

This article is from TVOL’s project titled “This View of Morality: Can an Evolutionary Perspective Reveal a Universal Morality?” You can download a PDF of the project [here], comment on this article below, or comment on the project as a whole in the Summary and Overview.


Published On: May 17, 2018

Harvey Whitehouse


Harvey Whitehouse is Chair of Social Anthropology, Director of the Institute of Cognitive and Evolutionary Anthropology, and a Professorial Fellow of Magdalen College at the University of Oxford. Harvey is one of the founders of the cognitive science of religion field. He is especially well known for his theory of “modes of religiosity” that has been the subject of extensive critical evaluation and testing by anthropologists, historians, archaeologists, cognitive scientists, and evolutionary theorists. The modes theory proposes that the frequency and emotionality of rituals determine the scale and structure of religious organizations: low-frequency, highly arousing rituals bind together small but very cohesive groups of participants; high-frequency, less emotionally intense rituals create large anonymous communities that are more diffusely integrated. In recent years, Harvey’s work has expanded beyond religion to examine the role of rituals of all kinds in binding groups together and motivating inter-group competition, including warfare. This research has become increasingly global in reach with ongoing data collection now established at field sites in Singapore, Japan, New Zealand, Australia, Vanuatu, Brazil, the U.S., Spain, Cameroon, the U.K., Turkey, and Libya. Harvey is also a founding editor, and the editor for ritual variables, of Seshat: Global History Databank.


Ryan McKay


Ryan McKay is Reader in Psychology at Royal Holloway, University of London, and Principal Investigator of the Royal Holloway Morality and Beliefs Lab (MaB-Lab). He was educated at the University of Western Australia in Perth and Macquarie University in Sydney, Australia, and has held research posts in Boston (Tufts University), Belfast (Queen’s University), Zürich (University of Zürich) and Oxford (University of Oxford). He has also worked as a clinical neuropsychologist at the National Hospital for Neurology and Neurosurgery in Queen Square, London.



  • David Sloan Wilson says:

    Thanks for this entertaining commentary! Your binary distinction is similar to the distinction between tight and loose cultures studied by Michele Gelfand, as I mentioned in my comment to Elliott Sober. However, I would dispute the explanations offered by Durkheim and others that you cite, which are based on relatively modern societies. I think that the distinction is more fundamental and that variation along the tight-loose continuum can be found in all societies and in a context-sensitive fashion within all societies. The general explanation is that some situations call for tight forms of social organization and others call for loose forms. Both are within the human cultural repertoire and can be invoked rather easily.

    As for your nautical metaphor, it’s awfully complicated! Would you be able to state its predictions in less metaphorical and more explicitly evolutionary terms?

    • Ryan McKay says:

      Thanks, David. It’s no accident that our nautical metaphor is complicated, as our aim was, in part, to try and convey the complexity of some of these issues, which is sometimes underappreciated. Take, for example, the vaunted relationship between religion and morality. In the public sphere there is no shortage of confident pronouncements on this topic, ranging from those who confidently assert that morality requires religion (e.g., Laura Schlessinger’s claim that “It is simply impossible for people to be moral without religion or God”) to those who claim that raising children as religious is a “grievous wrong” (Richard Dawkins). But the confidence of such commentators belies the complexity of the issues. If we resist treating “religion” and “morality” as monolithic entities and instead fractionate them into smaller cognitive and cultural units (anchors and buoys), then the relationship between religion and morality fans out into a matrix of separate relationships between fractionated elements. Thus some aspects of “religion” may promote some components of “morality,” just as others serve to suppress or obstruct the same, or different, components. As we note in our piece above, this approach generates a massive battery of research questions.

  • Mark Sloan says:

    Harvey and Ryan,

    That culturally prevalent moral norms are like buoys immovably secured to anchors on the seabed is an interesting analogy. Do you see “morality as cooperation” as defining what those anchor points are (meaning these anchors were selected for by the benefits of cooperation they produced), or are there additional sources?

    • Ryan McKay says:

      Thanks, Mark. Candidate anchors should confer some adaptive advantage, and this may indeed involve facilitating cooperation. Fairness seems to meet this criterion, e.g., Nicolas Baumard and colleagues have argued that fairness preferences are adapted to an environment in which individuals competed to be selected and recruited for mutually advantageous cooperative interactions. But yes, there may be other adaptive advantages. For example, perhaps a tendency to overinfer the presence of intentional, monitoring agents undergirds at least some “moral” behaviour. Some theorists (e.g., Stewart Guthrie, Justin Barrett) have suggested that this tendency was selected for not because it promoted cooperation, but because mistaking an agent (e.g., a hidden observer) for an inanimate object (e.g., a tree rustling in the wind) was typically more costly in ancestral environments than the converse error.

      • Mark Sloan says:

        Hi Ryan,

        As part of our moral sense, people certainly have “a tendency to overinfer the presence of intentional, monitoring agents (hidden observers which) undergirds at least some ‘moral’ behavior”, such as the imagined presence of condemning gods or angered spirits. But such imagined presences can also be explained as a by-product of our biologically selected-for sense of guilt – the experience of feeling bad, sometimes very bad, about something ‘immoral’ one has done even though no one else knows about it. I’d have to think about how to show conclusively that the “by-product of guilt” answer is more scientifically correct.

        Certainly the bigger-scope hypothesis (explaining all moral anchors, not just one) is that all moral anchors exist because they increase the benefits of cooperation. So far as I know, there are no facts about morality that contradict this hypothesis, and it has remarkable explanatory power. If you are aware of any moral anchors that might be counterexamples to the “selected for by the benefits of cooperation” hypothesis, please let me know.

        • Ryan McKay says:

          Hi again Mark,

          One obvious candidate anchor is the “purity” anchor (imported more-or-less wholesale from moral foundations theory, which views “sanctity/degradation” as a core moral foundation). As I understand it, the MFT theorists view this “anchor” (again, anchor is our term; they use an architectural rather than nautical metaphor) as an evolved solution to the recurrent adaptive problem of how to avoid communicable diseases. In our metaphor, the “purity” anchor would connect up with a range of different buoys – prescriptions and proscriptions concerning food, sex and death (e.g., “no cooking during menses”, “bury the dead”, “avoid the sin of Sodom” etc.). I’m not sure how you’d tell a “cooperation” story about these kinds of practices?

          • Mark Sloan says:

            Hi Ryan,

            The cooperation story about the purity anchor is that purity norms can be selected for as markers of membership in an in-group (when out-groups have different norms) and markers of commitment to the in-group (when the purity markers require self-denial or even suffering). These markers of membership and commitment to the in-group increase cooperation by denoting reliable people to cooperate with – the heart of human morality.

            Of course, cultural norms can have multiple selection (and anti-selection!) forces. Avoiding communicable diseases is certainly one selection force for some purity norms.

            What distinguishes cultural moral norms is that violators are commonly thought to deserve punishment (though they may not actually be punished). Motivation to punish violators is an innate part of our moral sense’s judgements about right and wrong and a necessary component of reciprocity strategies.

            Understanding their importance to cooperation can be highly revealing for purity norms such as “no cooking during menses”, “homosexuality is evil”, and Female Genital Mutilation, which 1) may not actually reduce the risk of disease but 2) whose violation is commonly thought to deserve punishment.

            Arguably all cultural moral norms (norms whose violation is commonly thought to deserve punishment) have the benefits of cooperation for an in-group as a selection force. Purity moral norms, as you point out, may have an avoidance of disease selection force also.

            Thanks for reminding me about the importance of multiple selection forces for moral “anchors”!

  • Andy Norman says:

    Harvey and Ryan-
    I too find the anchors-and-buoys analogy intriguing. In particular, I think it’s worth exploring the idea that explicit religious doctrines are like buoys–anchored by something deeper and more attuned to what’s (“really”) right and wrong, but often mistaken for the moral anchors themselves. Fear of being morally adrift certainly seems to motivate attachment to some religious doctrines.

    I have a strong intuition that this mistaking-of-the-one-for-the-other can cause significant moral disorientation. In extreme cases, it can manifest as moral derangement or religious zealotry; in milder cases, it can lead decent, law-abiding Americans to oppose the regulation of military-grade assault rifles.

    I suppose we’re all prone to structurally similar mistakes. Even secular humanists like myself rely on buoy-like rules of thumb (like “Always keep an open mind”) to orient ourselves morally. And of course we too can mistake a buoy-like rule of thumb for a deeper and more anchor-like moral principle. In this way, we can all suffer moral confusion…

  • Harry Lewis says:

    Much of our present research involves applying the theory to political “cultures” such as those of liberals and conservatives. The current American culture war, we have found, can be seen as arising from the fact that liberals try to create a morality relying primarily on the Care/harm foundation, with additional support from the Fairness/cheating and Liberty/oppression foundations. Whence morality? That is a question which has troubled philosophers since their subject was invented. Two and a half millennia of debate have, however, failed to produce a satisfactory answer.
