This View of Life: Anything and everything from an evolutionary perspective.
How To Get Credible Knowledge In A Myth-Filled World
Joe Brewer is a complexity researcher who specializes in culture design. He is culture editor for This View of Life and co-founder of Evonomics.

A growing number of concerned parents have gotten wind of rumors that vaccines may cause brain damage in young children. Even though medical experts have invalidated the evidence behind this claim, it continues to challenge the credibility of science and to increase the risk of disease epidemics for everyone in society.

We live in a world filled with myths, urban legends, ideological beliefs, and strong opinions that present themselves as expertise.  We also live in a world filled with complex problems that can only be solved through highly specific scientific knowledge.  This poses a great challenge for humanity.  How do we discern credible knowledge in a myth-filled world?

Two examples show just how acute the problems are:

1) Planetary climate is changing, and the world is divided into camps based on strongly held opinions, moral judgments, and ideological beliefs.  While few people dispute the fact that extreme weather is causing all sorts of problems, many still cling to ideas that match their worldview and confuse others around them about what is really going on. The public debate about climate science rages on more than a decade after the scientific community resolved the issues and clearly found that human activities are disrupting planetary climate.

2) Health concerns arise daily, and people select medical information that conforms to their individual sensibilities — some opting to scour the peer-reviewed literature for information they can trust while others jump on the bandwagon of nutritional fads and urban myths about vaccines, herbal remedies, and naturopathic medicine.  How can they tell what will actually work best? In today’s world of ready access to information on the internet, an increasing number of people are taking medical diagnosis into their own hands—especially when they disagree with the prescriptions from their doctors, whose credibility is increasingly called into question.

I have grappled with these issues quite a lot over the last decade.  The approach I took was to study human cognition and the various ways that our minds shape what we perceive and how we act in the world. This has been informed by linguistics (how information processing in the brain shapes how we make sense of the world); psychology (the role of emotions in shaping whom we trust and how we evaluate information presented to us); anthropology (the evolutionary and tribal origins of social morality that influence how we build trust in group settings); and other research areas relevant to this topic.

As editor at TVOL, I make a point to move beyond my own perspectives — realizing from all the study mentioned above just how limited and unreliable one person’s view can be, no matter how well informed one tries to be.  In my attempts to overcome these cognitive shortcomings, I try to seek advice from people whose expertise differs from my own.  I recently interviewed Professor Harry Collins at Cardiff University about his research in the sociology of science to learn about the many kinds of expertise that have been discovered over the years.

Harry Collins recommends that everyone working to address complex technical problems should learn more about what sociology has to say about science.  A great deal is now known about the social norms that shape group decisions.  Quite a lot is known regarding the importance of status and reputation in any community of practice, scientists included.  In his own work, Collins has followed the research practices of physicists studying gravitational waves — a topic that is so profoundly shaped by theory and instrumentation that even those close to the work have great difficulty discerning what is real and knowable. Thus it is an excellent place for a sociologist to go and observe how groups of people argue, collaborate, and make progress (or not) as a social dynamic filled with intricacies and nuance.

My introduction to Collins’ work was his recent book, Are We All Scientific Experts Now?, an easy read for laypeople and researchers alike to learn about the different kinds of expertise that exist in the world.  There are many things I could say about this book.  For the sake of brevity, I will focus on what I feel is one of the most important insights explored within its pages. 

There are many different kinds of expertise, and most expertise is not learned from reading books or peer-reviewed papers.  Most of what scientists learn about their field does not come from experiments.  They learn it by attending lectures, participating in conferences, and so forth — in a word, they learn it from other scientists.

This is important for our discussion in a very specific way.  The only way to know what knowledge is trustworthy is to observe what the experts WITHIN the community of practice discern to be trustworthy.  Reading popular science books on the topic is not enough.  Even reading peer-reviewed journal articles is not enough.  Only those people “in the know” about which research is widely cited and built upon can say with authority what the state of the field is.

Why is this the case?  Because some researchers have little credibility with their peers.  Those researchers are still able to get their work published in peer-reviewed journals.  But no one in the community reads them!  This is because the type of expertise at play is local knowledge of reputation and status within the scientific field itself.  Without this knowledge to guide you, it is very easy to be led astray. This kind of social trust is the glue of human communities.

Evolutionary research on the origins of human sociality shows that reputation and status are managed by the spread of gossip, commentaries about the moral worth of different people, and—in particular—whether they are trustworthy and credible. In his book Grooming, Gossip, and the Evolution of Language, Robin Dunbar explores this topic at length. Among his key assertions is that “social grooming” among peers is a vital form of communication that helps members of a community know who is credible and who is not in situations where there are more people than each can know personally.

Let’s apply this to the examples mentioned at the beginning of this article, climate change and medicine.  It is quite easy to have an opinion about these topics before starting to search for information.  In subtle (and sometimes overt) ways, this opinion can shape how you search for information, which articles you read, and to whom you turn for expert opinion.  Every step of the way, you are influenced by your level of trust in some sources over others.  You may ultimately come to a conclusion after your “research” into the field — but you never spoke with a practicing scientist with vital social knowledge about the credibility of their peers.  And so you would not easily know if the sources you looked at, even those in scientific journals, should be taken seriously.

This is different from the problem of cheaters or people who “cook the books” to produce fake results.  Collins is referring to those people who do research deemed sufficient in quality to warrant publication, yet are not considered credible enough to draw attention from other researchers in the same field.  In this sense it is a more difficult problem to overcome.  The cranks and charlatans are easier to spot.  Those who explore topics that have been discredited by the community, yet do so with some degree of rigor, are only going to be known as “unworthy” by others with sufficient technical and social knowledge within the research community.

This is a sociological problem and it has a sociological solution.  Learn how credibility and expertise work in social systems (in other words, study the sociology of science) and apply what you have learned to your own learning process.  If you want to know what the state of knowledge is for climate change, attend a major conference of climate scientists and ask them about the major ideas they consider trustworthy.  Do this and you will get a lot farther along than reading articles and books on your own.

The same lesson applies to medicine and public health, evolutionary studies about human nature and group behavior, engineering solutions for energy and food supply, policy interventions that address poverty and inequality, or any other problem whose solutions can be informed by technical knowledge.

Applying this to the concern about vaccinations mentioned at the beginning of this article, if you want to know what is really going on it is best to consult the researchers themselves. Go to one of their gatherings (or read an official statement by one of their respected professional associations) and ask around. Do this and you will quickly learn that the lone study that kicked the entire controversy off all those years ago has since been found to have faulty data. It has been cast out by the community and lacks credibility.

In our ongoing quest to solve the big problems in the world, we will need to keep learning about the sociology of science to ensure that we find credible knowledge.  This is one small piece of the larger web of challenges in our media-saturated world.


Join the discussion


  1. Jason says:

    I enjoyed the article. Perhaps you could include information on how to find these major conferences and official statements?

    • Joe Brewer says:

      Hey Jason,

      The conferences and official statements will vary from one topic to the next, of course. So there isn’t a general list for all manner of myths floating around in daily conversation.

For the topic of vaccination, a major announcement was made by the prominent British medical journal The Lancet, as reported here:

      Journal Retracts 1998 Paper Linking Autism to Vaccines

Knowing that this journal reflects the vetted opinions of a community of experts is helpful “inside” knowledge about the trustworthiness and integrity of the facts.

      Similarly, for the debate on climate change there are several key organizations that routinely host meetings. Among them are the Intergovernmental Panel on Climate Change, Royal Meteorological Society, and National Academies of Science.

      Reaching out to the expert community can begin with noted scientists who are called before legislative committees, appointed leaders for science academies, and conference conveners for research symposia.

      Hope this helps get you started!

  2. Paul says:

    This account of scientific progress paints a picture of cozy insiders patting each other on the back and working within the same community. What about the outsiders who challenge deeply held assumptions that are invisible to the community? What about the concept of sudden revolutionary change (à la Kuhn)?

    • Joe Brewer says:

      Hey Paul,

      Your question gets at a common misunderstanding about how science works. There is no group of “cozy insiders” for arenas of scientific inquiry. Instead there are numerous different research groups (such as those you’d find in an academic department or research lab).

      Scientists are among the most vigorous debaters, constantly challenging and critiquing the findings of others in their field. Sometimes it is “friendly” competition like you might find in the sportsmanlike behavior of a good soccer match. Other times it is “hostile” and the people involved behave aggressively toward one another.

      Said another way, the NORM of science is for researchers to challenge each others’ deeply held assumptions. This is how science as a mode of inquiry progresses, after all!

      • Michael says:

Along the same lines as Paul’s question, I am curious how you think creativity plays into the scientific process. I.e., even if an arena doesn’t have a group of cozy insiders, but maintains a rigorous debate, is a large-scale “failure of imagination” still possible? Is some sort of punctuated equilibrium or gradual improvement a more accurate visual of scientific progress? Do paradigm-shifting insights usually come from within a field, from without, or as a result of cross-disciplinary efforts? (If a paradigm shift is even a valid concept.)
        All this makes me think of Adam Frank’s “About Time.” I haven’t read it for years and don’t know what I would think of it now, as it is fairly described as popsci. You seem voracious so you may have consumed it, but it discusses the co-evolution of culture and cosmology.

        • Joe Brewer says:


          Great question! It is a bit out of the boundaries for the article above, but yes it is absolutely possible for a discourse to become insular and myopic. It is also possible for ideas to “fall out of favor” only to be revived later.

          A great example of this can be seen in David Sloan Wilson’s article (also published here on TVOL today) about multi-level selection in evolutionary biology. The concept of group selection was largely dismissed and marginalized for several decades before making a revival and ultimately proving credible within the scientific community.

Paradigm shifts happen in a manner similar to how stress builds up along a fault line before an earthquake. It is a nonlinear diffusion process whereby the ideas spread and are imitated in the practices of a growing number of people. As this spreading process unfolds, those who oppose the idea become more entrenched in their techniques and perspectives — pushing back and resisting acceptance. There comes a point, just like with an earthquake, where the social system reaches a critical threshold and the paradigm “shifts” (often quite quickly relative to the time scale of the debate — so a decades-long controversy can become resolved in a span of only a few years).
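This threshold dynamic can be sketched with a simple Granovetter-style adoption model — a toy illustration chosen here for concreteness, not the specific model discussed above. Each agent adopts the new idea once the overall fraction of adopters reaches that agent’s personal threshold; the particular parameter values (threshold distribution, seed fraction) are assumptions for the sketch:

```python
import random

def threshold_cascade(n=1000, seed_frac=0.05, mu=0.3, sigma=0.15, rng_seed=1):
    """Toy Granovetter-style threshold model of idea adoption.

    Each agent has a personal threshold drawn from a (clamped) normal
    distribution and adopts the idea once the overall fraction of
    adopters reaches that threshold. A small committed seed group can
    trigger a slow buildup followed by a sudden system-wide shift.
    """
    random.seed(rng_seed)
    thresholds = [min(1.0, max(0.0, random.gauss(mu, sigma))) for _ in range(n)]
    # A small group of committed early adopters seeds the process.
    adopted = [i < int(n * seed_frac) for i in range(n)]
    history = [sum(adopted) / n]
    while True:
        frac = history[-1]
        changed = False
        for i in range(n):
            # Adopt once the visible fraction of adopters meets my threshold.
            if not adopted[i] and thresholds[i] <= frac:
                adopted[i] = True
                changed = True
        if not changed:
            break
        history.append(sum(adopted) / n)
    return history

history = threshold_cascade()
print([round(f, 2) for f in history])
```

Running this shows the earthquake-like shape described above: the adoption fraction creeps up slowly for several rounds, then jumps to near-universal adoption within one or two rounds once the critical threshold is crossed.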

          I have a YouTube video where I go into this “meme spreading” dynamic as understood through the lens of complexity science. Check it out for a longer exploration of this fascinating topic.

          • Michael says:

            Fair enough, with regard to the scope of the article. And thanks — I appreciate your thoughts and will keep thinking on this.
            Great example: I have been eagerly following the debate around multi-level selection since discovering it through “The Righteous Mind.”

  3. Adam says:

    Thank you for this article. Planning on sharing it with others, as many among us are prone to taking a single unit of frightening or compelling information to justify a tenacious change in belief and behavior. Despite multiple future exposures to information to the contrary, we are not likely to soon adjust these modes of interaction after ‘laboring’ to have set them. Your elaboration on the diverse but not uniformly distributed consensus of scientific communities is helpful. Though true that conventional wisdom has been overcome by ‘outlier’ ideologies in the past, the general self-correcting nature of the scientific endeavor seeks to provide the public with reasonable interpretations of the information available. Thank you again, Joe.

  4. Josh Mitteldorf says:

    Joe –
I think you have touched only one side of the issue. Part of the reason that lay people don’t trust establishment science is that establishment science has made some big conceptual blunders, and stuck with them, sometimes for decades. Scientists are human and they are corruptible by money and power. Some of the problem is a kind of group-think that is not deliberate, but it can lead to persistent prejudices. David’s Multilevel Selection theory has been marginalized by this effect. Another part is well-funded propaganda infiltrating scientific research. One example is the global warming deniers, paid to confuse the field by the fossil fuel industry. Another example is pharmaceutical research, which is almost all funded and conducted by companies that have billions of dollars staked on getting “the right answer”.
    – Josh Mitteldorf

  5. Ibn Alhaitham says:

After looking at the interesting question you have posed, how to get credible knowledge in a myth-filled world, I thought of answering it independently before reading the original post. Then I read your post, and here is what I came up with, for what it’s worth. The first thing that came to my mind was Richard Dawkins’s open letter to his 10-year-old daughter, in which he explained to her how we know what we know and warned her about this irrational world (see link). The key word is ‘evidence’: credible knowledge is based on experimental or observational evidence, or on both. But often we are not in a position to assess the quality of the evidence presented in support of a case, either (1) because we are completely outside the field of expertise, or (2) because our own field of expertise is too narrow to encompass another, equally specialised branch of the field, as for example when a nuclear physicist looks at some solid-state results. In cases (1) and (2), I think that a few articles on the issue from such respectable general science magazines as Scientific American, Science, Nature, and New Scientist, and from reliable newspapers with sections on science and technology such as the New York Times, The Times, The Telegraph, and The Guardian, may satisfy our needs. Such sources would provide high-quality material for thinking about the subject rationally and critically, and for reaching more or less satisfactory answers to the not-so-specific questions about it. The concerned parents with whose problem you began your post need go no further than the sources mentioned above to decide whether or not to vaccinate their child.
If, however, neither case (1) nor (2) applies to you, and you are in fact an expert in the field, then I suppose it goes without saying that you will attend relevant conferences, meet and listen to respected world experts on the matter, and review periodicals in which it is not very easy to publish.