Have you noticed how political campaigns are run? Take any major democracy across the world and you will notice some striking similarities. The candidate is presented as a study in perfection---crafted to fit a particular image that voters may have in their minds. He or she is trustworthy, ambitious, patriotic, caring, family-oriented, strong but empathetic, and willing and able to understand (and fix!) the ordinary person's concerns. Regardless of whether the candidate is on the left or right of the political spectrum, he or she is introduced in a similar fashion, accentuating the same traits and accomplishments. And for the rest of the campaign, every major political speech, every advertisement, every event, and every debate is staged to reinforce that core message. The tactic works quite well because one of the glitches built into the human mind is that repetition leads people to believe a message. It is, in fact, one of the easiest methods of persuasion. This phenomenon is called the illusory truth effect.
The illusory truth effect
A recent example of the illusory truth effect comes from Donald Trump's 2016 presidential campaign. Across debates and interviews, Trump repeatedly used the phrase "Crooked Hillary." Among his accusations, Trump claimed that Hillary Clinton had ties to Wall Street, used her private email server to send government-related messages, had conflicts of interest with the Clinton Foundation, and was responsible for the security lapses that led to the 2012 attack on the U.S. diplomatic compound in Benghazi. A PolitiFact analysis found that a whopping 70 percent of Trump's fact-checked statements during the campaign were false, while just 26 percent of Clinton's were deemed false. And yet, Trump succeeded in convincing large segments of the population that she was indeed the "crooked" one.

The illusory truth effect was first documented in a 1977 study by Villanova University and Temple University researchers Lynn Hasher, David Goldstein and Thomas Toppino. For the study, volunteers rated a series of trivia statements as either true or false. Weeks later, the experimenters brought the subjects back and had them rate another list of statements, some of which were new and some of which were repeats from the original session. Time and again, people rated statements they had seen before as true, regardless of whether they actually were. The researchers argued that familiar statements are easier to process than novel, unknown ones, and so, when assessing whether a statement is true, people unconsciously rely on information that sounds like something they have heard before.

More recently, psychologists Gordon Pennycook and David Rand examined the illusory truth effect in experiments using fake news headlines from the 2016 U.S. presidential campaign. Participants in their study were shown an equal number of fake and real headlines and asked how accurate each was. An example of a fake news headline was a picture of U.S. Vice President Mike Pence with the caption "Gay conversion saved my marriage." An example of a real news headline was a picture of Russian President Vladimir Putin with the caption "Vladimir Putin 'personally involved' in US hack, report claims." After judging whether the headlines were real or fake, participants were distracted with another task for a while. They were then given a list of 24 headlines to evaluate, which included all of the headlines they had seen previously as well as a set of completely new (real and fake) headlines. The results showed that when participants had previously been exposed to a fake news headline, they were more likely to accept it as true later on. Every time a lie is repeated, it becomes slightly more believable to most people. The headline "Trump to Ban All TV Shows that Promote Gay Activity Starting with Empire," for example, was rated as accurate by only 5 percent of subjects in Pennycook and Rand's study; a single prior exposure to that headline doubled the number of participants rating it as true.
Using repetition to signal truth
Repetition is a key tool that managers across all industries use to drive their message home. Management journals have consistently concluded that employees act on a message only after they've heard executives repeat it multiple times. The next time you're walking down the corridors of a major corporation or government office, pay attention to the signage on the walls. You'll notice the same signs appear many times over, and the messages on those signs were probably also explained in company town halls, shared via company email distribution lists, and highlighted on the company intranet. An analysis of Fortune 500 annual reports and quarterly earnings press releases would likely reveal plenty of repetition as well. Business leaders know that a single exposure to a message is nowhere near enough. Psychologists who have studied this phenomenon more closely have found that a message needs to be repeated between 10 and 20 times for maximum buy-in; beyond that point, preference turns into annoyance.

We all know perfectly well that simply hearing something over and over again doesn't mean it is true, yet we fall for this persuasive tactic anyway. Why? It might seem like pure mental dysfunction, but if we dig deeper into the psychology behind the phenomenon, we'll see that the illusory truth effect is a symptom of a bias we have toward things that are familiar to us. The more often we hear something, the more familiar it becomes, and familiarity breeds trust. There was, and still is, a certain logic to this. Trusting the situations, things and people we were familiar with is thought to have been a good survival tactic for our early ancestors. We roamed in small tribes, vulnerable to many dangers, including other tribes we might encounter at any moment, and trusting those in our group and bonding deeply with them was key to making our way through those challenges. The problem is that our minds now lean too heavily on familiarity as a shorthand for truth.
How to resist the illusory truth effect
So here are some ways to get savvy and protect yourself from alternative facts:

Protect yourself through inoculation: Inoculation is analogous to vaccination, in which people acquire immunity to a disease by being exposed to a weakened form of it. Similarly, people can develop resistance to misinformation by being exposed to a small amount of it. Research has found that the most effective way to inoculate someone is a two-pronged approach: general inoculation, simply warning that the information may be misleading, and specific inoculation, highlighting the particular claims that are false. The researchers provided evidence that through this process, people can be "vaccinated" against fake news.

Use reflection as a tactic to override bias: Researchers have found that the ability to reflect predicts whether a person can distinguish facts from alternative facts. When forming an opinion, consciously choose to delay arriving at your judgment. Let all the available information sink in and deliberately reflect on it.

Resist repetition: The first step in overriding the bias created by repetition is recognizing that the bias exists and may be coloring your judgment. Don't get swept up by the momentum of alternative facts. Resist it.

Engage openly with dissent: Actively engage with all parties involved in a particular disagreement. Try to get as close to a 360-degree view of the issue as you can before you form any judgments.

Challenge opinion and ask for facts: For business leaders, enforce an evidence-driven approach to decision-making within your company. Ask for evidence when strong opinions are expressed. Vigorously challenge opinions that are not grounded in fact, and don't give them the same weight as those based on hard evidence.