But it remained for the Jews, with their unqualified capacity for falsehood, and their fighting comrades, the Marxists, to impute responsibility for the downfall precisely to the man who alone had shown a superhuman will and energy in his effort to prevent the catastrophe which he had foreseen and to save the nation from that hour of complete overthrow and shame. By placing responsibility for the loss of the world war on the shoulders of Ludendorff they took away the weapon of moral right from the only adversary dangerous enough to be likely to succeed in bringing the betrayers of the Fatherland to Justice.
All this was inspired by the principle—which is quite true within itself—that in the big lie there is always a certain force of credibility; because the broad masses of a nation are always more easily corrupted in the deeper strata of their emotional nature than consciously or voluntarily; and thus in the primitive simplicity of their minds they more readily fall victims to the big lie than the small lie, since they themselves often tell small lies in little matters but would be ashamed to resort to large-scale falsehoods.
It would never come into their heads to fabricate colossal untruths, and they would not believe that others could have the impudence to distort the truth so infamously. Even though the facts which prove this to be so may be brought clearly to their minds, they will still doubt and waver and will continue to think that there may be some other explanation. For the grossly impudent lie always leaves traces behind it, even after it has been nailed down, a fact which is known to all expert liars in this world and to all who conspire together in the art of lying. — Adolf Hitler, Mein Kampf, vol. I, ch. X, trans. James Murphy (Project Gutenberg of Australia; archived from the original on 24 July 2008).
Later, Joseph Goebbels put forth a slightly different theory which has come to be more commonly associated with the expression "big lie". Goebbels wrote the following paragraph in an article dated 12 January 1941, 16 years after Hitler's first use of the phrase. The article, titled Aus Churchills Lügenfabrik (English: "From Churchill's Lie Factory"), was published in Die Zeit ohne Beispiel:

The essential English leadership secret does not depend on particular intelligence. Rather, it depends on a remarkably stupid thick-headedness. The English follow the principle that when one lies, one should lie big, and stick to it. They keep up their lies, even at the risk of looking ridiculous.
In a remarkable insight into how Hitler used the big lie, the United States Office of Strategic Services prepared a report entitled "A Psychological Analysis of Adolph Hitler: His Life and Legend," which said:

His primary rules were: never allow the public to cool off; never admit a fault or wrong; never concede that there may be some good in your enemy; never leave room for alternatives; never accept blame; concentrate on one enemy at a time and blame him for everything that goes wrong; people will believe a big lie sooner than a little one; and if you repeat it frequently enough people will sooner or later believe it.

Hitler was able to implement the big lie because he had "a matchless instinct for taking advantage of every breeze to raise a political whirlwind. No official scandal was so petty that he could not magnify it into high treason; he could ferret out the most deviously [unreadable] corruption in high places and plaster the town with the bad news." Id. Hitler's ability to "repudiate his own conscience in arriving at political decisions has eliminated the force which usually checks and complicates the forward-going thoughts and resolutions of most socially responsible statesmen." Id. Moreover, Hitler had the ability to "persuade others to repudiate their individual consciences." Id.
Today, it is an unfortunate truth that the practice of the big lie is employed by President Trump and his supporters and is seen in the media. See, Did Fake News On Facebook Help Elect Trump? Here's What We Know; Social media and fake news in the 2016 election. Simply put, Trump lies continuously and without a second thought. All false statements involving Donald Trump. The number of these lies vastly exceeds the lies of previous presidents. Glenn Kessler of the Washington Post compiled a list of more than 2,000 misleading or false statements in Trump's first 355 days in office. Leonhardt, et al., of the New York Times, using a much more conservative definition of false statements, compiled 103 separate untruths during Trump's first ten months in office. These lists often include flip-flops, self-contradictions, unwarranted credit taking, and exaggerations.

That is half the question. The other, perhaps more important, question is: why do people believe the big lie?
Scholars have known for decades that people tend to search for and believe information that confirms what they already think is true. The new elements are social media and the global networks of friends who use it. People let their guard down on online platforms such as Facebook and Twitter, where friends, family members, and coworkers share photos, gossip, and a wide variety of other information. That is one reason why people may fall for false news, as S. Shyam Sundar, Distinguished Professor of Communication and Co-Director of the Media Effects Research Laboratory at Pennsylvania State University, explains in "Why we believe fake news," The Conversation. Another reason: people are less skeptical of information they encounter on platforms they have personalized — through friend requests and "liked" pages, for instance — to reflect their interests and identity. Sundar characterizes his research findings in this way: "We discovered that participants who had customized their news portal were less likely to scrutinize the fake news and more likely to believe it."
A growing body of research also indicates that repeated exposure to false statements can lead people to believe those falsehoods. An experimental study led by Vanderbilt University assistant professor of psychology Lisa Fazio showed that people are sometimes more likely to believe repeated falsehoods than their own knowledge about a topic. For example, even after study participants had answered correctly that the short pleated skirt worn by Scots is called a kilt, their chances of believing the false statement "A sari is the name of the short pleated skirt worn by Scots" increased after they read that sentence multiple times. In another study, subjects rated how certain they were that 60 statements were true or false. Some statements were repeated; others were not. Repeated statements were more likely to be rated true than statements that were not repeated.
Similarly, a study forthcoming in the Journal of Experimental Psychology: General reported: "Using actual fake news headlines presented as they were seen on Facebook, we show that even a single exposure increases subsequent perceptions of accuracy, both within the same session and after a week. Moreover, this 'illusory truth effect' for fake news headlines occurs despite a low level of overall believability, and even when the stories are labeled as contested by fact-checkers or are inconsistent with the reader's political ideology. These results suggest that social media platforms help to incubate belief in blatantly false news stories, and that tagging such stories as disputed is not an effective solution to this problem. Interestingly, however, we also find that prior exposure does not impact entirely implausible statements (e.g., 'The Earth is a perfect square'). These observations indicate that although extreme implausibility is a boundary condition of the illusory truth effect, only a small degree of potential plausibility is sufficient for repetition to increase perceived accuracy. As a consequence, the scope and impact of repetition on beliefs is greater than previously assumed."
Perhaps scariest is what happens when people are confronted with true information that contradicts the lie. In their well-cited 2010 study, "When Corrections Fail: The Persistence of Political Misperceptions," political scientists Brendan Nyhan and Jason Reifler found that people sometimes hold more firmly to false beliefs when confronted with factual information. For example, when political conservatives were presented with correct information about the absence of weapons of mass destruction in Iraq, they were even more likely to believe Iraq had those weapons. As the study concludes, corrections frequently fail to reduce misperceptions among the targeted ideological group and can actually increase misperceptions among the group in question.