Brainwashing

Brainwashing refers to the systematic application of coercive techniques to change the beliefs or behavior of one or more people, usually for political or religious purposes. The term was originally used in the United States to explain why, compared to earlier wars, a relatively high percentage of captured American prisoners of war during the Korean War defected to the Communists. American alarm was allayed after prisoners were repatriated and it was learned that few of them retained allegiance to the Marxist and anti-American doctrines that had been inculcated during their incarceration. Later analysis determined that the primary methods employed on them during their imprisonment included sleep deprivation and other intense psychological manipulations designed to break down their individual autonomy. When rigid control of information ended and the former prisoners' natural methods of reality testing could resume, the superimposed values and judgments rapidly attenuated. This raised the question of whether these changes had been merely a facade, or whether the prisoners' core beliefs had genuinely been altered for the duration of their incarceration.

Whether any techniques exist that change thought and behavior to the degree connoted by the term "brainwashing" became a controversial issue in the 1970s. Accusations that new religious movements, or "cults," employed similar techniques to gain and retain members fueled this argument. Extensive research proved inconclusive, and although the term continues in popular parlance, brainwashing remains more a fiction than a reality. Although it is undeniable that human beings are susceptible to many forms of social influence, they are also endowed with free will and the ability to choose what to accept as truth and how to interpret their experience of the world.

Origin of the term

The term brainwashing first came into use in the United States in the 1950s, during the Korean War, to describe the methods applied by the Chinese communists in their attempts to produce deep and permanent behavioral changes in foreign prisoners, and to disrupt the ability of captured United Nations troops to effectively organize and resist their imprisonment. The Chinese term xǐ nǎo (洗脑, literally "to wash the brain") was first applied to methodologies of coercive persuasion used in the "reconstruction" of the so-called feudal thought patterns of Chinese citizens raised under pre-revolutionary regimes.

Coercive persuasion had been seen during the Inquisition, and in the show trials against "enemies of the state" in the Soviet Union. However, the term brainwashing emerged only when the methodologies of these earlier movements were systematized during the early decades of the People's Republic of China. Until that time, descriptions had been limited to concrete accounts of specific techniques.

In later times, the term "brainwashing" came to apply to other methods of coercive persuasion, and even to the effective use of ordinary propaganda and indoctrination. In the formal discourses of the Chinese Communist Party, the more clinical-sounding term "sī xiǎng gǎi zào" (thought reform) came to be preferred.

Later use

Popular speech continues to use the word "brainwashed" informally and pejoratively to describe persons subjected to intensive influence resulting in the rejection of old beliefs and the acceptance of new ones, or to describe someone who holds strong ideas that seem implausible and resistant to evidence, common sense, experience, and logic. Such popular usage often implies a belief that the ideas of the allegedly brainwashed person developed under some external influence such as books, television programs, television commercials (as producing "brainwashed consumers"), video games, religious groups, political groups, or other people. People have also come to use the terms "brainwashing" or "mind control" to explain the otherwise intuitively puzzling success of some methodologies for the religious conversion of inductees to new religious movements (including cults).

The term "brainwashing" is not widely used in psychology and other sciences, because of its vagueness and history of being used in propaganda, not to mention its association with hysterical fears of people being taken over by foreign ideologies. What is commonly called "brainwashing" may be better understood as a combination of manipulations to promote attitude change, including persuasion, propaganda, coercion, and restriction of access to neutral sources of information. It should be noted that many of these techniques are more subtly used (usually unconsciously) by advertisers, governments, schools, parents, and peers. In other words, such "brainwashing" is no more than the natural process of socialization.

Political brainwashing

The use of coercive persuasion techniques in China

The Communist Party of China used the phrase "xǐ nǎo" ("to wash the brain") to describe their methods of persuading those who did not conform to the Party message into orthodoxy. The phrase was a play on "xǐ xīn" (洗心 "to wash the heart"), a phrase found in many Daoist temples exhorting the faithful to cleanse their hearts of impure desires before entering.

Although American attention came to bear on thought reconstruction or brainwashing as a result of the Korean War, the techniques had been used on ordinary Chinese citizens since the establishment of the People's Republic of China (PRC). The PRC had refined and extended techniques used earlier in the Soviet Union to prepare prisoners for show trials, and the Soviets in turn had learned much from the Inquisition. In the Chinese context, these techniques had multiple goals that went far beyond the simple control of those in the prison camps of North Korea. They aimed to produce confessions, to convince the accused that they had indeed perpetrated anti-social acts, to make them feel guilty of crimes against the state, to make them desirous of a fundamental change in outlook toward the institutions of the new communist society, and, finally, to actually accomplish these desired changes. The ultimate goal driving these extreme efforts was the transformation of an individual with a feudal or capitalist mindset into a "right thinking" member of the new social system, or, in other words, to transform what the state regarded as a criminal mind into what the state could regard as a non-criminal mind.

To that end, brainwashers sought techniques that would break down the psychic integrity of the individual with regard to information processing, information retained in the mind, and values. Chosen techniques included dehumanizing individuals by keeping them in filth, sleep deprivation, partial sensory deprivation, psychological harassment, inculcation of guilt, group social pressure, and so forth. These methods of thought control proved extremely useful in gaining the compliance of prisoners of war. Key elements in their success included tight control of the information available to prisoners, and tight control over their behavior.

In September 1950, the Miami Daily News published an article by Edward Hunter (1902-1978) entitled "'Brain-Washing' Tactics Force Chinese into Ranks of Communist Party." It contained the first printed use of the English-language term "brainwashing," which quickly became a stock phrase in Cold War headlines. An additional article by Hunter on the same subject appeared in New Leader magazine in 1951. Hunter, a CIA propaganda operator who worked undercover as a journalist, turned out a steady stream of books and articles on the subject. In 1953, Allen Welsh Dulles, then director of the CIA, explained that "the brain under [Communist influence] becomes a phonograph playing a disc put on its spindle by an outside genius over which it has no control."

In his 1956 book Brain-Washing: The Story of the Men Who Defied It, Edward Hunter described "a system of befogging the brain so a person can be seduced into acceptance of what otherwise would be abhorrent to him." According to Hunter, the process is so destructive of physical and mental health that many of his interviewees had not fully recovered after several years of freedom from Chinese captivity.

Later, two studies of the Korean War defections, by Robert Lifton (1961) and Edgar Schein (1961), concluded that brainwashing had only a transient effect when used on prisoners of war (POWs). Lifton and Schein both found that the Chinese did not engage in any systematic re-education of prisoners, but generally used their techniques of coercive persuasion to disrupt the prisoners' ability to organize, maintain morale, and attempt escape. The Chinese did, however, succeed in having some of the prisoners make anti-American statements by placing them under harsh conditions of physical and social deprivation and disruption, and then by offering them more comfortable situations such as better sleeping quarters, better food, warmer clothes, or blankets. Nevertheless, the psychiatrists noted that even these measures of coercion proved quite ineffective in changing basic attitudes for most people.

In essence, the prisoners did not actually adopt Communist beliefs. Rather, many of them behaved as though they did in order to avoid the plausible threat of extreme physical abuse. Moreover, the few prisoners influenced by Communist indoctrination apparently succumbed as a result of the confluence of the coercive persuasion with motives and personality characteristics that had existed before imprisonment. In particular, individuals with very rigid systems of belief tended to snap and realign, whereas individuals with more flexible systems of belief tended to bend under pressure and then restore themselves when the external pressures were removed.

Terrible though the process was for individuals imprisoned by the Chinese Communist Party, these attempts at extreme coercive persuasion ended with a reassuring result: They showed that the human mind has an enormous ability to adapt to stress and a powerful homeostatic capacity. Reactions to attempts by the state to reform them showed that most people would change under pressure and would change back when the pressure was removed. An additional finding was that some individuals derived benefit from these coercive procedures because the interactions, perhaps as an unintended side effect, actually promoted insight into dysfunctional behaviors that were then abandoned.

Thus, although the use of brainwashing on United Nations prisoners during the Korean War produced some propaganda benefits, its main utility to the Chinese lay in significantly increasing the number of prisoners that one guard could control, thus freeing other Chinese soldiers to go to the battlefield.

Mass brainwashing

In societies where the government maintains tight control of both the mass media and education system and uses this control to disseminate propaganda on a particularly intensive scale, the overall effect can be to "brainwash" large sections of the population. This is particularly effective where nationalist or religious sentiment is invoked and where the population is poorly educated and has limited access to independent or foreign media.

Refutation of political brainwashing

Dick Anthony, a research and forensic psychologist, claimed that the CIA invented the brainwashing ideology as a propaganda strategy to undercut communist claims that American prisoners of war in Korean communist camps had voluntarily expressed sympathy for communism, and that definitive research demonstrated that collaboration by western POWs had been caused by fear and duress, and not by brainwashing (Anthony 1990). He argued that the CIA brainwashing theory was pushed to the general public through the books of Edward Hunter, who was a secret CIA psychological warfare specialist passing as a journalist. He further asserted that for twenty years, starting in the early 1950s, the CIA and the Defense Department conducted secret research in an attempt to develop practical brainwashing techniques (possibly to counteract the brainwashing efforts of the Chinese), and that the attempt was a failure.

Brainwashing controversy in new religious movements and cults

In the 1960s, after coming into contact with new religious movements (NRMs), popularly referred to as "cults," young people suddenly adopted faiths, beliefs, and behavior that differed markedly from their previous lifestyles and seemed at variance with their upbringings. In some cases, these people neglected or even broke contact with their families, who found these changes very strange and upsetting. To explain these phenomena, the theory was postulated that these young people had been brainwashed by the new religious movements: isolating them from family and friends (for example, by inviting them to an end-of-term camp after university), arranging a sleep deprivation program (3 a.m. prayer meetings), and exposing them to loud and repetitive chanting. Another alleged technique of religious brainwashing involved "love bombing" rather than torture.

Various social scientists attempted to develop theories of this process. Conway and Siegelman (1978) described sudden, drastic alterations of personality in people who joined NRMs. They claimed that such people were subjected to practices designed to impair the brain's powers of information processing, leading to delusions, altered awareness, and changes in thinking. Under such conditions, they described the mind as "snapping" under the pressure and realigning with the patterns of thought presented by those in control of the process. Steven Hassan (1988) suggested that the influence of sincere but misled people can be a significant factor in the process of thought reform. However, many scholars in the field of new religious movements did not accept Hassan's BITE model.

On the other side, defenders of the NRMs likened the personality changes to religious conversion experiences recounted by numerous people, including such notable examples as Saint Paul and Saint Francis of Assisi. They pointed out that radical changes in both belief and behavior are hallmarks of a conversion experience.

This controversy continued through the latter decades of the twentieth century, involving social scientists and religious leaders on both sides of the argument. Various lawsuits were brought, some by families and organizations opposed to NRMs; others by members of NRMs who were subjected to "deprogramming" by people hired to bring them back to their former beliefs and lifestyle. The methods of these deprogrammers included not only various coercive persuasion techniques to break their new-found "faith," but also the forcible abduction, or kidnapping, of these young adults.

Through numerous attempts, in the courtroom and in the media, to characterize NRMs as conducting brainwashing, or to refute such accusations and in turn accuse deprogrammers of similar efforts, it became clear that there is no agreed upon definition of brainwashing in the context of religious faith. Indeed, what some claimed as a religious conversion experience, others described as brainwashing; what some called faith-breaking, others called deprogramming.

Social scientists concluded that there was no agreement on whether a social process of coercive influence exists, nor on whether its claimed outcome, that people are influenced against their will, actually occurs. Those who studied new religious movements recognized that religious groups can have considerable influence over their members, and that such influence may have come about through deception and indoctrination. Indeed, many sociologists observed that "influence" occurs ubiquitously in human cultures, and that the influence exerted in "cults" or new religious movements does not differ greatly from the influence present in practically every domain of human action and endeavor, the influence known as socialization.

Once it was acknowledged that the influence of religions, including NRMs, is no more "brainwashing" than any other societal influence, the concept of "deprogramming" began to come under attack. For, if a person had not been "programmed" through coercive techniques, there was no reason to use such techniques to "deprogram" them. The Association of World Academics for Religious Education stated that "…without the legitimating umbrella of brainwashing ideology, 'deprogramming'—the practice of kidnapping members of NRMs and destroying their religious faith—cannot be justified, either legally or morally."

The APA and the brainwashing theories

In the early 1980s, some U.S. mental health professionals became controversial figures due to their involvement as expert witnesses in court cases against new religious movements. In their testimony, they stated that anti-cult theories of brainwashing, mind control, or coercive persuasion were generally accepted concepts within the scientific community. In 1983, the American Psychological Association (APA) asked Margaret Singer, one of the leading proponents of coercive persuasion theories, to chair a task force on Deceptive and Indirect Methods of Persuasion and Control (DIMPAC) to investigate whether "brainwashing" or coercive persuasion did indeed play a role in recruitment by such movements.

Before the task force had submitted its final report, however, the APA submitted an amicus curiae brief in an ongoing case. That brief characterized the theory of brainwashing as not scientifically proven and suggested the hypothesis that cult recruitment techniques might prove coercive for certain sub-groups, while not affecting others. The brief stated that "[t]he methodology of Drs. Singer and Benson has been repudiated by the scientific community," that the hypotheses advanced by Singer were "little more than uninformed speculation, based on skewed data," and that "[t]he coercive persuasion theory … is not a meaningful scientific concept."

The APA subsequently withdrew its signature from this brief, and later rejected the DIMPAC report due to insufficient evidence. Later, APA Division 36 (then Psychologists Interested in Religious Issues, later Psychology of Religion) approved the following resolution at its 1990 annual convention:

The Executive Committee of the Division of Psychologists Interested in Religious Issues supports the conclusion that, at this time, there is no consensus that sufficient psychological research exists to scientifically equate undue non-physical persuasion (otherwise known as "coercive persuasion," "mind control," or "brainwashing") with techniques of influence as typically practiced by one or more religious groups. Further, the Executive Committee invites those with research on this topic to submit proposals to present their work at Divisional programs. ("PIRI Executive Committee Adopts Position on Non-Physical Persuasion," Winter 1991, in Amitrani and Di Marzio, 2001)

Zablocki (1997) and Amitrani (2001), citing APA boards and scholars on the subject, concluded that there has been no unanimous decision of the APA regarding this issue.

Brainwashing in fiction

The idea that brainwashing techniques exist that are capable of altering the beliefs, attitudes, and thought processes of individuals has appeared in a number of fictional forms. Some examples include:

  • In George Orwell's novel Nineteen Eighty-Four, brainwashing is used by the totalitarian government of Oceania to erase nonconformist thought and rebellious personalities.
  • In the Stanley Kubrick film A Clockwork Orange, criminals are re-educated in an attempt to remove their violent tendencies.
  • The alarmist concept of brainwashing functioned as a central theme in the movie The Manchurian Candidate, in which Communist brainwashers turned a soldier into an assassin through something akin to hypnosis. The idea that one person could be so enslaved to another as to do their bidding even when no longer under duress has long fascinated dramatists and audiences.

Conclusion

There seems to be little to no accord among specialists on the existence of brainwashing, although many have theorized that torture, sleep deprivation, and other such techniques may alter a person's state of mind. However, a distinction must be made between modifying beliefs and modifying behavior. Changing a person's behavior through coercive persuasion is possible, but it is not necessarily brainwashing. Only when this change in behavior stems from a core change in beliefs can it be referred to as brainwashing. Acting to avoid pain or some other kind of discomfort is not mind manipulation; it is simply an act of self-preservation. Brainwashing as a deliberate practice, though, remains undefined and unproven.

Nevertheless, it is not far-fetched to believe that under extreme circumstances outside forces could influence one's mental state. The understanding of reality comes from one's environment, and so it stands to reason that a drastic change in environmental factors could drastically alter someone's grip on reality. Those who have experienced extreme situations, whether natural disasters, the horrors of war, or a spiritual experience leading to new faith, testify that one's view of life changes dramatically through such circumstances. Such experiences reveal both the power and the delicate nature of the mind.

References

  • Amitrani, Alberto. 1998. Blind, or just don't want to see? "Brainwashing," mystification and suspicion. Retrieved July 5, 2007.
  • Amitrani, Alberto. 2001. "Mind Control" in New Religious Movements and the American Psychological Association. Cultic Studies Review.
  • Anthony, Dick. 1990. "Religious Movements and 'Brainwashing' Litigation" in Dick Anthony and Thomas Robbins, In Gods We Trust. New Brunswick, NJ: Transaction.
  • APA Amicus curiae, February 11, 1987. Retrieved July 5, 2007.
  • APA Motion to withdraw amicus curiae March 27, 1987. Retrieved July 5, 2007.
  • APA Board of Social and Ethical Responsibility for Psychology, Memorandum on Brainwashing: Final Report of the Task Force, May 11, 1987. Retrieved July 5, 2007.
  • Bardin, David. 1994. Mind Control ("Brainwashing") Exists in Psychological Coercion & Human Rights.
  • Barker, Eileen. 1984. The Making of a Moonie: Choice or Brainwashing? Oxford, UK: Blackwell Publishers. ISBN 0-631-13246-5
  • Beit-Hallahmi, Benjamin. 2001. Dear Colleagues: Integrity and Suspicion in NRM Research. Retrieved July 5, 2007.
  • Bromley, David. 2001. "A Tale of Two Theories: Brainwashing and Conversion as Competing Political Narratives" in Benjamin Zablocki and Thomas Robbins (ed.), Misunderstanding Cults. ISBN 0-8020-8188-6
  • Committee on Un-American Activities (HUAC). 1958. Communist Psychological Warfare (Brainwashing). United States House of Representatives, Washington, D. C.
  • Conway, Flo and Jim Siegelman. 1987. Snapping: America's Epidemic of Sudden Personality Change. Second edition. Stillpoint Press. ISBN 0964765004
  • Hadden, Jeffrey K. 2000. The Brainwashing Controversy.
  • Hadden, Jeffrey K. 1993. The Handbook of Cults and Sects in America. Greenwich, CT: JAI Press, Inc., pp. 75-97.
  • Hassan, Steven. 1988. Combatting Cult Mind Control. Rochester, Vermont. ISBN 0892812435
  • Hassan, Steven. 2000. Releasing The Bonds: Empowering People to Think for Themselves. ISBN 0967068800.
  • Hindery, Roderick. 2001. Indoctrination and Self-deception or Free and Critical Thought?
  • Hunter, Edward. 1951. Brain-Washing in Red China. The Calculated Destruction of Men’s Minds. New York: The Vanguard Press.
  • Hunter, Edward. 1956. Brain-Washing: The Story of the Men Who Defied It.
  • Introvigne, Massimo. 1998. Liar, Liar: Brainwashing, CESNUR and APA.
  • Kent, Stephen A. 1997. Brainwashing in Scientology's Rehabilitation Project Force (RPF). Retrieved July 5, 2007.
  • Kent, Stephen A. & Theresa Krebs. 1998. When Scholars Know Sin. Skeptic Magazine (Vol. 6, No. 3).
  • Kent, Stephen A. 2001. Brainwashing Programs in The Family/Children of God and Scientology, in Benjamin Zablocki and Thomas Robbins (ed.), Misunderstanding Cults. ISBN 0-8020-8188-6
  • Langone, Michael. 1993. Recovering from Cults.
  • Lifton, Robert J. 1961. Thought Reform and the Psychology of Totalism. ISBN 0-8078-4253-2
  • Robinson, B.A. 1996. Glossary of Religious Terms.
  • Richardson, James T. 1994. "Brainwashing Claims and Minority Religions Outside the United States: Cultural Diffusion of a Questionable Concept in the Legal Arena," Brigham Young University Law Review.
  • Sargant, William. 1996. Battle for the Mind: A Physiology of Conversion and Brainwashing. ISBN 1-883536-06-5
  • Scheflin, Alan W. and Edward M. Opton Jr. 1978. The Mind Manipulators. A Non-Fiction Account. p. 437
  • Schein, Edgar H. 1961. Coercive persuasion: A socio-psychological analysis of the "brainwashing" of American civilian prisoners by the Chinese Communists.
  • Singer, Margaret. 1987. "Group Psychodynamics." in Merck's Manual.
  • Singer, Margaret. 2003. Cults in Our Midst: The Continuing Fight Against Their Hidden Menace. Revised and updated edition. John Wiley & Sons. ISBN 0787967416
  • Taylor, Kathleen. 2005. Brainwashing: The Science Of Thought Control ISBN 0-19-280496-0
  • Wakefield, Hollida, M.A. & Ralph Underwager. 1998. Coerced or Nonvoluntary Confessions. Institute for Psychological Therapies.
  • West, Louis J. 1989. "Persuasive Techniques in Religious Cults."
  • Zablocki, Benjamin. 1997. The Blacklisting of a Concept: The Strange History of the Brainwashing Conjecture in the Sociology of Religion. Nova Religio.
  • Zablocki, Benjamin. 2001. Towards a Demystified and Disinterested Scientific Theory of Brainwashing, in Benjamin Zablocki and Thomas Robbins (ed.), Misunderstanding Cults. ISBN 0-8020-8188-6
  • Zablocki, Benjamin. 2002. "Methodological Fallacies in Anthony's Critique of Exit Cost Analysis." Retrieved July 5, 2007.
  • Zimbardo, Philip. 2002. Mind Control: Psychological Reality or Mindless Rhetoric? in Monitor on Psychology.

Credits

New World Encyclopedia writers and editors rewrote and completed the Wikipedia article in accordance with New World Encyclopedia standards. This article abides by the terms of the Creative Commons CC-by-sa 3.0 License (CC-by-sa), which may be used and disseminated with proper attribution. Credit is due under the terms of this license, which can reference both the New World Encyclopedia contributors and the selfless volunteer contributors of the Wikimedia Foundation.

Note: Some restrictions may apply to use of individual images which are separately licensed.