Dr. A Prabhu Dessai Consultant Psychiatrist
Panaji , Goa 403001
India
ph: 9096660920
The word cult pejoratively refers to a group whose beliefs or practices are considered abnormal or bizarre.[1] The word originally denoted a system of ritual practices. The narrower, derogatory sense of the word is a product of the 20th century, especially since the 1980s, and is considered subjective. It is also a result of the anti-cult movement, which uses the word in reference to groups seen as authoritarian and exploitative, and that are believed to use dangerous rituals or mind control. The word implies a group which is a minority in a given society. The word was first used in the early 17th century, denoting homage paid to a divinity; it derives from French culte or Latin cultus 'worship', from cult- 'inhabited, cultivated, worshiped', from the verb colere 'care, cultivation'.[citation needed]
The popular, derogatory sense of the word has no currency in academic studies of religions, where "cults" are subsumed under the neutral label of the "new religious movement", while academic sociology has partly adopted the popular meaning of the word.[2][3][4]
The concept of "cult" was introduced into sociological classification in 1932 by American sociologist Howard P. Becker as an expansion of German theologian Ernst Troeltsch's church-sect typology. Troeltsch's aim was to distinguish between three main types of religious behavior: churchly, sectarian and mystical. Becker created four categories out of Troeltsch's first two by splitting church into "ecclesia" and "denomination", and sect into "sect" and "cult".[5] Like Troeltsch's "mystical religion", Becker's cults were small religious groups lacking in organization and emphasizing the private nature of personal beliefs.[6] Later formulations built on these characteristics while placing an additional emphasis on cults as deviant religious groups "deriving their inspiration from outside of the predominant religious culture".[7] This deviation is often thought to lead to a high degree of tension between the group and the more mainstream culture surrounding it, a characteristic shared with religious sects.[8] Sociologists still maintain that unlike sects, which are products of religious schism and therefore maintain a continuity with traditional beliefs and practices, "cults" arise spontaneously around novel beliefs and practices.[9]
In the 1940s, the long-held opposition by some established Christian denominations to non-Christian religions and/or supposedly heretical Christian sects crystallized into a more organized "Christian countercult movement" in the United States. For those belonging to the movement, all new religious groups deemed outside of Christian orthodoxy[10] were considered "cults". As more foreign religious traditions found their way into the United States, the religious movements they brought with them or gave birth to attracted even fiercer resistance. This was especially true for movements incorporating mystical or exotic new beliefs and those with charismatic, authoritarian leaders.
By the early 1970s, a secular opposition movement to "cult" groups had taken shape. The organizations that formed the secular "Anti-cult movement" (ACM) often acted on behalf of relatives of "cult" converts who did not believe their loved ones could have altered their lives so drastically of their own free will. A few psychologists and sociologists working in this field lent credibility to their disbelief by suggesting that "brainwashing techniques" were used to maintain the loyalty of "cult" members.[11] The belief that cults "brainwashed" their members became a unifying theme among cult critics, and in the more extreme corners of the Anti-cult movement, techniques like the sometimes forceful "deprogramming" of "cult members" became standard practice.[12]
In the meantime, a handful of high profile crimes were committed by groups identified as cults, or by the groups' leaders. The mass suicides committed by members of the People's Temple in Jonestown, Guyana, and the Manson Family murders are perhaps the most prominent examples in American popular culture. The publicity of these crimes, as amplified by the Anti-cult movement, influenced the popular perception of new religious movements[citation needed]. In the mass media, and among average citizens, "cult" gained an increasingly negative connotation, becoming associated with things like kidnapping, brainwashing, psychological abuse, sexual abuse and other criminal activity, and mass suicide. While most of these negative qualities usually have real documented precedents in the activities of a very small minority of new religious groups, mass culture often extends them to any religious group viewed as culturally deviant, however peaceful or law abiding it may be.[13][14][15]
In the late 1980s, psychologists and sociologists started to abandon theories like brainwashing and mind control. While scholars may believe that various less dramatic coercive[16][17] psychological mechanisms could influence group members, they came to see conversion to new religious movements principally as an act of rational choice. Most sociologists and scholars of religion also began to reject the word "cult" altogether because of its negative connotations in mass culture. Some began to advocate the use of new terms like "new religious movement", "alternative religion" or "novel religion" to describe most of the groups that had come to be referred to as "cults",[18] yet none of these terms have had much success in popular culture or in the media. Other scholars have pushed to redeem the word as one fit for neutral academic discourse,[19] while researchers aligned with the Anti-cult movement have attempted to reduce the negative connotations attached to all such groups by classifying only some as "destructive cults".
The difference between the negative and the neutral definition of the word cult has also had political implications. In the 1970s, the scientific status of the "brainwashing theory" became a central topic in U.S. court cases where the theory was instrumental in justifying the use of the forceful "deprogramming" of cult members.[3][20] Meanwhile, sociologists critical of these theories assisted advocates of religious freedom in defending the legitimacy of new religious movements in court. While the official response to new religious groups has been mixed across the globe, some governments aligned more with the critics of these groups, to the extent of distinguishing between "legitimate" religion and "dangerous", "unwanted" cults in public policy.[11][21] France and Belgium have taken policy positions which accept "brainwashing" theories uncritically, while other European nations, like Sweden and Italy, are cautious about brainwashing and have adopted more neutral responses to new religions.[22] Scholars have suggested that outrage following the mass murders/suicides perpetrated by the Solar Temple,[11][23] as well as more latent xenophobic and anti-American attitudes, have contributed significantly to the extremity of European anti-cult positions.[24]
Since 1949, the People's Republic of China has been classifying dissenting groups as xiéjiào (邪教).[25] In the Chinese language, the word xiéjiào translates to "evil education" [邪 (xié) = evil; 教 (jiào) = education]. The word xiéjiào as a whole is used to describe what is known in the Western world as a cult.[26] In recent years, the Chinese government has allied with western anti-cult scholars in order to lend legitimacy to its crackdown on practitioners of Falun Gong. In 2009, Rabbi Binyamin Kluger and Raphael Aron, director of Cult Counseling Australia, spoke at a four-day conference in southern China on cult-fighting strategies.[27] Aron is a Lubavitch Jew, a group which might itself be considered a cult in that its members believe their former rabbi to be the Messiah.[28] Scientology has also been the target of anti-cult legislation in several countries. This negative, politicized use of the word "cult" provides sociologists critical of it with yet another reason to abandon it because, according to them, it may adversely impact the religious freedoms of group members.[2][20][29][30] For cult critics, the creation of legislation restricting the religious freedom of cults is an objective in itself, since in their view "cults" are harmful or potentially harmful to their members and to society at large.
Studies performed by those who believe that some religious groups do practice mind control have identified a number of key steps in coercive persuasion.[31][32]
This view is disputed by scholars such as James Gene[34][35] and Bette Nove Evans. The Society for the Scientific Study of Religion[36] stated in 1990 that there was not sufficient research to permit a consensus on the matter and that "one should not automatically equate the techniques involved in the process of physical coercion and control with those of nonphysical coercion and control".
“When you buy into something that seems to explain everything, you can soon be coaxed into doing almost anything.”
In the opinion of Benjamin Zablocki, a professor of Sociology at Rutgers University, groups that have been characterized as cults are at high risk of becoming abusive to members. He states that this is in part due to members' adulation of charismatic leaders[38] contributing to the leaders becoming corrupted by power. Zablocki defines a cult as an ideological organization held together by charismatic relationships and demanding total commitment. According to Barrett, the most common accusation made against groups referred to as cults is sexual abuse (see some allegations made by former members). According to Kranenborg, some groups are risky when they advise their members not to use regular medical care.[39]
Michael Langone gives three different models for conversion. Under Langone's deliberative model, people are said to join cults primarily because of how they view a particular group. Langone notes that this view is most favored among sociologists and religious scholars. Under the "psychodynamic model", popular with some mental health professionals, individuals choose to join for the fulfillment of subconscious psychological needs. Finally, the "thought reform model" posits that people join not because of their own psychological needs, but because of the group's influence through forms of psychological manipulation. Langone claims that those mental health experts who have more direct experience with large numbers of cultists tend to favor this latter view.[40]
Some scholars favor one particular view, or combine elements of each. According to Marc Galanter,[41] typical reasons why people join cults include a search for community and a spiritual quest. Stark and Bainbridge, in discussing the process by which individuals join new religious groups, have even questioned the utility of the concept of conversion, suggesting that affiliation is a more useful concept.[42]
People leave cults in several ways.[43][44] Popular authors Conway and Siegelman conducted a survey on after-cult effects and deprogramming, published in the book Snapping, and concluded that people who had been deprogrammed had fewer problems than people who had not been deprogrammed. The BBC writes that "in a survey done by Jill Mytton on 200 former cult members most of them reported problems adjusting to society and about a third would benefit from some counseling".[45]
Ronald Burks, in a study comparing Group Psychological Abuse Scale (GPA) and Neurological Impairment Scale (NIS) scores in 132 former members of cults and cultic relationships, found a positive correlation between the intensity of the reform environment as measured by the GPA and cognitive impairment as measured by the NIS. Additional findings were a reduced earning potential relative to education level, corroborating earlier studies by cult critics (Martin, 1993; Singer & Ofshe, 1990; West & Martin, 1994), and significant levels of depression and dissociation, agreeing with Conway & Siegelman (1982), Lewis & Bromley (1987) and Martin et al. (1992).[46]
Sociologists Bromley and Hadden note a lack of empirical support for claimed consequences of having been a member of a "cult" or "sect", and substantial empirical evidence against it. These include the fact that the overwhelming proportion of people who become involved in NRMs leave, most within two years; that the overwhelming proportion of people who leave do so of their own volition; and that two-thirds (67%) felt "wiser for the experience".[47]
According to F. Derks and J. van der Lans, there is no uniform post-cult trauma. While psychological and social problems upon resignation are not uncommon, their character and intensity are greatly dependent on the personal history and on the traits of the ex-member, and on the reasons for and way of resignation.[48]
The report of the "Swedish Government's Commission on New Religious Movements" (1998) states that the great majority of members of new religious movements derive positive experiences from their subscription to ideas or doctrines which correspond to their personal needs, and that withdrawal from these movements is usually quite undramatic, as these people leave feeling enriched by a predominantly positive experience. Although the report describes that there are a small number of withdrawals that require support (100 out of 50,000+ people), the report did not recommend that any special resources be established for their rehabilitation, as these cases are very rare.[49]
Stuart A. Wright explores the distinction between the apostate narrative and the role of the apostate, asserting that the former follows a predictable pattern, in which the apostate utilizes a "captivity narrative" that emphasizes manipulation, entrapment and being victims of "sinister cult practices". These narratives provide a rationale for a "hostage-rescue" motif, in which cults are likened to POW camps and deprogramming as heroic hostage rescue efforts. He also makes a distinction between "leavetakers" and "apostates", asserting that despite the popular literature and lurid media accounts of stories of "rescued or recovering 'ex-cultists'", empirical studies of defectors from NRMs "generally indicate favorable, sympathetic or at the very least mixed responses toward their former group."[50]
Secular cult opponents like those belonging to the anti-cult movement tend to define a "cult" as a group that tends to manipulate, exploit, and control its members. Specific factors in cult behavior are said to include manipulative and authoritarian mind control over members, communal and totalistic organization, aggressive proselytizing, systematic programs of indoctrination, and perpetuation in middle-class communities.[51]
While acknowledging the issue of multiple definitions of the word,[52] Michael Langone states that: "Cults are groups that often exploit members psychologically and/or financially, typically by making members comply with leadership's demands through certain types of psychological manipulation, popularly called mind control, and through the inculcation of deep-seated anxious dependency on the group and its leaders."[53] A similar definition is given by Louis Jolyon West:
In each, the focus tends to be on the specific tactics of conversion, the negative impact on individual members, and the difficulty in leaving once indoctrination has occurred.[55]
The role of former members, or "apostates," has been widely studied by social scientists. At times, these individuals become outspoken public critics of the groups they leave. Their motivations, the roles they play in the anti-cult movement, the validity of their testimony, and the kinds of narratives they construct, are controversial. Some scholars like David G. Bromley, Anson Shupe, and Brian R. Wilson have challenged the validity of the testimonies presented by critical former members. Wilson discusses the use of the atrocity story that is rehearsed by the apostate to explain how, by manipulation, coercion, or deceit, he was recruited to a group that he now condemns.[56] The hostile ex-members would invariably shade the truth and blow out of proportion minor incidents, turning them into major incidents.[57] Bromley and Shupe similarly discuss "captivity narratives" that depict the time in the group as involuntary and point out that the apostate is likely to present a caricature of his former group.[citation needed] Introvigne found in his study of the New Acropolis in France, that public negative testimonies and attitudes were only voiced by a minority of the ex-members, who he describes as becoming "professional enemies" of the group they leave.[citation needed] Benjamin Zablocki performed an empirical study that concludes that the reliability of former members was equal to that of those who stayed in one particular group.
Because of the increasingly pejorative use of the words "cult" and "cult leader" since the cult debate of the 1970s, some scholars, as well as the groups referred to as cults, argue that these words should be avoided.[58][59]
Catherine Wessinger (Loyola University New Orleans) has stated that the word "cult" represents just as much prejudice and antagonism as racial slurs or derogatory words for women and homosexuals.[60] She has argued that it is important for people to become aware of the bigotry conveyed by the word, drawing attention to the way it dehumanizes the group's members and their children.[60] At the same time, she adds, labeling a group a "cult" makes people feel safe, because the "violence associated with religion is split off from conventional religions, projected onto others, and imagined to involve only aberrant groups."[60] This fails to take into account that child abuse, sexual abuse, financial extortion and warfare have also been committed by believers of mainstream religions, but the pejorative "cult" stereotype makes it easier to avoid confronting this uncomfortable fact.[60] Labeling a group as subhuman, she says, becomes a justification for violence against it.
The concept of "cult" as an epithet was legally tested in the United Kingdom when a protester refused to put down a sign that read, "Scientology is not a religion, it is a dangerous cult", citing a 1984 high court judgment describing the organization as a cult. The London police issued a summons to the protester for violating the Public Order Act by displaying a "threatening, abusive or insulting" sign. The Crown Prosecution Service ruled that the word "cult" on a sign, "...is not abusive or insulting and there is no offensiveness, as opposed to criticism, neither in the idea expressed nor in the mode of expression." There was no action taken against the protester, and police would allow future such demonstrations.[61] In Scotland, an official of the Edinburgh City Council told inquiring regular protesters, "I understand that some of the signs you use may display the word 'cult' and there is no objection to this."[62]
Sociologist Amy Ryan has argued for the need to differentiate those groups that may be dangerous from groups that are more benign.[63] Ryan notes the sharp differences between definitions from cult opponents, who tend to focus on negative characteristics, and those of sociologists, who aim to create definitions that are value-free. The movements themselves may have different definitions of religion as well. George Chryssides also cites a need to develop better definitions to allow for common ground in the debate.
These definitions have political and ethical impact beyond just scholarly debate. In Defining Religion in American Law, Bruce J. Casino presents the issue as crucial to international human rights laws. Limiting the definition of religion may interfere with freedom of religion, while too broad a definition may give some dangerous or abusive groups "a limitless excuse for avoiding all unwanted legal obligations."[64]
Some authors in the cult opposition dislike the word cult to the extent that it implies a continuum with a large gray area separating "cult" from "noncult", a gray area which they do not see.[64] Other authors, e.g. Steven Hassan, differentiate by using words and terms like "destructive cult" or "cult" (totalitarian type) vs. "benign cult".
An additional commonly used subcategory of cult movements is the doomsday cult, characterized by the central role played by eschatology in the group's belief system. Although most religious movements hold some beliefs about the eventual end of the world as we know it, in doomsday cults these tend to take the form of concrete prophecies and predictions of specific catastrophic events being imminent, or in some cases even expected to occur on a particular calendar date. This category of religious movements includes some well-known cases of extremely destructive behavior by adherents in anticipation of the end times, such as the mass suicides by members of the Peoples Temple in 1978, the Branch Davidians in 1993 and Heaven's Gate in 1997, although many examples are known of doomsday cults that do not become nearly as destructive.

This latter class of doomsday cults is of theoretical interest to the scholarly study of cults because of the often paradoxical response of adherents to the failure of doomsday prophecies to be confirmed. Social psychologist Leon Festinger and his collaborators performed a detailed case study of one such group in 1954, subsequently documented in "When Prophecy Fails". The members of the small, obscure UFO cult in question were very quick to amend their world-view so as to rationalize the unexpected outcome without losing their conviction in the validity of the underlying belief system, despite the obvious evidence to the contrary. The authors explained this phenomenon within the framework of cognitive dissonance theory, which posits that people are in general motivated to adjust their beliefs so as to be consistent with their behavior, in order to avoid the painful experience of a dissonance between the two. On this account, the more committed one is at the behavioral level to one's beliefs being true, the more driven one is to reduce the tension created by disconfirming evidence.
An important implication of this theory is that common, universal psychological factors contribute to the persistence of what otherwise appear to be bizarre and even absurd sets of beliefs.
In many countries, there exists a separation of church and state and freedom of religion. Governments of some of these countries, concerned with possible abuses by groups they deem cults, have taken restrictive measures against some of their activities. Critics of such measures claim that the counter-cult movement and the anti-cult movement have succeeded in influencing governments to transfer the public's abhorrence of doomsday cults into a generalized suspicion directed against all small or new religious movements without discrimination. This critique is countered by stressing that the measures are directed not against any religious beliefs, but specifically against groups seen as inimical to the public order due to their totalitarianism, violations of fundamental liberties, inordinate emphasis on finances, and/or disregard for appropriate medical care.[65]
Cults have been a subject or theme in literature and popular culture since ancient times, and references to them were frequent in the 20th century.
Anyone could attack a group they disagree with by unfairly labeling it a destructive cult. How would you know whether it really is such a cult or not? Isn't there an objective method to evaluate groups for cultic tendencies? Yes. The following early warning signs can help you reasonably determine whether or not a group is likely to be a destructive cult, and whether you should be concerned about a friend, coworker, or loved one being involved with it.
The main reason that the following destructive cult tactics are so damaging to both the individual and society is that they debilitate rationality and reduce empathy. Rationality and empathy are indispensable in making good personal and social decisions. History is littered with personal and social catastrophes in which a lack of rationality and a lack of empathy were the core causes.
A destructive cult tends to be totalitarian in its control of its members' behavior. Cults are likely to dictate in great detail not only what members believe, but also what members wear and eat, when and where members work, sleep, and bathe, and how members think, speak, and conduct familial, marital, or sexual relationships.
A destructive cult tends to have an ethical double standard. Members are urged to be obedient to the cult, to carefully follow cult rules. They are also encouraged to be revealing and open in the group, confessing all to the leaders. On the other hand, outside the group they are encouraged to act unethically, manipulating outsiders or nonmembers, and either deceiving them or simply revealing very little about themselves or the group. In contrast to destructive cults, honorable groups teach members to abide by one set of ethics and act ethically and truthfully to all people in all situations.
A destructive cult has only two basic purposes: recruiting new members and fund-raising. Altruistic movements, established religions, and other honorable groups also recruit and raise funds. However, these actions are incidental to an honorable group's main purpose of improving the lives of its members and of humankind in general. Destructive cults may claim to make social contributions, but in actuality such claims are superficial and only serve as gestures or fronts for recruiting and fund-raising. A cult's real goal is to increase the prestige and often the wealth of the leader.
A destructive cult appears to be innovative and exclusive. The leader claims to be breaking with tradition, offering something novel, and instituting the ONLY viable system for change that will solve life's problems or the world's ills. But these claims are empty and only used to recruit members who are then surreptitiously subjected to mind control to inhibit their ability to examine the actual validity of the claims of the leader and the cult.
A destructive cult is authoritarian in its power structure. The leader is regarded as the supreme authority. He or she may delegate certain power to a few subordinates for the purpose of seeing that members adhere to the leader's wishes. There is no appeal outside his or her system to a greater system of justice. For example, if a schoolteacher feels unjustly treated by a principal, an appeal can be made to the superintendent. In a destructive cult, the leader claims to have the only and final ruling on all matters.
A destructive cult's leader is a self-appointed messianic person claiming to have a special mission in life. For example, leaders of flying saucer cults claim that beings from outer space have commissioned them to lead people away from Earth and that only the leaders can save followers from impending doom.
A destructive cult's leader centers the veneration of members upon himself or herself. Priests, rabbis, ministers, democratic leaders, and other leaders of genuinely altruistic movements focus the veneration of adherents on God or a set of ethical principles. Cult leaders, in contrast, keep the focus of love, devotion, and allegiance on themselves.
A destructive cult's leader tends to be determined, domineering, and charismatic. Such a leader effectively persuades followers to abandon or alter their families, friends, and careers to follow the cult. The leader then takes control over followers' possessions, money, time, and lives.
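The eight warning signs above amount to an informal checklist. A minimal sketch of tallying them in Python follows, assuming an arbitrary cutoff of half the list for a "significant number"; the sign names and threshold are paraphrases for illustration, not a validated instrument:

```python
# Illustrative tally of the warning signs described above.
# The sign names and cutoff are informal paraphrases of this checklist,
# NOT a validated psychological instrument.

WARNING_SIGNS = [
    "totalitarian control of members' behavior",
    "ethical double standard (insiders vs. outsiders)",
    "main purposes are recruiting and fund-raising",
    "claims to be innovative and exclusive",
    "authoritarian power structure with no outside appeal",
    "self-appointed messianic leader",
    "veneration centered on the leader",
    "determined, domineering, charismatic leader",
]

def tally_signs(observed: set[str]) -> tuple[int, str]:
    """Count how many listed signs were observed and give a rough reading."""
    count = sum(1 for sign in WARNING_SIGNS if sign in observed)
    # "Significant number" is left vague in the checklist; half the list
    # is an arbitrary illustrative cutoff.
    if count >= len(WARNING_SIGNS) // 2:
        reading = "significant number of signs"
    else:
        reading = "few signs"
    return count, reading

count, reading = tally_signs({
    "self-appointed messianic leader",
    "veneration centered on the leader",
    "authoritarian power structure with no outside appeal",
    "totalitarian control of members' behavior",
})
print(count, reading)
```

The point of the sketch is only that such an evaluation is a matter of counting observed signs against a fixed list, which is what makes the checklist more objective than an unstructured accusation.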
If you know someone who belongs to a group that demonstrates a significant number of these warning signs and you would like more information on how to deal with destructive cults or mind control, go to www.factnet.org.
Now have a look at Thought Reform Exists
http://www.factnet.org/headlines/destructive_cult_warning_signs.html
Terminology note: In academia today, mind control or brainwashing is commonly referred to as coercive persuasion, coercive psychological systems, or coercive influence. The short description below comes from Dr. Margaret Singer, professor emeritus at the University of California at Berkeley and the acknowledged leading authority in the world on mind control and cults. This document, in substance, was presented to the U.S. Supreme Court as an educational appendix on coercive psychological systems in the case Wollersheim v. Church of Scientology (89-1367 and 89-1361), which was being considered in relation to issues involving abuse in this area.
Coercion is defined as "to restrain or constrain by force..." Legally, it often implies the use of PHYSICAL FORCE or physical or legal threat. This traditional concept of coercion is far better understood than the technological concepts of "coercive persuasion", which are effective in restraining, impairing, or compelling through the gradual application of PSYCHOLOGICAL FORCES.
A coercive persuasion program is a behavioral change technology applied to cause the "learning" and "adoption" of a set of behaviors or an ideology under certain conditions. It is distinguished from other forms of benign social learning or peaceful persuasion by the conditions under which it is conducted and by the techniques of environmental and interpersonal manipulation employed to suppress particular behaviors and to train others. Over time, coercive persuasion, a psychological force akin in some ways to our legal concepts of undue influence, can be even MORE effective than pain, torture, drugs, and use of physical force and legal threats.
The Korean War "Manchurian Candidate" misconception of the need for suggestibility-increasing drugs, and physical pain and torture, to effect thought reform, is generally associated with the old concepts and models of brainwashing. Today, they are not necessary for a coercive persuasion program to be effective. With drugs, physical pain, torture, or even a physically coercive threat, you can often temporarily make someone do something against their will. You can even make them do something they hate or they really did not like or want to do at the time. They do it, but their attitude is not changed.
This is much different and far less devastating than that which you are able to achieve with the improvements of coercive persuasion. With coercive persuasion you can change people's attitudes without their knowledge and volition. You can create new "attitudes" where they will do things willingly which they formerly may have detested, things which previously only torture, physical pain, or drugs could have coerced them to do.
The advances in the extreme anxiety and emotional stress production technologies found in coercive persuasion supersede old style coercion that focuses on pain, torture, drugs, or threat in that these older systems do not change attitude so that subjects follow orders "willingly." Coercive persuasion changes both attitude AND behavior, not JUST behavior.
THE PURPOSES AND TACTICS OF COERCIVE PERSUASION
Coercive persuasion, or thought reform as it is sometimes known, is best understood as a coordinated system of graduated coercive influence and behavior control designed to deceptively and surreptitiously manipulate and influence individuals, usually in a group setting, in order for the originators of the program to profit in some way, normally financially or politically.
The essential strategy used by those operating such programs is to systematically select, sequence and coordinate numerous coercive persuasion tactics over CONTINUOUS PERIODS OF TIME. There are seven main tactic types found in various combinations in a coercive persuasion program. A coercive persuasion program can still be quite effective without the presence of ALL seven of these tactic types.
TACTIC 1. The individual is prepared for thought reform through increased suggestibility and/or "softening up," specifically through hypnotic or other suggestibility-increasing techniques such as: A. Extended audio, visual, verbal, or tactile fixation drills; B. Excessive exact repetition of routine activities; C. Decreased sleep; D. Nutritional restriction.
TACTIC 2. Using rewards and punishments, efforts are made to establish considerable control over a person's social environment, time, and sources of social support. Social isolation is promoted. Contact with family and friends is abridged, as is contact with persons who do not share group-approved attitudes. Economic and other dependence on the group is fostered. (In the forerunner to coercive persuasion, brainwashing, this was rather easy to achieve through simple imprisonment.)
TACTIC 3. Disconfirming information and nonsupporting opinions are prohibited in group communication. Rules exist about permissible topics to discuss with outsiders. Communication is highly controlled. An "in-group" language is usually constructed.
TACTIC 4. Frequent and intense attempts are made to cause a person to re-evaluate the most central aspects of his or her experience of self and prior conduct in negative ways. Efforts are made to destabilize and undermine the subject's basic consciousness, reality awareness, world view, emotional control, and defense mechanisms, and to induce the subject to reinterpret his or her life history and adopt a new version of causality.
TACTIC 5. Intense and frequent attempts are made to undermine a person's confidence in himself and his judgment, creating a sense of powerlessness.
TACTIC 6. Nonphysical punishments are used, such as intense humiliation, loss of privilege, social isolation, social status changes, intense guilt, anxiety, manipulation, and other techniques for creating strong aversive emotional arousal.
TACTIC 7. Certain secular psychological threats [force] are used or are present: That failure to adopt the approved attitude, belief, or consequent behavior will lead to severe punishment or dire consequence, (e.g. physical or mental illness, the reappearance of a prior physical illness, drug dependence, economic collapse, social failure, divorce, disintegration, failure to find a mate, etc.).
Another set of criteria has to do with defining other common elements of mind-control systems. If most of Robert Jay Lifton's eight-point model of thought reform is being used in a cultic organization, it is most likely a dangerous and destructive cult. These eight points follow:
Robert Jay Lifton's Eight Point Model of Thought Reform
1. ENVIRONMENT CONTROL. Limitation of many/all forms of communication with those outside the group. Books, magazines, letters and visits with friends and family are taboo. "Come out and be separate!"
2. MYSTICAL MANIPULATION. The potential convert to the group becomes convinced of the higher purpose and special calling of the group through a profound encounter or experience, for example, through an alleged miracle or prophetic word from those in the group.
3. DEMAND FOR PURITY. An explicit goal of the group is to bring about some kind of change, whether on a global, social, or personal level. "Perfection is possible if one stays with the group and is committed."
4. CULT OF CONFESSION. The unhealthy practice of self-disclosure to members of the group, often in the context of a public gathering: admitting past sins and imperfections, even doubts about the group and critical thoughts about the integrity of the leaders.
5. SACRED SCIENCE. The group's perspective is absolutely true and completely adequate to explain EVERYTHING. The doctrine is not subject to amendments or question. ABSOLUTE conformity to the doctrine is required.
6. LOADED LANGUAGE. A new vocabulary emerges within the context of the group. Group members "think" within the very abstract and narrow parameters of the group's doctrine. The terminology effectively stops members from thinking critically by reinforcing a "black and white" mentality. Loaded terms and clichés prejudice thinking.
7. DOCTRINE OVER PERSON. Pre-group experience and group experience are narrowly and decisively interpreted through the absolute doctrine, even when experience contradicts the doctrine.
8. DISPENSING OF EXISTENCE. Salvation is possible only in the group. Those who leave the group are doomed.
COERCIVE PERSUASION IS NOT PEACEFUL PERSUASION
Programs identified with the above-listed seven tactics have in common the elements of attempting to greatly modify a person's self-concept, perceptions of reality, and interpersonal relations. When successful in inducing these changes, coercive thought reform programs also, among other things, create the potential forces necessary for exercising undue influence over a person's independent decision-making ability, and even for turning the individual into a deployable agent for the organization's benefit without the individual's meaningful knowledge or consent.
Coercive persuasion programs are effective because individuals experiencing the deliberately planned severe stresses they generate can only reduce the pressures by accepting the system or adopting the behaviors being promulgated by the purveyors of the coercion program. The relationship between the person and the coercive persuasion tactics is DYNAMIC: while the force of the pressures, rewards, and punishments brought to bear on the person is considerable, it does not lead to a stable, meaningfully SELF-CHOSEN reorganization of beliefs or attitudes. Rather, it leads to a sort of coerced compliance and a situationally required, elaborate rationalization for the new conduct.
Once again, in order to maintain the new attitudes or "decisions," sustain the rationalization, and continue to unduly influence a person's behavior over time, coercive tactics must be more or less CONTINUOUSLY applied. A fiery, "hell and damnation" guilt-ridden sermon from the pulpit or several hours with a high-pressure salesman or other single instances of the so-called peaceful persuasions do not constitute the "necessary chords and orchestration" of a SEQUENCED, continuous, COORDINATED, and carefully selected PROGRAM of surreptitious coercion, as found in a comprehensive program of "coercive persuasion."
Truly peaceful religious persuasion practices would never attempt to force, compel, and dominate the free wills or minds of their members through coercive behavioral techniques or covert hypnotism. They would have no difficulty coexisting peacefully with U.S. laws meant to protect the public from such practices.
Looking like peaceful persuasion is precisely what makes coercive persuasion less likely to attract attention or to mobilize opposition. It is also part of what makes it such a devastating control technology. Victims of coercive persuasion show no signs of physical abuse, hold convincing rationalizations for the radical or abrupt changes in their behavior, and display a convincing "sincerity"; they have been changed so gradually that they do not oppose it because they usually are not even aware of it.
Deciding if coercive persuasion was used requires case-by-case careful analysis of all the influence techniques used and how they were applied. By focusing on the medium of delivery and process used, not the message, and on the critical differences, not the coincidental similarities, which system was used becomes clear. The Influence Continuum helps make the difference between peaceful persuasion and coercive persuasion easier to distinguish.
VARIABLES
Not all tactics used in a coercive persuasion type environment will always be coercive. Some tactics of an innocuous or cloaking nature will be mixed in.
Not all individuals exposed to coercive persuasion or thought reform programs are effectively coerced into becoming participants.
The program's effectiveness, and the degree of damage done to its victims, is determined by how an individual's suggestibility and psychological and physiological strengths, weaknesses, and differences interact with the severity, continuity, and comprehensiveness with which the various tactics and content of a coercive persuasion program are applied.
For example, in Molko v. Holy Spirit Assn., 46 Cal.3d 1092 (1988), the California Supreme Court found that
"when a person is subjected to coercive persuasion without his knowledge or consent... [he may] develop serious and sometimes irreversible physical and psychiatric disorders, up to and including schizophrenia, self-mutilation, and suicide."
WHAT ARE THE CRITERIA OF A COERCIVE PERSUASION PROGRAM?
A). Determine whether the subject individual had enough knowledge and volitional capacity to make the decision to change his or her ideas or beliefs.
B). Determine whether that individual did, in fact, adopt, affirm, or reject those ideas or beliefs on his own.
C). Then, if necessary, examine the behavioral processes used in the "conversion," not the ideological content. Each alleged coercive persuasion situation should be reviewed on a case-by-case basis. The characteristics of coercive persuasion programs are severe, well understood, and not accidental.
COERCIVE PERSUASION IS NOT VOLUNTARY, PEACEFUL, RELIGIOUS PRACTICE OR CENTRAL TO ANY BONA FIDE RELIGION.
Coercive persuasion is not a religious practice; it is a control technology. It is not a belief or ideology; it is a technological process.
As a PROCESS, it can be examined by experts on its technology COMPLETELY SEPARATE from any idea or belief content, just as the technical process of hypnotic induction can be examined distinct from the meaning or value of the post-hypnotic suggestions.
Examining PROCESSES in this manner cannot violate First Amendment religious protections.
Coercive persuasion is antithetical to the First Amendment. It is the unfair manipulation of others' biological and psychological weaknesses and susceptibilities. It is a psychological FORCE technology, belonging not to a free society but to a criminal or totalitarian one. It is certainly not a spiritual or religious technology.
Any organization using coercive persuasion on its members as a CENTRAL practice that also claims to be a religion is turning the SANCTUARY of the First Amendment into a fortress for psychological assault. It is a contradiction of terms and should be "disestablished."
Coercive persuasion is a subtle, compelling psychological force that attacks an even more fundamental and important freedom than our "freedom of religion." Its reprehensibility and danger lie in its attack on our self-determination and free will, our most fundamental constitutional freedoms.