Margaret Thaler Singer, Ph.D. Maurice K. Temerlin, Ph.D. Michael D. Langone, Ph.D.
Although the term “cult” is usually associated with religious groups, nonreligious cults are receiving increasing attention. This paper examines the common features of cultic groups, in particular the use of thought reform, a process through which indoctrination and behavior changes are brought about in a number of contemporary situations. Several psychotherapy cults are described to illustrate the coordinated programs of exploitative influence and behavior control that characterize these groups.
The term “cult” is often associated with a process that has been given a variety of labels, including “thought reform” (Lifton, 1961), “coercive persuasion” (Schein, 1956, 1961), “brainwashing” (Hunter, 1953), “mind control” (Langone, 1988), the “systematic manipulation of psychological and social influence” (Singer, 1983), “coordinated programs of coercive influence and behavior control” (Ofshe & Singer, 1986), and “exploitative persuasion” (Singer & Addis, 1991). These terms reflect somewhat varying perspectives or attempts to explain to different audiences a complex and subtle process composed of techniques, tactics, and strategies of social influence long studied by social psychologists, social anthropologists, and marketing researchers (Cialdini, 1984; Nader, 1991; Zimbardo, Ebbesen, & Maslach, 1977). In this paper we will use the term “thought reform” because it is brief, has achieved wide usage in the field, is not easily susceptible to exaggerated interpretations, and succinctly describes what goes on in the process under investigation, i.e., affected individuals, as a result of planned and systematic psychosocial manipulation imposed by others, are led to adopt radically different beliefs and to conform their behavior accordingly.
Whereas “thought reform” refers to a particular process of planned and systematic psychosocial manipulation, “cult” refers not to ideological content, as some mistakenly believe, but to certain social structures and relationships that shape the behavior, thoughts, and feelings of members so as to serve the wishes and needs of the leader(s). Thus, a cult may form in any content area: politics, religion, commerce, philosophy, health, science fiction, psychology, etc. That many persons still mistakenly believe that all cults are religious perhaps reflects the publicity religious cults received during the late 1970s and early 1980s. This paper helps dispel this misconception by briefly defining “cult,” outlining the basic features of thought reform as it was originally conceived and in its contemporary form, and illustrating these concepts by describing a variety of psychotherapy cults.
Definitional Issues: Cults
Singer (1986) stated that cultic relationships refer to those relationships in which a person intentionally induces others to become totally or nearly totally dependent on him or her for almost all major life decisions, and inculcates in these followers a belief that he or she has some special talent, gift, or knowledge. (p. 270)
A related definition was proposed at a conference sponsored by the UCLA Neuropsychiatric Institute, the American Family Foundation, and the Johnson Foundation:
…a group or movement exhibiting a great or excessive devotion or dedication to some person, idea, or thing, and employing unethically manipulative techniques of persuasion and control (e.g., isolation from former friends and family, debilitation, use of special methods to heighten suggestibility and subservience, powerful group pressures, information management, suspension of individuality or critical judgment, promotion of total dependency on the group and fear of leaving it, etc.) designed to advance the goals of the group’s leaders, to the actual or possible detriment of members, their families, or the community. (Cultism: A conference for scholars and policy makers, 1986)
Cults, then, are likely to exhibit three elements to varying degrees:
- members’ excessively zealous, unquestioning commitment to the identity and leadership of the group;
- the induction of dependency through the use of manipulative and exploitative techniques of persuasion and control; and
- the tendency to harm members, their families, and/or society.
Because cults profess to help members but in actuality exploit them, they develop a double agenda in which they employ a dual set of norms in operation at the same time, with the surface norms subservient to the deeper, hidden designs and purposes of an organization or group. Surface norms stress the idealism and the righteousness of the cause. Below the surface, however, are a set of underlying norms that efficiently run the organization. (MacDonald, 1988, p. 68)
Because cults tend to be leader-centered, exploitatively manipulative, and often harmful, they come into conflict with and are threatened by the more rational, open, and benevolent systems of members’ families and society at large. Some cults eventually disintegrate as a consequence of this tension. Some gradually accommodate to society by decreasing their level of manipulation, and consequently, exploitation, harm, and opposition. Others, however, isolate themselves, psychologically if not physically. In order to manage the threat posed by the outside world and to advance the goals of the leader(s), these groups tend to:
- dictate – sometimes in great detail – how members should think, act, and feel;
- claim a special, exalted status (for example, occult powers, a mission to save humanity) for themselves and/or their leader(s); and
- intensify their opposition to and alienation from society at large.
According to these definitions, cults differ from “new religions,” “new political movements,” “innovative psychotherapies,” and other “new” groups in that the former tend to use exploitatively manipulative techniques of influence and subordinate the well-being and welfare of followers to the benefit of the leader(s).
Cults also differ from purely authoritarian groups, e.g., military organizations and some types of sects and communes. The latter, though rigid and controlling, lack a double agenda and are not extremely manipulative and leader-centered. The social control rules of such authoritarian groups, even though sometimes coercive, are consistent, visible, and understood; they are not hidden. Though their decision-making structures are hierarchical, the leaders of purely authoritarian groups serve the group’s interests, not their own. Moreover, most authoritarian groups, e.g., the military, are accountable to higher authorities. Those who head cults, on the other hand, answer to no one.
Because democratic societies value the individual’s freedom, autonomy, and dignity, cultic groups generate considerable criticism. The cause of these negative evaluations, however, is not the newness or unusual beliefs of these groups, but their conduct, especially their methods of recruiting, indoctrinating, and exploiting members.
Cultic manipulations, of course, can influence some persons more easily than others. Ash (1985) notes that the following types of factors render some persons especially vulnerable to cultic manipulations: high level of current distress, cultural disillusionment in a frustrated seeker, lack of an intrinsic religious belief/value system, and dependent personality tendencies as indicated by a lack of inner direction, lack of adequate self-control (e.g., unassertiveness), low tolerance for ambiguity, and susceptibility to trance states.
Although cult recruits may be vulnerable in various ways, cults are, nevertheless, strikingly successful in bringing about and maintaining substantial behavioral and psychological changes in members. To establish a baseline against which to compare the power of cult environments, consider that with respect to nonbelievers at Billy Graham Crusades “2% – 5% of the attendees ‘make a decision for Christ’ and only about half of these converts are active a year later. About 15% remain permanently converted” (Frank, 1974, p. 82), i.e., are active ten years or more later. Thus, less than one percent (.30% – .75%) of nonbeliever attendees at Billy Graham Crusades remain converted. Moreover, after the crusade, these people return home to their families, their jobs, and their established identities. In contrast, two studies of the less successful centers of one organization found that approximately 10% of the persons recruited into an introductory workshop leave their old lives behind and become full-time missionaries for the group within one month (Barker, 1983; Galanter, 1980), with 5% remaining members after two years (Barker, 1983). A close examination of the Galanter study, however, indicates that the percentage joining may be even higher than reported. Four “dropouts” in the Galanter study had been taken away by parents. If these persons had stayed, the percentage remaining after one month would have been 13%, rather than 9%. Moreover, the three workshop centers Galanter examined are well known by those familiar with the group to be less effective than the San Francisco center, about which there has been considerable controversy. Most of the recruits in these centers had simply been approached on the street, whereas most nonbelievers at Billy Graham Crusades had already had substantial contact with evangelists before attending the Crusade (Billy Graham Association, personal communication to Dr. Langone, 1989).
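The baseline comparison above reduces to simple percentage arithmetic. The sketch below reproduces it; note that the sample size of 100 used in the Galanter recalculation is an illustrative assumption for round numbers, not a figure reported in the study.

```python
# Billy Graham Crusade baseline (Frank, 1974):
# 2%-5% of nonbeliever attendees "make a decision for Christ,"
# and about 15% of those converts remain permanently converted.
decision_rate = (0.02, 0.05)
permanence = 0.15
low, high = (r * permanence for r in decision_rate)
print(f"{low:.2%} - {high:.2%}")  # 0.30% - 0.75% of attendees remain converted

# Galanter recalculation: about 9% of workshop recruits joined within a month.
# Assuming a sample of 100 (illustrative assumption only), adding back the
# four "dropouts" removed by their parents raises the rate to roughly 13%.
joined, removed, sample = 9, 4, 100
print(f"{(joined + removed) / sample:.0%}")  # 13%
```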
Some investigators (e.g., Barker, 1983) misinterpret these findings as evidence that thought reform does not occur in groups commonly alleged to be cults. Their error, however, lies in assuming that thought reform is virtually 100% effective, which no serious researcher has ever claimed.
Thought Reform: Historical Background
Singer (1986) notes that during this century a series of events has demonstrated that individual autonomy and self-identity are much more fragile than was commonly believed. During the Russian purge trials of the 1930s, men and women were manipulated into falsely confessing to crimes and falsely accusing others of having committed crimes (Mindszenty, 1974). The world press expressed bewilderment and amazement at the phenomenon but, with few exceptions, soon ceased to pay attention to it (Rogge, 1959). The late 1940s and early 1950s witnessed the effects of the revolutionary universities in China and the subjugation of an entire nation to a thought reform program that induced millions to espouse new philosophies and exhibit new behaviors (Chen, 1960; Hinkle & Wolff, 1956; Hunter, 1953; Lifton, 1961; Meerloo, 1951; Sargant, 1951, 1957, 1973; Schein, 1961). Next came the Korean conflict, in which United Nations prisoners of war were subjected to an indoctrination program based on methods growing out of the Chinese thought reform program, combined with other social and psychological influence techniques (Farber, Harlow, & West, 1956; GAP, 1956, 1957; Schein, 1956).
At that time, the term “brainwashing” was introduced into our vocabulary, “a colloquial term applied to any technique designed to manipulate human thought or action against the desire, will, or knowledge of the individual” (Encyclopedia Britannica, 1975). Because of sensationalized journalistic accounts, however, “brainwashing” took on a sinister, mysterious connotation. Lifton’s (1961) concept of “thought reform,” besides being a more accurate translation of the Chinese term than “brainwashing,” helped explain, in addition to confessions by internees, the remarkable psychological changes produced, without violence or coercion, in Chinese civilians in nonprison settings. Schein’s work (1956, 1961) also clearly demonstrated that neither a prison setting nor physical threats are needed to effect thought reform.
Following the Korean conflict, valuable, though sometimes controversial, psychological research helped illuminate the processes by which individuals could be controlled. Asch’s (1952) conformity studies, Milgram’s (1974) shock experiments, and Zimbardo’s (Zimbardo, Ebbesen, & Maslach, 1977) prison role-play experiment are among the better-known examples (see Cialdini, 1984, for a review of the social psychological research).
As this academic work proceeded, other significant events rekindled the public’s interest in influence and control processes. Charles Manson’s diabolical control over a group of middle-class youths shocked the world during the early 1970s (Atkins, 1978; Bugliosi & Gentry, 1974; Watkins, 1979). By the mid-1970s, thousands of families in the United States were puzzled and alarmed about the influence a vast array of new gurus, messiahs, and mind-manipulators had over their offspring. Then, on November 18, 1978, Jim Jones led 912 followers to death in a Guyanese jungle (Reiterman & Jacobs, 1982). This tragedy brought the concept of thought reform to the attention of the world.
After Jonestown, public interest in cultic groups increased significantly. Initially, most attention focused on religious cults, especially those with bizarre trappings or an exotic, eastern flavor. But as time passed, cultic features attributed to fringe Christian groups (Enroth, 1986), large-group awareness trainings (Cinnamon & Farson, 1979; Finkelstein, Wenegrat, & Yalom, 1982), controversial drug rehabilitation programs (Gerstel, 1982; Hawkins, 1980; Hawkins & Wacker, 1983; Mitchell, Mitchell, & Ofshe, 1980; Rebhan, 1983), and psychotherapy groups (Temerlin & Temerlin, 1982, 1986) led to a broader application, and consequent confusion, of the terms “cult” and “thought reform.” Here we propose a clarification: cult refers to a particular power structure, and thought reform refers to a particular kind of social and psychological influence process. A group practicing thought reform need not necessarily be a cult, but a cult usually will practice thought reform in order to maintain its power structure.
Thought Reform: First and Second Generation Programs
Lifton (1961) described “eight psychological themes which are predominant within the social field of the thought reform milieu” (p. 420):
- milieu control,
- mystical manipulation,
- the demand for purity,
- the cult of confession,
- the sacred science,
- loading the language,
- doctrine over person,
- the dispensing of existence.
Ofshe and Singer (1986) were the first to distinguish between “first generation of interest programs,” which Lifton’s work helped illuminate, and “second generation of interest programs,” on which they focused. First generation of interest programs included Soviet and Chinese programs designed to extract false confessions, inculcate desired political and social beliefs, and ensure conformity and obedience to the demands of leaders. The “management” of these state-sponsored, first-generation thought reform programs controlled from the outset the material and social sources of feedback and reward/punishment for persons in the program. Through the skillful use of aversive arousal and peer pressure, leaders succeeded in altering the expressed political beliefs and attitudes of targeted persons.
Second generation of interest programs, such as those associated with cultic groups, also tend to be nonviolent and, furthermore, lack the physical power and authority of the state. Therefore, in order to control targets, these programs have had to rely on subterfuge and capitalize on natural areas of overlap between themselves and prospects. Like first generation programs, second generation programs use social influence techniques, tactics, and strategies that are well-documented in social psychological, marketing, and social-anthropological research literature (Cialdini, 1984; Gerstel, 1982; Hawkins, 1980; Hawkins & Wacker, 1983; Nader, 1990; Rebhan, 1983; Zimbardo et al., 1977). It is noteworthy that more persons break down psychologically in the second generation of interest thought reform programs, which lack the near-constant personal monitoring of subjects in first generation programs (Hinkle & Wolff, 1956; Lifton, 1961; Ofshe & Singer, 1986; Singer & Ofshe, 1990).
Second generation of interest programs initially present themselves as benevolent, promising to fulfill the needs of prospects. Recruiters shower much attention and other positive reinforcement on prospects. Seemingly intimate and caring conversations enable recruiters to assess the psychological and social states of prospects, to learn about their needs, fears, dependency potential, and actual and possible resistances. Testimonies from group members, credentials (whether valid or bogus) of leaders, attacks on the group’s competitors, and prospects’ favorable reaction to members’ seemingly warm and caring attentiveness tend to support the group’s claim of benevolence and superiority, and to convince prospects that they will benefit by joining the group.
Those prospects who do commit to the group are rarely aware of the subtle techniques of persuasion and control shaping their behavior, thoughts, and feelings. The apparently loving unanimity of the group masks strict rules against private as well as public dissent. Questions are deflected; critical comments are met with smiling pleas of “no negativity,” or some other “thought-terminating cliche,” to use one of Lifton’s terms. If prospects or new members persist in “negativity,” they will be reminded of personal problems, doubts, and guilty memories that they have revealed to leaders. Doubt and dissent are thus interpreted as symptoms of personal deficiency.
Prospects and new members slide down a spiral of increasing dependence on the group. They are often encouraged or ordered to live with other group members. In many cases, they even work with other members. People outside the group are viewed as spiritually, psychologically, or socially inferior, or as impediments to the members’ development. In order to “advance” at a satisfactory pace, members must spend long hours involved in various exercises deemed necessary by the group. In short, members spend more and more time with and under the direction of the group.
To ensure continuation of the group’s rewards (praise, attention, promise of future benefits, etc.), members must implicitly, if not explicitly, acknowledge the group’s authority in defining what is real, good, and true. In order to ensure that this acknowledgment is not mere lip service, the group continually challenges and tests members by establishing extremely high, if not impossible, expectations regarding activities (e.g., fund-raising, recruiting new members) and personal “development” (e.g., to be free of “negative” thoughts and doubts). Because dissent, doubt, and negativity are forbidden, members must project a facade of “happiness” and agreement while struggling to achieve the impossible. Those who fail to project the requisite facade (because, for example, they admit, usually with much guilt, to harboring doubts about the group) are attacked and punished, sometimes viciously. Those who persist in “failing through honesty” are, by one means or another, driven out of or ejected from the group. Those who succeed, whether without punishment or after punishment, do so because they learn to deceive themselves and others. They learn, much like hypnotic subjects exhibiting trance logic, how to convince themselves that the group is always right, even if it contradicts itself. Increasing isolation from the world outside the group, exhausting attention to activities serving the group, and hours practicing exercises that induce dissociative states (e.g., meditation, chanting, speaking in tongues) facilitate this splitting process, which in certain instances resembles what Lifton (1987) calls “doubling.” Psychological splitting enables members to adapt to the group’s double agenda, i.e., its contradictory sets of social rules. 
Members find themselves in a “loyalty/betrayal funnel” (MacDonald, 1988): if they remain loyal to their own perceptions about self and world, they betray the group on which they have become inordinately dependent; if they remain loyal to the group, they betray their own perception of what is real, good, and true. Dissent thus places members in a “funnel” from which there is no escape and which leads inevitably to betrayal either of themselves or the group. Hence, second-generation thought reform programs “attack the core sense of being — the central self-image, the very sense of realness and existence of the self. In contrast, the attack of first-generation programs is on a peripheral property of self, one’s political and social views” (Ofshe & Singer, 1986, p. 18). If second-generation programs, which operate in free, open societies, did not attack central elements of members’ selves, they would not survive. Information from outside the group would neutralize peripheral political and social indoctrination, much as it did to thought reform victims when they were released from captivity in Korea, China, and elsewhere.
Psychotherapy Cults: Case Examples
Psychotherapy cults may arise from the distortion and corruption of long-term individual therapy (Temerlin & Temerlin, 1982; Conason & McGarrahan, 1986) or group psychotherapy (Hochman, 1984), or may be started by a variety of nonprofessionals (West & Singer, 1982; Singer, 1983, 1986). Leaders of the groups reported on here ranged from college faculty members to a paroled felon with less than a high school education.
The authors have independently studied 22 psychotherapy cults over recent years. The groups ranged in size from 15 members to, in two cases, more than 300 at their peak; the larger of these two groups had 350 live-in and 400 peripheral members. The groups have existed from 5 to 25 years, and all but one are still in existence. Fifteen were led by professionally trained persons (psychologists, psychiatrists, social workers) who as time went on tended to raise former patients to “therapist” status in the groups. Seven were run by nonprofessionals (ranging from former clerks to convicted felons). The “therapists” were, with one exception, Caucasian. The patients were primarily middle-class to upper-middle-class Caucasians with some college or advanced degrees. The groups were located in six states.
Traditional methods of field research in the behavioral sciences were followed. Personal interviews were conducted with as many informants from a group as were available. Documents (legal, media, in-house papers and published writings) were examined. When available, tapes made by the group leaders and group members while they were still in the group were reviewed. When possible, multiple informants were interviewed. Although in a few instances only one or two members of a group were available for study, as many as thirty-seven informants from one group were interviewed. Interviews ranged from two hours to dozens of hours per group. Some persons came to two of the authors for therapy; some were met in the context of litigation; some were sought out because of community knowledge of their experiences. Many persons whom we interviewed were excluded from this report because they had been the victims of abusive, illegal, or aberrant therapy practices in a setting that did not meet the definition of a cult. Dr. Temerlin interviewed and/or treated 38 persons (17 in long-term psychotherapy). Dr. Singer interviewed and/or treated 82 persons.
Trainees as followers. The senior author studied a cultic group that evolved when a psychiatrist and his wife offered their clinic as a supervision placement for students working on advanced degrees in psychology and counseling. The trainees are induced to move onto the property owned by the couple, to get money from their families for therapy with the husband and wife, and to follow only this one form of therapy in other field-work placements. The couple induces followers to believe that only this therapy can save them and the world. The patient-trainees are induced to get younger siblings to move in and pay for therapy and to recruit other trainees at their schools to join the “movement.” The group has grown and moved to a rural setting where the members are involved in running a residential treatment program for severely psychotic patients. The followers maintain the property, care for these patients, attempt to recruit other trainees and patients, and, as in the groups Temerlin and Temerlin studied (see below), are compiling and attempting to edit the taped ramblings of the leaders.
Temerlin and Temerlin. In 1982 Temerlin and Temerlin reported on five bizarre groups of mental health professionals which were formed when five teachers of psychotherapy consistently ignored ethical prohibitions against multiple relationships with clients. Patients became their therapists’ friends, lovers, relatives, employees, colleagues, and students. Simultaneously they became “siblings” who bonded together to admire and support their common therapist. (p. 131)
These role violations were compounded by the therapists’ use of indirect, deceptive, and coercive influence techniques, which led patients to comply with the therapists’ wishes. These therapists violated ethical prohibitions against forming exploitative relationships with clients, misusing therapeutic techniques, and manipulating therapeutic relationships to the advantage of the therapists. These cults were formed when professionals deviated from an ethically based, fee-for-service, confidential relationship with clients and brought clients together to form cohesive, psychologically incestuous groups. Instead of transferences being studied and understood, leaders were idolized. Instead of personal autonomy being encouraged, patients were led into submissive, obedient, dependent relationships with their therapists. Their thinking eventually resembled what Hoffer (1951) saw in “true believers” and what Lifton (1961) termed “totalistic.” That is, the clients were induced to accept uncritically their therapists’ theories, to grow paranoid toward the outside world, to limit relationships and thinking to the elite world created by the cult-producing therapists, and to devote themselves selflessly to their therapists. The groups studied by Temerlin and Temerlin varied in size from 15 to 75 members and existed from 10 to 15 years.
The Sullivanians. For some years the media have reported the allegations made by former members of a New York group called the Sullivanians. The allegations, usually brought out in child custody suits, have been presented in the press since at least 1975 (Black, 1975). Recently, nearly identical allegations have been presented in other custody cases (Conason & McGarrahan, 1986; Henican, 1988; Lewin, 1988; McMorris, 1988; Reed, 1988; Span, 1988). Lewin (1988) stated:
For twenty years, the Sullivanians have been a quiet presence in Manhattan — a collective whose 200 members live together in three buildings on the Upper West Side and run the Fourth Wall Repertory Company, a political theater group in the East Village. (B.1)
But many of those who left the group say the Sullivanians have become a bizarre psychotherapy cult whose leaders control every aspect of the members’ lives, including their living arrangements, sexual practices, choice of profession, hobbies, child-rearing practices, and required thrice-weekly therapy sessions with a member of the Sullivan Institute for Research in Psychotherapy. (B.1)
A father explains:
The basic idea of the group is that the nuclear family is the root of all evil, and that a child shouldn’t have a special relationship with his parents, just as adult Sullivanians aren’t supposed to talk to their parents. . . Everyone, even the kids, is supposed to have as many “dates” as possible. I calculated that in one week, my son had dates with 23 different people. I don’t want him to live like that. (Lewin, 1988, B.1)
Lewin (1988) further reports that group members — even those who are married to each other — live in apartments with more than a dozen roommates of the same sex and are encouraged to sleep with a different member of the opposite sex each night.
The group reportedly began in 1957 when a number of dissident therapists broke away from the William Alanson White Institute, which had been founded by Harry Stack Sullivan. Interviews with twelve former members of the group and an examination of legal documents (Dobash v. Bray, 1985; Sprecher v. Sprecher, 1985) and media reports, such as those cited earlier, revealed that what had started out as a therapy center evolved into a psychotherapy collective and finally into a cultic organization that controlled almost all areas of so-called “patients'” lives. Litigation between former and current members is still in progress in the New York Supreme Court (Dobash v. Bray, 1985; Sprecher v. Sprecher, 1985).
Center for Feeling Therapy. Hochman (1984), writing about a now defunct school of psychotherapy, the Center for Feeling Therapy, described the many iatrogenic symptoms he found in former clients and patients who had been members of this group:
A cult that is destructive. . .veers toward remolding the individual to conform to codes and needs of the cult, institutes new taboos that preclude doubt and criticism, and produces a kind of splitting where cult members see themselves as an elite surrounded by unenlightened, and even dangerous, outsiders. (p. 367)
This group, which lasted approximately ten years, consisted of 350 patients living near one another and sharing homes in the Hollywood district of Los Angeles. Hundreds more were nonresident outpatients, and others communicated with “therapists” by letter. (Some therapists were licensed; others allegedly were patients assigned to be therapists.) Maximum benefit supposedly came only to residents, and patients were led to see themselves as the potential leaders of a therapy movement that would dominate the 21st century. The leaders promulgated a “theory” which maintained that individuals function with “reasonable insanity.” But if they learn to “go 100%” in five areas — expression, feeling, activity, clarity, and contact — they can put aside their “old images” and become “sane,” which was defined as the “full experiencing of feelings.” This latter, ambiguous objective was purported to be the attainment of the next stage in human evolution (Mithers, 1988).
Numerous legal actions (Hart et al. v. McCormack et al., 1985; Raines et al. v. Center Foundation, 1985; State of California, Board of Behavioral Science Examiners v. Cirincione, 1985; State of California, Psychology Examining Committee, Division of Allied Health Professions, Board of Medical Quality Assurance v. Corriere, Gold, Hart, Hopper and Karle, 1985; State of California, Board of Medical Quality Assurance, Department of Consumer Affairs’ Board of Medical Quality Assurance v. Woldenberg, 1985; Timnick, 1986, 1987) concern the Center for Feeling Therapy. In her work on several of these cases, Dr. Singer interviewed 37 former members of this group and studied 92 affidavits, countless legal documents, and dozens of hours of taped therapy sessions. Dr. Temerlin also interviewed a number of former members and studied the cited collateral sources. (The exact number he interviewed is not known, as Dr. Temerlin died before this paper was completed.)
In these legal cases, which are cited above, defendants were charged with extreme departures from the standards of psychology, the standards of medicine, and the standards of psychotherapeutic care, including the following allegations:
- Created a sense of powerlessness in purported patients by stripping them of social support (friendship, kinship, ordinary environment, central occupational roles, wealth) and psychological confidence (through ridicule and creating states of physical exhaustion), and then enforced massive new learning demands through a reward/punishment mechanism (including threatened loss of status, anxiety and guilt manipulations, and physical punishment, as well as sexual harassment).
- Utilized racial, religious, and ethnic slurs; physical and verbal humiliation; physical and especially sexual abuse; threats of insanity and violence; and enforced states of physical and mental exhaustion.
- Represented to Center patients that they should hate and blame their parents for making the patients “crazy,” give up their children for adoption, and have abortions, ostensibly because Center members were too “crazy” to be parents.
- Engaged in sexual intimacies with patients, beat and caused patients to be beaten by other patients, allowed and encouraged nonlicensed “therapists” to conduct unsupervised therapy sessions.
- Instructed clients to strip to their underwear and stand in a “stress position,” legs bent, for an hour and a half.
- Collected “donations” running into thousands of dollars from individual patients for the proposed building of a gym on the Center grounds but used the money to buy a ranch with other therapists in Arizona.
- Made patients stand naked in front of groups and ordered patients to inspect the genitals of other patients in front of groups.
- Made a male patient, who wanted to return to college to study music rather than work as a mechanic in a Center business, wear diapers, sleep in a crib, and eat baby food for eight weeks because, his therapist said, the patient wanted to live his life like a baby.
Timnick (1986), calling the Center “a once-trendy therapeutic community,” reported that the above legal hearings had “become the longest, costliest, and most complex psychotherapy malpractice case in California history” (p. 3). More than 100 former patients filed complaints of fraud, sexual misconduct, and abuse. Civil cases were settled for more than six million dollars on behalf of former clients (Timnick, 1987).
“Dr. Tim.” A forty-year-old, divorced, licensed clinical psychologist developed a cultic following in 1971, a portion of which still exists even though he has been dead for several years. Dr. Tim had clients move into his house. He charged them a monthly therapy fee plus room and board and directed their lives. Dr. Tim and most of his followers fled overseas together from an eastern state when legal charges were filed against him, including accusations that he engaged in sex with minors. The group lived communally for about seven years, until once again similar legal charges threatened the leader. The group moved a third time, returning to a western state.
Once established, the group averaged forty members, including a few children. There was considerable turnover in membership during the thirteen years the group existed, even though Dr. Tim warned members that leaving him would condemn them to lives of mental suffering. Patients recruited replacements for anyone who left. Leaving was difficult, however: Dr. Tim sent the largest men in the group to retrieve anyone who left and could be located, and persons who tried to leave openly were physically restrained.
Dr. Tim told clients that he was “more enlightened than Jesus…and had created the ultimate therapy, combining Freud, Zen, Kundalini yoga, and LSD.” The latter, he said, was to “override their egos.”
Tim tolerated no criticism or complaints, as such behavior indicated “being in your head” rather than “in your feelings.” Anything other than feeling was labeled “being in your stuff” and, therefore, an indication of one’s mental disorder.
In an initial individual therapy session, Dr. Tim privately diagnosed each new member as showing covert signs of a severe mental illness (e.g., paranoid schizophrenia, manic-depression) and announced that only he could cure the person.
In group sessions members were confronted, humiliated, and chastised by him as “dumb, stupid, and crazy.” They were told that their parents, especially their mothers, had caused their mental illness and they were to “dissociate” from them, except as sources of money for therapy. All phone calls involving parents were surreptitiously taped and played in group therapy sessions to demonstrate how harmful parents were to the patients.
Therapy included replacing intellectual careers with menial physical labor, ostensibly “to learn about the body, to have Zen experiences, and to learn to feel.” It also appears, however, that Dr. Tim wanted house and yard servants.
Additionally, the amounts and regularity of the LSD he encouraged followers to use impaired their attendance and performance at many jobs. Furthermore, low-paying jobs in nearby motels and resorts made it very difficult for “patients” to accrue money and flee the group. Dr. Tim also indoctrinated members to believe that the group was all the “family” and friends they needed. After all, he maintained, they lived in a big house and had access to cars, sex, and therapy not available elsewhere.
Since Dr. Tim claimed that families were harmful, he broke up marriages, prevented new ones, and had children raised by “the group” rather than by their mothers. He also promoted homosexual as well as heterosexual contacts. He desensitized males by having four or five men live in a bedroom together and mutually masturbate. Then he introduced, as a yoga practice, having the men lie on the floor, each with one middle finger in his own mouth and the other in another man’s anus, while the same was done to him. While supervising these sessions, Dr. Tim would berate the men, who were bewildered because he had prescribed the practice. Dr. Tim had sexual liaisons with many of the women, several men, and certain teenage girls whose single parents were in the group. One nine-year-old girl reportedly was kept in her room for the major portion of three years, group members often forgetting her food because they were “stoned” on drugs. Dr. Tim owned all property and cars and often used material he had learned in individual therapy sessions as leverage to induce patients to turn property and possessions over to him.
Soon after moving back to the West Coast, a number of followers left. Some went to legal authorities, only to learn that Dr. Tim had not kept up his licensure or insurance. Dr. Tim died of cancer shortly after returning to the West Coast. A small group of his former “patients” still live near one another and meet to extol his virtues and wonder why their lives have “never worked.” Their confusion continues twenty years after Dr. Tim started the group, which he promised would cure and free them. Ex-members claim that they and the group who still cling to Dr. Tim’s memory have been lastingly damaged.
Nonprofessional Therapy Cults
Two parolees on the East Coast developed psychological cults in states that have no laws regulating psychological practice. The men drew upon their own group therapy experiences during incarceration to develop restrictive, cultic groups when they returned to their communities. One group was based upon “primal scream” techniques, the other upon confrontational attack therapy of a type often labeled a “Synanon-clone” program.
The first man operated out of a second-floor apartment in a busy metropolitan neighborhood. He recruited from nearby coffee houses, bookstores, and diners by approaching single males and females and inviting them to have coffee and talk with him about his “therapy.” Sometimes he used posters offering free lectures on “sex, psychology, and loneliness.” His street smarts, con-artist skills, and jargon combined to convey intense attention and seductiveness. He secured detailed histories in private sessions, later using that material in group sessions to demonstrate how “pained and damaged” each person was. He charged modest fees for initial meetings but over time developed costly “intensives.” He assembled extensive information about each person’s financial status, family, “hang-ups,” and social contacts, and combined this information with his own interpretations of primal scream techniques to strip individuals of their defenses and make them increasingly dependent on him. Over the past eight years, about fifteen persons have spent their free time with him, returning from work to his place for primal sessions, seeing him individually several times a week, and relying on him for most life decisions. Several college students secured money from parents to purchase “therapy” from him. He asked that all fees be paid in cash. There has been turnover among followers, but a few have been present since he began the group.
The second group was begun by a parolee described as a middle-class, fortyish man who had learned confrontational attack therapy while in a drug rehabilitation program to which he had been remanded in lieu of prison time. His history reveals a character-disordered man who, upon leaving his drug rehabilitation program, saw the economic advantages of providing “therapy” to troubled, employed adults in a state that has no legal requirements about who can proclaim themselves to be psychotherapists. Using his assured, smooth, aggressive, and controlling ways, he “set up shop.” Initially he attracted a few clients by initiating cafe and restaurant conversations, later instructing them to recruit their friends and families. Still later, he enlisted the cooperation of a psychiatrist and a psychologist, who were to “screen” and “study” certain of his clients, apparently to create an aura of “science and credibility.” Legal documents (not cited here in order to protect the anonymity of peripheral parties) suggested that no one he sent for screening was ever screened out or referred to more appropriate medical or psychological treatment, and that the research withered into a mere folder of random notes, drawings, and invoices for services.
For more than ten years he has controlled the lives of a group averaging sixty persons. Under his orders, these followers have limited their friendships to the group, severed relationships with their families of origin, spent most of their free time, including vacations, with him, and structured their lives according to his dictates. He directs marathon confrontational groups, gives individual counseling, supervises the medical, financial, and social lives of his “clients,” and has them spend their vacations with him on a suburban property they have helped finance in his name.
In another state, a man dropped his business career and began a cult composed primarily of airline flight attendants. He had participated in many encounter, sensitivity, and large group awareness training programs and had undergone considerable personal therapy. He had read widely in “pop” psychology and skillfully used various psychotherapy, interviewing, interpretative, confrontational, and defense-stripping techniques. He frequented restaurants near a local airport late at night. With a cup of coffee and the evening paper, he would politely approach a uniformed, female flight attendant who was eating alone and ask to share her table. Explaining that he was a single parent caring for his young children, who were at home with a sitter, he inquired about the woman’s career and family and did a quick screening interview to locate lonely, vulnerable, trusting women. He made no sexual overtures, but as the woman was finishing her meal he wrote his address and phone number on a card and said, in effect, “Tomorrow evening some friends are dropping by. I’ll be giving a little talk summarizing some of the reading and reflecting on psychology that I’ve been doing lately and have a few snacks and drinks for my friends. Please drop by if you can.” He eventually induced a number of women to move into his large home, pay rent, make “donations” for the group “lectures,” and receive psychological “counseling” from him. In time he induced his live-in “patients,” whom he counseled intensively, to limit their social life to the group and avoid former friends, family, and lovers. He met individually with each woman, repeatedly “analyzing” her past negative experiences and developing an intense “transference” relationship through which he made each woman increasingly dependent on him.
While there has been turnover in membership, he has managed to develop a coterie of long-term women followers who, when they are in town, are given the honor of being his lovers, baby-sitters, and housekeepers. Women more on the periphery are subjected to intense “uncovering” sessions in which he interprets their motivations and seeks to evoke intense “cathartic experiences.” Those who move more centrally into his domain replace those who move out.
Yet another cultic group was started about eight years ago by Ray, a man with no credentials. He has maintained a group averaging about thirty members during this period. A major portion of these followers are psychologists and graduate trainees in psychology. Ray attracts these professionals through widely publicized advertisements for seminars on empowerment. He states that he will teach them how to “merge, transform, and marry [their] own experience.” He claims he is “totally free, and if you want freedom badly enough, the universe just lays down at your feet.” He sells three-week basic trainings, usually held at attractive vacation resorts, in which he promises ways to “constantly recreate the self...how to bring no agendas with you and be totally free.” These skills in personal “being” are promised to make participants better therapists and “free” persons.
Ray selectively recruits certain attendees of the seminars to move to his home near a large city, where he avers that they will “transform, loosen up, learn to surrender, be in service, and get off their holding positions, and will learn to trust.” Tellingly, he maintains a “trust fund” to which followers are urged to contribute “cash only, no checks, no credit cards.”
After they move in, Ray tells his followers that they are “losers” who should surrender their lives to him as he is the “Master Guide.” Since most are professionals and from other states, they often have difficulty getting jobs because of licensing and other problems. Thus, most are forced to work at low-paying jobs or to borrow money in order to partake of Ray’s offerings.
Followers who leave report a constant pressure to be “new and active,” even as Ray tells them they are losers and puts them through odd and degrading “treatments.” These persons report that while they were in the group they were depressed, demoralized, and chronically anxious about how to “be.” Their self-esteem was crushed; they felt dependent and wrong, anxiously looking to Ray for behavioral cues. The predominant atmosphere of the group was one of contrast between Ray’s high energy and the followers’ dependent, used role. Former followers describe Ray as seeming to be the most innocent, the most tender, and yet the most ruthless man they ever met. He berates his followers to “recreate yourself by transforming, merging, and indulging — marry that experience. I’m going to empower you.”
One experienced psychologist in his late thirties gave up his administrative career in a reputable clinic to be in the group for several years. He remarked, “Somehow when I was around him, I lost my sense of self; I lost all my knowledge, all my diagnostic skills. I failed to recognize a brilliant psychopath had control over me.” The group continues to thrive, and Ray now has two large facilities to house followers.
The groups reported on here, whether started by trained therapists or nonprofessionals, grew out of the leaders’ assuming multiple, controlling roles over the lives of so-called patient-followers. The leaders of these psychotherapy cults seemed to corrupt and exaggerate “trendy” notions in psychology and pop psychology and to make unlimited claims for personal powers and skills. Thus, they constantly denigrated parents, marriage, and the family unit and extolled the raw expression of “feelings” while putting down intellect and reason as hindrances to personal growth.
Some, but not all, of the leaders widely promulgated the “getting out of your head” notion and, consequently, had followers drop technical or professional careers. In many cases, the resulting drop in income rendered followers even more dependent on the leader.
The personality, character traits, and fantasy lives of the leaders of such cultic groups appear to color and direct the paths a particular group takes. Several high-energy, glib, psychopath-like leaders, for example, created groups that they stirred into continual activity. In one case, a leader told a follower to move his residence 25 times in two months. In other cases, such as Dr. Tim’s homosexual group sessions, the leader slowly desensitized the members’ consciences and inhibitions in order to persuade them to take up conduct and values that enabled the leader to live out his fantasies, such as being a powerful person, being above the law, not having to earn a living, being cared for by devoted followers, indulging in unbridled sexual acts, etc. Temerlin and Temerlin’s early work involving trained therapists functioning as cult leaders (Temerlin & Temerlin, 1982, 1986) identified the major features of the psychotherapy cults reported on here. They wrote:
charismatic psychotherapists can so manipulate the therapeutic relationship that they produce groups which function much like destructive religious cults. . .The techniques used by cult therapists. . .a) increase dependence, b) increase isolation, c) reduce critical thinking capacity, and d) discourage termination of therapy. (Temerlin & Temerlin, 1986, p. 234)
Subsequent research, reported on here, extended Temerlin & Temerlin’s findings to psychotherapy cults produced by nonprofessionals. Like other types of cults, psychotherapy cults tend to employ coordinated programs of exploitative influence and behavioral control in order to subjugate members to the needs and wishes of leaders. These groups well illustrate the processes that characterize a thought reform program, and they clearly show how leaders attack and undermine followers’ sense of self, thereby depriving them of the capacity to make autonomous, informed judgments about the world and themselves.
References

Asch, S. E. (1952). Effects of group pressure upon the modification and distortion of judgments. New York: Holt, Rinehart and Winston.
Ash, S. M. (1985). Cult-induced psychopathology, part 1: Clinical picture. Cultic Studies Journal, 2, 31-90.
Atkins, S. with Slosser, B. (1978). Child of Satan, child of God. New York: Bantam Books.
Barker, E. (1983). The ones who got away: People who attend Unification Church workshops and do not become members. In E. Barker (Ed.), Of gods and men: New religious movements in the West. Macon, GA: Mercer University Press.
Black, D. (December 15, 1975). Totalitarian therapy on the upper West Side. New York Magazine, 54-56.
Brainwashing. (1975). Encyclopedia Britannica.
Bugliosi, V., & Gentry, C. (1974). Helter skelter. New York: Bantam Books.
Chen, T.E.H. (1960). Thought reform of the Chinese intellectuals. New York: Oxford University Press for Hong Kong University Press.
Cialdini, R. B. (1984). Influence: How and why people agree to things. New York: William Morrow and Company, Inc.
Cinamon, J.G., & Farson, D. (1979). Cults and cons: The exploitation of the emotional growth consumer. Chicago: Nelson-Hall.
Conason, J., & McGarrahan, E. (April 22, 1986). Escape from utopia. Village Voice.
Cultism: A conference for scholars and policy makers. (1986). Cultic Studies Journal, 3, 117-134.
Dobash v. Bray, Supreme Court of the State of New York, County of New York (1985).
Enroth, R. (October, 1986). Churches on the fringe. Eternity, 17-22.
Farber, I.E., Harlow, H.F., & West, L.J. (1956). Brainwashing, conditioning, and DDD (debility, dependency, and dread). Sociometry, 20, 271-285.
Finkelstein, P., Wenegrat, B., & Yalom, I. (1982). Large group awareness training. Annual Review of Psychology, 33, 515-539.
Frank, J. (1974). Persuasion and healing. New York: Schocken Books.
Gerstel, D.U. (1982). Paradise, incorporated: Synanon. Novato, CA: Presidio Press.
Group for the Advancement of Psychiatry. (1956). Factors used to increase the susceptibility of individuals to forceful indoctrination: Observations and experiment.
Group for the Advancement of Psychiatry. (1957). Methods of forceful indoctrination: Observations and interviews.
Hart et al. v. McCormack et al., Superior Court of the State of California, Los Angeles County, No. 000713 (1985).
Hawkins, J.D. (1980). Sidebets and secondary adjustments. Seattle, Washington: Center for Social Welfare Research, University of Washington (duplicated manuscript).
Hawkins, J. D., & Wacker, N. (1983). Verbal performance and addict conversion: An interactionist perspective on therapeutic communities. Journal of Drug Issues, 13, 281-298.
Hearst, P., with Moscow, A. (1982). Every secret thing. New York: Doubleday and Company, Inc.
Hennican, E. (May 31, 1988). Dads battle “cult” for children. Newsday.
Hinkle, L.E., & Wolff, H.B. (1956). Communist interrogation and indoctrination of “enemies of the state.” Archives of Neurology and Psychiatry, 76, 115-174.
Hochman, J. (1984). Iatrogenic symptoms associated with a therapy cult: Examination of an extinct “new psychotherapy” with respect to psychiatric deterioration and “brainwashing.” Psychiatry, 47, 366-377.
Hoffer, E. (1951). The true believer. New York: Harper & Row, Publishers.
Hunter, E. (1953). Brainwashing in Red China: The calculated destruction of men’s minds. New York: Vanguard.
Langone, M.D. (1988). Cults: Questions and answers. (Available from the American Family Foundation, P.O. Box 336, Weston, MA 02193.)
Lewin, T. (June 3, 1988). Custody case lifts veil on a “psychotherapy cult.” The New York Times, B1-B2.
Lifton, R.J. (1961). Thought reform and the psychology of totalism. New York: W. W. Norton.
Lifton, R.J. (1987). The future of immortality and other essays for a nuclear age. New York: Basic Books, Inc.
MacDonald, J. P. (1988). “Reject the wicked man” — Coercive persuasion and deviance production: A study of conflict management. Cultic Studies Journal, 5, 59-121.
McMorris, F. (June 3, 1988). Cultism and sex may hype trial. New York Daily News.
Meerloo, J.A.M. (1951). The crime of menticide. American Journal of Psychiatry, 107, 594-598.
Milgram, S. (1974). Obedience to authority: An experimental view. New York: Harper and Row.
Mindszenty, J. (1974). Memoirs. New York: MacMillan.
Mitchell, D., Mitchell, C., & Ofshe, R. (1980). The light on Synanon. New York: Seaview Books.
Mithers, L. (August, 1988). When therapists drive their patients crazy. California, 76-86, 135-136.
Nader, L. (1990). Harmony ideology. Stanford, California: Stanford University Press.
Ofshe, R., & Singer, M.T. (1986). Attacks on peripheral versus central elements of self and the impact of thought reforming techniques. Cultic Studies Journal, 3, 3-24.
Raines et al. v. Center Foundation, Superior Court of the State of California, Los Angeles County, No. 372-843, consolidated with C 379-789 (1985).
Rebhan, J. (1983). The drug rehabilitation program: Cults in formation? In D. A. Halperin (Ed.), Psychodynamic perspectives on religion, sect and cult. New York: John Wright.
Reed, S. (July 25, 1988). Two anxious fathers battle a therapy “cult” for their kids. People Weekly.
Reiterman, T., & Jacobs, J.R. (1982). Raven: The untold story of the Reverend Jim Jones and his people. New York: E.P. Dutton, Inc.
Rogge, O.J. (1959). Why men confess. New York: Thomas Nelson and Sons.
Sargant, W. (1951). The mechanism of conversion. British Medical Journal, 2, 311-316.
Sargant, W. (1957). Battle for the mind. New York: Harper and Row.
Sargant, W. (1963). The mind possessed. New York: Penguin Books.
Schein, E.H. (1956). The Chinese indoctrination program for prisoners of war. Psychiatry, 19, 149-172.
Schein, E.H. (1961). Coercive persuasion. New York: Norton.
Singer, M.T. (October 29, 1983). Psychotherapy cults. Address to Citizens Freedom Foundation, Los Angeles, CA.
Singer, M.T. (1986). Consultation with families of cultists. In L. C. Wynne, S.H. McDaniel, & T.T. Weber (Eds.), The family therapist as systems consultant. New York: Guilford Press.
Singer, M. T., & Ofshe, R. (1990). Thought reform programs and the production of psychiatric casualties. Psychiatric Annals: The Journal of Continuing Psychiatric Education, 20, 188-193.
Span, P. (July 25, 1988). Cult or therapy: The custody crisis. The Washington Post.
Sprecher v. Sprecher, Supreme Court of the State of New York, County of New York, Index no. 75207/85 (1985).
State of California, Board of Behavioral Science Examiners v. Cirincione, Franklin, Gold and Gross, No. M 84, L 31542 (1985).
State of California, Psychology Examining Committee Division of Allied Health Professions, Board of Medical Quality Assurance, Department of Consumer Affairs v. Corriere, Gold, Hart, Hopper and Karle, Case L-30665, D-3103 through 3107 (1985).
State of California, Board of Medical Quality Assurance, Department of Consumer Affairs v. Woldenberg, No. D-3108, L-30664 (1985).
Temerlin, M.K., & Temerlin, J.W. (1982). Psychotherapy cults: An iatrogenic perversion. Psychotherapy: Theory, Research, and Practice, 19, 131-141.
Temerlin, J.W., & Temerlin, M.K. (1986). Some hazards of the therapeutic relationship. Cultic Studies Journal, 3, 234-242.
Timnick, L. (April 21, 1986). State targets mental health officials’ licenses in complex malpractice case. Los Angeles Times.
Timnick, L. (September 30, 1987). Psychologists in “Feeling Therapy” lose licenses. Los Angeles Times, 1, 4.
Watkins, P. (1979). My life with Charles Manson. New York: Bantam Books.
West, L.J., & Singer, M.T. (1980). Cults, quacks and nonprofessional psychotherapies. In H.I. Kaplan, A.M. Freedman, & B.J. Sadock (Eds.), Comprehensive textbook of psychiatry, III. Baltimore: Williams & Wilkins.
Zimbardo, P.G., Ebbesen, E.B., & Maslach, C. (1977). Influencing attitudes and changing behavior: An introduction to method, theory, and applications of social control and personal power. Reading, MA: Addison-Wesley.
We are grateful to Jane Temerlin, M.S.W., wife and colleague of the late Dr. Maurice Temerlin, for her assistance in completing this manuscript, which meant so much to her husband.
This article is an electronic version of an article originally published in Cultic Studies Journal, 1990, Volume 7, Number 2, pages 101-125.