Why We Can’t Eliminate Cults, and How We Are Drawn In

ICSA Today, 2020, Vol. 11, No. 1, 8–14

Russell Bradshaw

This article is based on a paper presented at ICSA’s Annual Conference in Manchester, United Kingdom, in 2019.

Introduction: An Evolutionary Psychology Perspective

We can’t eliminate cultic groups because they appeal to two of our strongest human needs, and because psychopathic and narcissistic personalities may be inclined to exploit and control other persons.

First, we feel a deep-seated need to belong to a group. This is how our species has survived. Through millions of years of natural selection, we have evolved into social animals. Second, our social evolution and cognitive development eventually created a deep-seated human need to understand our life on earth: We try to understand our particular, personal place in family, society, and the cosmos. As we evolved as a species, we experienced a growing need to answer persistent questions that impact our personal well-being and our unique existence. These two basic needs, to feel part of a group and to understand our lives, though often unconscious, are now part of our evolutionary inheritance.

Membership in some kind of a group or groups is therefore rewarding and necessary for most people. Indeed, without such membership, people may experience serious mental and social problems. Many researchers feel that the present epidemic of depression and anxiety illnesses can be traced, at least partially, to increasing loneliness and social separation in our society. However, these deep-seated needs, to belong and to believe, also open the door to the manipulations of narcissistic and psychopathic leaders of cultic groups, which purport to offer both fellowship in a close-knit group and answers to existential questions.

In other words, cultic groups emerge as a form of social and existential collateral damage resulting from our deeply embedded evolutionary inheritance exploited by unscrupulous individuals. This outcome is especially evident in the context of our postmodern alienating and anomic society. Although our material well-being has increased in advanced societies, our cultural angst and insecurity have also increased. Adolescents and young adults in particular may find these inspiring and idealistic groups especially supportive and appealing as they intently seek their psychosocial identity (Erikson, 1980).

Obviously, not all groups are harmful. So how can we distinguish between a normal or beneficial social group and a harmful one? In his “Origins and Prevention of Abuse” article (ICSA Today, 2016, pp. 11–13), Michael Langone finds two critical characteristics that distinguish between these two kinds of groups: In benign or beneficial groups, he explains, members are treated as subjects, with respect for their individuality and autonomy; whereas, in harmful groups, members are treated as objects, attracted, retained, and manipulated at the expense of their autonomy through undue influence and coercive control. These latter are the groups often referred to as cults. Rod Dubrow-Marshall, Maarten van de Donk, and Wessel Haanstra (2019) give a comprehensive description of the powerful social-influence processes used to retain members after initial recruitment.

Our Basic Needs and Cult Recruitment

How, then, are our basic social and cognitive instincts—the need to belong and the need to understand—related to whether or not we become engaged in a cultic group? Although we all have these two evolutionary needs, not everyone becomes involved in a cult. Extremist groups are not active everywhere, and there is a host of individual and environmental variables that promote or hinder our general tendency to belong to a group. Among these variables are the following:

Intelligence. Research has shown that cultic group members are not more mentally or psychologically challenged than the rest of the population; in fact, the intelligence and educational level of members are often higher than the average. They are often seeking to improve themselves or society, and therefore may be more open to unusual or alternative approaches to life’s challenges. The explanation often provided for the success of cultic groups, that weaker or more troubled individuals become members, is not supported by research.

Life Circumstance. Everyone goes through psychosocial developmental stages, with their inherent transitions and crises. All human beings also have personal weaknesses, somewhere in their psyches (e.g., see the works of Freud, Erikson, Kohlberg, Fowler, and Maslow). Narcissistic leaders and cultic groups are adept at identifying situations of transition and crisis, in which people are vulnerable, and exploiting the human weaknesses that may be exacerbated under these conditions. These leaders often find promising recruits in self-help or therapy groups, political movements, religious gatherings, or pyramid business enterprises. Cult recruiters often seek out prospective recruits on college campuses, where students are at a particularly vulnerable stage of life as they emerge from the protective environment of their families and confront adult challenges on their own.

Chance. Perhaps one of the most important variables involved in who joins a cultic or extremist group and who does not may simply be that someone is in the wrong place at the wrong time.

I should note here that this paper does not deal with the special issues of people who are born or raised in cultic groups.

Universal Social Influence Processes

In understanding how and why people may be recruited into cultic groups, I have found the work of Robert Cialdini particularly helpful. In Influence: Science & Practice (1984), Cialdini described six social-influence processes: reciprocation, commitment and consistency, social proof, liking, authority, and scarcity. In Pre-Suasion: A Revolutionary Way to Influence and Persuade (2016), he adds a seventh universal social-influence process: unity. He proposes that these processes tend to produce a “click, whirr” automatic response in people. These unconscious, automatic reactions arose as survival responses in our early, physically dangerous environments; they were usually correct and gave us an evolutionary advantage. Sometimes, however, the automaticity of these ingrained responses can be maladaptive.

In Pre-Suasion…, Cialdini maintains that, when humans in our modern, evolved age find themselves overwhelmed by information, fake news, stress, and insecurity, they tend to take the same age-old mental shortcuts, unconsciously betting on the odds, and often ending up having the same, unreflective responses. In our modern, digital and high-technology age, the tendency toward this primitive-consent automaticity (based on our evolutionary history) becomes potentially dangerous. People often feel overwhelmed and stressed, unable to fully understand or control the increasing flood of new and revolutionary processes and inventions that are taking control of our lives. We become cognitively and emotionally numbed out. It is this overwhelmed state that triggers the old, unreflective, automatic reflexes of survival. This uncritical mental state makes us vulnerable and can be used (consciously or unconsciously) by cultic and extremist leaders and groups to gain influence.

In Influence…, Cialdini explains how his initial six fundamental psychological principles direct human behavior. They give the tactics of influence their power. These influence processes are used in all human groups, and are often consciously employed by advertising firms, businesses, and political groups, for example. He calls these principles weapons of influence. In Pre-Suasion…, he mentions how the (now seven) principles and tactics may be used by narcissistic leaders and their cultic and extremist groups in recruiting new members:

  • Reciprocity. If you give me something (goods or services), I am obliged to give you something back. This is the basis of social give-and-take. It is deep-seated primate behavior. Social bonding and hierarchy building are ultimately based on this behavior. For example, the Moonies (Unification Church) gave free flowers to people. This was a successful fundraising and support-building effort because people felt obliged to give them something back: money or liking, or even some form of commitment. The Hare Krishnas (ISKCON) gave passersby a free copy of the Bhagavad Gita, and people tended to feel obliged to give something in return. In my group, the guru gave me personal recognition and hundreds of dollars in gifts; so I felt obligated to work hard for his mission.

  • Consistency. As one interpretation of the saying “In for a penny, in for a pound” suggests, once an initial commitment is made (even if it is relatively insignificant), we have a tendency not to want to go against our original decision. This inclination comes from an unwillingness to admit we have made a mistake in the first place. In psychological theory and research, this is called cognitive dissonance, our tendency to resolve an apparent contradiction (dissonance) by rationalizing or suppressing the dissonant element. In one study, for example, when the end of the world didn’t occur at the appointed time, instead of followers feeling misled and becoming disillusioned, the belief of many actually increased because the cult leader explained that their “devotion” had saved the world from destruction! This explanation was consistent with and reconfirmed their previous belief.1 In my group, males were required to wear white clothes to spiritual functions. At first I thought this was silly. I felt like an ice cream salesman! I was convinced by a friend that this was only a cultural ingredient of the leader’s otherwise very idealistic and spiritual path. Wanting to conform, I concluded, “What the heck, no big deal.” I went along with the requirement, and I began to feel more like one of the group. This was just the first small step of many later rationalizations that kept coming.

  • Social Proof. “Truth ‘R Us”—we are convinced by the actions of trusted individuals, especially those like ourselves, or those we admire: If they think something is a good thing, well then, it probably is a good thing. As primates, we tend to follow our group. I became a member of a high-demand, cultic group in part because I was impressed by the high quality and idealism of the leader’s disciples—many of whom are still friends today, after 23 years in the group, and 16 years after having left it.

  • Liking. Friendly thieves—physical attractiveness (culturally defined) attracts! Similarity in background, interests, values, likes and dislikes, and so on—attract. Compliments given to us, and being recognized, also attract us, as do shared aspirations and values. Like the other six principles, this influence process is almost invisible.

  • Authority. Directed deference—a deep-seated, primate tendency to trust authority (even blindly) leads us to respect those with the apparatus of authority: military or police uniforms, priests’ collars, gurus’ robes, doctors’ white jackets and stethoscopes, chief executives’ expensive suits and ties, scientists’ and officials’ clipboards, and the like tend to engender obedience. “Follow the group leader” has been successful evolutionary logic for social animals for thousands of years. The classic studies of this phenomenon are Solomon Asch’s laboratory experiments (1956); Stanley Milgram’s Yale experiments (1963); and Philip Zimbardo’s Stanford Prison experiment (1971). Asch’s experiments showed how human perception could be influenced by external social factors; Milgram’s experiments focused on how authority figures (doctors, professors, psychiatrists, etc.) can influence and persuade us to go against our own better moral judgments; Zimbardo’s social-psychology experiment involved roleplaying between powerful prison guards and powerless prisoners, and the insidious development of coercive control and even brutality.

In my personal case, I was impressed by the fact that our leader regularly met with famous world leaders, celebrities, musicians, athletes, and religious figures: Gorbachev, Mandela, Mother Teresa, Carl Lewis, Whitney Houston, Carlos Santana (who actually became a disciple; the others were classified as “admirers”), numerous Nobel prize winners, Bill Clinton and numerous other politicians, Ravi Shankar, Leonard Bernstein, Pir Vilayat Khan, and Mats Wilander. (This was also a case of social proof.)

  • Scarcity. If we can be convinced that something (goods, services, psychological or spiritual benefits, or even smiles or tenderness) is in short supply, we will be influenced to want to have it—again, this is deep-seated, evolutionary behavior. For example, my group leader was sparing and unpredictable in giving personal recognition and praise. Once, after the group had spent all night framing and hanging the leader’s pen-and-ink drawings at an art gallery, the guru smiled at me. I thought, “Did you see that! Guru smiled at me! It was all worth it!”

  • Unity. We is the shared Me. The ability to influence (and change) others is often, and importantly, grounded in shared personal relationships, which create a “pre-suasive” context for assent (e.g., “OK, my dear friend; yes, I’ll join…”). This term was coined by Cialdini, and it means that the influencer lays the groundwork for an eventual attempt at persuasion—before the actual influence tactics (or “weapons of persuasion” [Cialdini, 1984/1993]) are employed. The relationship of unity is not “Oh, that person is like us” (although this also works to a lesser degree); rather, it is “Oh, that person is us.” Unity is about shared identities. It’s about the categories that individuals use to define themselves and their groups, such as race, ethnicity, nationality, and family, and also political and religious affiliations. A key characteristic of these categories is that their members tend to feel one with, merged with, the others. There is an overlapping of self and other identities within we-based groups. In the group I was in for more than twenty years, we all honestly felt that we were brothers and sisters, and we shared our lives together. This level of relationship resulted in an intense feeling of oneness, belonging, and security.

My Personal Experience of Being Recruited/Joining

When I was recruited into a New Age, high-demand, yoga group, I was a graduate student in a foreign country without a full-time job; I was largely unaware of my own personality and evolutionary-based weaknesses. I was also totally unaware of cultic groups, narcissistic spiritual leaders, and social-influence processes. This was in the early 1970s, before the exposés concerning high-profile, cult-like groups such as the Branch Davidians, Scientology, Aum Shinrikyo, Catholic Church sects and pedophile priests, the Osho/Rajneesh group, the Unification Church (Moonies), and others. It was a time often viewed as one of relative innocence, reflected in the Beatles, Eastern meditation, self-improvement, and progressive social change. I had started practicing Transcendental Meditation and was spiritually/philosophically seeking. As a high-school student I listened to Bob Dylan; Muddy Waters; Joan Baez; Doc Watson; Lightning Hopkins; John Lee Hooker; Peter, Paul, and Mary; Pete Seeger; Judy Collins… Although I came from a conservative, middle-class, Republican family, I was relatively open-minded.

A relative of my wife had become a disciple of a serious, celibate, and “God-Realized Master,” an Indian guru living in New York City (NYC) who was recognized and accepted at the United Nations headquarters. I resisted this relative’s well-intended and honest efforts for about four years. However, when my wife accompanied the relative on a spiritual pilgrimage to the guru’s NYC ashram and was convinced to become a disciple, I was forced to make a decision. At the time I made my decision, I was (ironically) at a bathing-suit-optional beach, drinking a beer, and enjoying a beautiful, sensual summer. I loved my wife and respected her decision; so I eventually shaved off my beard and mustache, cut off my ponytail, stopped drinking beer, and also became a disciple on a very serious, spiritual path. We later moved to Queens, NYC to be where the guru was.

As I am able to recognize now, many of the required values, beliefs, and behaviors for this discipleship were selectively concealed in the beginning. We were told we could merely “reduce” our sexual activity, for example: we did not have to totally renounce it; but later we found out that celibacy was a demand for single disciples. As novices, we were gradually induced to conform as we became more solid and more integrated into the group. By then, however, we had close and trusted acquaintances, and we had experienced the pleasantness of being accepted by the group, and of being approved of and stroked both by the guru and other disciples. We felt special, and we felt kinship. Importantly, the powerful psychological process of cognitive dissonance was also already at work: We either rationalized away or developed blinders to discrepant or disturbing incidents, behaviors, or values. Eventually we learned of the double morality and deceitfulness of the guru, when we talked personally to women disciples and read their testimonials about their sexual abuse at his hands. After all, his “celibacy” was only self-proclaimed and difficult to authenticate (!).

At the time, I felt I had made a voluntary decision to join this inspiring, energetic, and idealistic group (“world peace through inner peace and outer manifestation”). Now I understand that I was actually recruited by loved ones—family and friends. Yes, I did make the decision to join; but also, with the perspective of Cialdini’s universal-influence processes, I recognize that I was unconsciously recruited through liking and social proof. Most importantly, perhaps, I felt a sense of unity with my wife and her relatives. We shared so much at this point in our young and seeking lives, and we were family.

I found fellowship with kindred, serious seekers at the guru’s ashram. We meditated together and traveled as a group around the world. We worked long hours, producing flyers, T-shirts, books, pamphlets, articles, and posters; and we translated the content into many languages. We arranged meetings with world leaders, celebrities, Nobel prize winners, musicians, athletes, and politicians. Many of us felt that we had achieved spiritual insights and made progress: We lived in a spiritual greenhouse! This was a high point of our young, seeking years.

Many of my best friends today are people I met in this group, who are now former members. We trained and ran marathons together, we sang in choirs together, we meditated for many hours together, we travelled around the world together, we prepared meals and ate together; we manifested together; we guarded (the guru) together at functions and activities; we presented plays; we participated in parades and circuses together. We suffered, celebrated, and made progress together. Perhaps most importantly, we shared common values, beliefs and goals—examples of liking, social proof, and unity.

These three influence processes, along with authority, helped to convince us to accept the genuineness of our leader. We took his Eastern and exotic ways as proof of his authenticity. Holy men of every religion, we assumed, were sincere and morally motivated to help a struggling and suffering humanity. Our guru’s dhotis, robes, and incense, chanting and singing in Bengali, his hundreds of books, his meetings with world leaders, his acceptance by famous celebrities, were all trademarks of his spiritual authority.

In the beginning phase, I was showered with attention, approval, and encouragement—a treatment often referred to as love bombing. I was even given presents, money, and public recognition. I was treated as special. However, after this introduction, gradually the guru seemed to be stricter; my fellow recruits and I were more confined. We were now selfless servers, not to expect praise or reward for our actions. For their own spiritual good, we were even expected to report to the leader when others became weak or misled, or they misbehaved. This protocol, we were to understand, was how real leaders acted, for the spiritual and psychological advancement of their students. There were many examples of this kind of reasoning (and rationalizing) in books, in videos, and even in Beatles’ songs.

No One Ever [Knowingly] Joins a Cult

What I have described about my own experience is quite typical of high-demand, cultic or extremist groups. It is important to understand that people don’t join groups because they are cults, or knowing that they are cults; they don’t choose a leader because, or knowing that, the individual is deceptive and narcissistic. Rather, their engagement in a cultic group is usually an unconscious, gradual process in which they are deceptively persuaded by largely automatic, unconscious, and evolutionarily hard-wired social-influence processes, only recognizing later, if they recognize it at all, that the group and leader are cultic. It is sometimes suggested that members join these groups because the groups are so different and exotic; yet it is often the case that members become engaged or stay in cultic groups in spite of, not because of, the groups’ exotic or unusual aspects. Those are not the main attraction: Rather, the appeal is the close fellowship, the sense of unity, and the security of finding what they believe are existential answers.

Michael Langone’s description of why people join provides a succinct summary of the flow of cultic and extremist involvement:

…Believing & Belonging (B&B) lead us to seek and hold on to group affiliations. Groups vary on numerous dimensions. Individuals vary on numerous dimensions. Groups that lean toward treating persons as subjects to be respected rather than objects to be manipulated may provide positive experiences for individuals seeking B&B. Groups that lean toward treating persons as objects to be manipulated rather than subjects to be respected may exercise unethical levels of control over those who are vulnerable, for whatever reasons, to the group’s sales pitches and manipulations. Whether a person encounters an unethical group that can take advantage of the person’s vulnerabilities may depend upon chance factors, on being in the wrong place at the wrong time… (personal correspondence, August 3, 2019)

Conclusion: Can We Help?

We can alleviate some suffering by helping individuals avoid joining in the first place, by helping individuals recognize their own weaknesses and innate needs. This might include, for example, referring to the classical myth of Narcissus and the mountain nymph Echo, whose interactions are now interpreted as a metaphor for narcissists and codependents (see Shaw, 2014). Also, attachment theory can be discussed, describing how early caregiver–infant interaction can shape adult personality, with anxious or ambivalent, rather than secure, infant attachment to the caregiver leaving lasting vulnerabilities. We can also conduct more research and better disseminate information about the misleading and deceptive nature of narcissistic and charismatic leaders and their cultic groups. And we can make critical thinking, internet awareness, and social-media skills required elements of educational curricula.

We can provide knowledgeable, sympathetic counselling for those exiting cultic and extremist groups. Cultic-group survivors are often suffering and confused. Members have been, in effect, gradually brainwashed for a long time. They have been subject to intense social-influence processes, and have assimilated and internalized group propaganda and misleading information. They have built up an alternate view of reality relative to normally accepted social views. They may have, to varying degrees, problems reentering normal society and reconnecting with friends and family.

Unfortunately, however, increased information and education have not been successful in eliminating cults. Their innate appeal is too deeply embedded in us. In times of crisis, stress, and transition, individuals may wander (or be lured) into a seductive spider’s web spun by narcissistic leaders, with appealing belief systems and supportive groups they otherwise might not have stumbled into.

In the early 1980s, many of us believed we could eventually eliminate destructive cultic groups through educational programs and widespread media coverage. At present, however, rather than believing we can wipe out cultic groups, we realize we need to greatly extend our research efforts, including new theory and research into our hard-wired, primate, evolutionary impulses and needs (evolutionary psychology). We must reexamine our widespread and often desperate longing for group fellowship and the fundamental role of a philosophical, spiritual, even mystical experience of our lives and our place in the universe. We are now beginning to focus on educating ourselves and our students about why, under the right conditions, we may be predisposed to join these groups. We must also describe how the social-influence principles, through cultic dynamics and coercive control, function to keep us from leaving once we are inside.

As evolved human beings, with all the benefits of being sentient, psychosocial, and existentially aware, it may not be easy or attractive to look back at our evolutionary psychological past. Nevertheless, we must undertake this reflection. Although it is unlikely that we will eliminate cults, it is within our power to greatly reduce their influence and harm.


References

Berger, P., & Luckmann, T. (1991). The social construction of reality: A treatise on the sociology of knowledge. (First published by Doubleday in 1966.) New York, NY: Anchor Books.

Bradshaw, R. (2013a). Attachment theory: Freud, Bowlby, Ainsworth, & Main. Lecture ESC 501, Lehman College, CUNY.

Bradshaw, R. (2013b). Comparative chart of stage theories of development: Freud, Erikson, Piaget, & Kohlberg. Lecture ESC 301, Lehman College, CUNY.

Burks, R. (2019). Toward a neuroscience of thought reform. Paper presented at the ICSA Annual Conference, Manchester UK, July 6. [An excellent and understandable overview.]

Cialdini, R. (1984/1993). Influence: The psychology of persuasion. New York, NY: Quill/William Morrow.

Cialdini, R. (2016). Pre-suasion: A revolutionary way to influence and persuade. New York, NY: Simon & Schuster.

Dubrow-Marshall, R., van de Donk, M., & Haanstra, W. (2019). Lessons from adjacent fields: Cults and radical extremist groups. ICSA Today, Vol 10, No. 1, pp. 2–9.

Erikson, E. (1963). Childhood and society. (Republished in 1964, 1983; reissued in 1993.) New York, NY: W. W. Norton.

Erikson, E. (1980). Identity and the life cycle. (Originally published in 1959.) New York, NY: W. W. Norton.

Erikson, E., & Erikson, J. (1997). The life cycle completed. (First published in 1982.) [About the ninth stage of psychosocial development and “gerotranscendence.”] New York, NY: W. W. Norton.

Fowler, J. (1995). Stages of faith: The psychology of development and the quest for meaning. (First published in 1981.) New York, NY: HarperCollins.

Frankl, V. (2006). Livet måste ha mening (Man’s search for meaning). Stockholm, Sweden: Natur & Kultur.

Freud, S. (1930/2018). Civilization and its discontents. (Originally published in German as Das Unbehagen in der Kultur; authorized English translation by J. Riviere. Paperback, 2010, Eastford, CT: Martino Publishing; hardcover, 2018, Pittsburgh, PA: General Press.) Vienna, Austria: Internationaler Psychoanalytischer Verlag.

Goldberg, L., Goldberg, W., Henry, R., & Langone, M. (Eds.). (2017). Cult recovery: A clinician’s guide to working with former members and families. Bonita Springs, FL: ICSA.

Goleman, D. (2005). Vital lies, simple truths: The psychology of self-deception. (Originally published in 1985.) New York, NY: Bantam.

Gonzalez, A., & Willems, P. (2013). Theories in educational psychology: A concise guide to meaning and practice. Lanham, MD: R&L Education.

Hari, J. (2018). Lost connections: Uncovering the real causes of depression and the unexpected solutions. New York, NY: Bloomsbury USA.

Junger, S. (2016). Tribe: On homecoming and belonging. New York, NY: Twelve/Hachette Book Group.

Kluger, J. (2014). The narcissist next door. New York, NY: Riverhead Books/Penguin.

Lalich, J. (2004). Bounded choice: True believers and charismatic cults. Berkeley, CA: University of California Press.

Langone, M. (Ed.). (1993). Recovery from cults: Help for victims of psychological and spiritual abuse. New York, NY: Norton.

Langone, M. (2016). Origins and prevention of abuse. ICSA Today, Vol. 7, No. 3, 11–13. Available online at https://www.icsahome.com/articles/origins-and-prevention-of-abuse-doc

Lifton, R. J. (1995). Foreword. In M. Singer & J. Lalich, Cults in our midst (pp. xi–xiii). San Francisco, CA: Jossey-Bass.

Lindholm, C. (1990). Charisma. Oxford, UK: Blackwell.

Lindholm, C. (2002). Culture, charisma and consciousness: The case of the Rajneeshee. Ethos, 30(4), 357–375.

Maslow, A. (1987). Motivation and personality (1st Ed., 1954; 2nd Ed., 1970). London, UK: Longman.

Oakes, L. (1997). Prophetic charisma: The psychology of revolutionary religious personalities. Syracuse, NY: Syracuse University Press.

Riesman, D., Glazer, N., & Denney, R. (1950). The lonely crowd: A study of the changing American character. New Haven, CT: Yale University Press.

Shaw, D. (2014). Traumatic narcissism: Relational systems of subjugation. New York, NY: Routledge.

Sjöstrand, I. (1973). Samhem: Små nära gemenskaper för alla: En bok om mänsklig miljö i mänsklig skala (Togetherness centers: Small local fellowships for everyone: A book about human environments on a human scale). [A book about postmodern social isolation; the book and theory that brought me to Sweden, on a Sheldon Traveling Fellowship, Harvard University.]

Snowman, J., & McCown, R. (2012). Psychology applied to teaching (13th ed.). Stamford, CT: Cengage Learning.

Tobias, M., & Lalich, J. (1994). Captive hearts, captive minds. Alameda, CA: Hunter House.

von Hippel, W. (2018). The social leap: The new evolutionary science of who we are, where we come from and what makes us happy. New York, NY: Harper Wave.

Weber, M. (1968). On charisma and institution building (S. N. Eisenstadt, Ed.). Chicago, IL: University of Chicago Press.

Wilson, E. (2012). The social conquest of earth. [Title based on Paul Gauguin’s famous mural, “Where do we come from? What are we? Where are we going?”] New York, NY: Liveright Publishing; London, UK: W. W. Norton.

Wright, R. (1994). The moral animal: Why we are the way we are: The new science of evolutionary psychology. New York, NY: Random House.

Zablocki, B. (1980). Alienation and charisma: A study of contemporary American communes. Glencoe, IL: Free Press.

Zablocki, B., & Robbins, T. (2001). Misunderstanding cults: Searching for objectivity in a controversial field. Toronto, Canada: University of Toronto Press.

About the Author

Russell H. Bradshaw, EdD, was Associate Professor at Lehman College, City University of New York (retired September 2015). He has taught psychological and historical foundations of education and directed the MA program in Teaching Social Studies: 7–12. Dr. Bradshaw’s master’s thesis and doctoral dissertation described alternative-living and child-care arrangements in Sweden (Samhem and Kollektivhus). During his undergraduate studies he received a stipend to live in Samoa; he wrote his honors thesis on religion’s effect on cultural stability and change in Western Samoan villages. Dr. Bradshaw’s continuing interest in alternative living and child-care solutions led him to an intensive experience of a Hindu-based religious cult in New York City. Dr. Bradshaw has received fellowships and grants from Wesleyan, Harvard, and Uppsala (Sweden) universities, and from the City University of New York. He and his wife Gunilla currently live in Norrtälje, Sweden, several months a year, where they are continuing their work for ICSA’s New York Educational Outreach Committee.