Sunday, March 29, 2009

Inviting versus Effecting Change

Here are some thoughts I have had on Steve Hassan’s Strategic Interaction Approach (SIA). This was taken from another listserv I just posted to, so I have edited it somewhat so that I don’t include the other person’s postings. Here is what I wrote:

The chapter on exit counseling in the book Recovery from Cults, written by the Giambalvos, Kevin Garvey, and Michael Langone, is more than 15 years old, but many of the issues it raises are still relevant today. The authors make a distinction between “inviting” and “effecting” change: their model “invites” change, whereas Hassan’s model (in his own words, which they quoted and criticized) “effects” change. It comes down to how much we think the precondition for exit counseling to work (the cult member being ready, able, and willing to critically evaluate the cult) can or should be intentionally influenced by the exit counselor or the family. This also applies to mental health professionals, who might very well end up with an active cult member coming to see them. It wouldn’t happen with a Scientologist, but there are other cults that don’t necessarily forbid therapy. The real question is: can or should we try to change a person’s readiness to hear what we have to say? If so, what methods are ethical to use, and where should we draw the line? These are difficult questions.

Since that chapter was written, as we’re all aware, Steve Hassan has published a second book, Releasing the Bonds, which appears to take this even further and is much more specific about what he believes can be done, often via family members, to influence the person both before and during an intervention (if an intervention is done). The book is all about how family members and counselors can effect change; whether or not he uses that word, that is what is being described. He proposes many different ways of doing this, some more questionable than others. The active cult member being intervened on in this way has no idea, of course, that the family, friends, and others in his or her life (the “team”) are working with a counselor on a very specific plan to get him or her out of the cult, so there is a complete lack of informed consent on the person’s part. What it comes down to is the belief that the team knows what is best for the person better than the cult member does. The rationale is that the cult member has taken on a “cult” personality but that the real person, deep down, really does want out. That is not such an easy rationale to swallow, however: it is presumptuous to assume, and it again amounts to the team claiming to know better than the person what the “real person” wants. Does the fact that the family members are doing this, rather than the counselor directly, resolve the ethical dilemma? I really don’t think so, since it is the counselor who is directing things and who thus bears responsibility for what goes on.

The whole theory of dissociation into separate personalities is questionable to begin with, and the social psychology concept of role identity is the more parsimonious explanation (simpler; see the Occam’s Razor principle that the simplest explanation that fits the facts is usually best). Whether we’re in a cult or not, we all take on different roles in our lives, and we might seem like very different people in those roles. For example, someone who is both a CEO and a father might look like a very different person in the tough, demanding CEO role than he does as a gentle, caring father. Is this dissociation? No, it is a normal function of social role identity. In Hassan’s model (and others as well), however, the cult experience is medicalized, and the person is essentially given a diagnosis of a dissociative “disorder” by virtue of the fact that he or she is in a group considered to be a cult. That kind of diagnosis is difficult enough under normal circumstances, where the person can be directly interviewed, but it is even more problematic in the case of a cult member, who usually is never directly interviewed or examined by the SIA counselor, who works with the team members instead.

The assumption here is that because the person is in a cult, he or she automatically has this so-called “cult” personality, so the structured interview that is considered the “gold standard” for diagnosing this disorder is not done and cannot be done. The dissociative disorder diagnosis is a big generalization and assumption to be making based on a theory and second-hand reports of the person’s behavior. This is especially problematic since both the “gold standard” and the diagnosis itself have been called into question by some highly respected psychologists even when everything is done properly; done Hassan’s way, with his assumptions, it becomes more problematic still. The implicit premise is that because the person is involved in a cult, the person is mentally incompetent to make good decisions for him/herself, and thus the team must decide what is best. That means covert tactics to lead the person into “change and growth,” with the assumption, of course, that positive change and growth lead to leaving the cult. Even though Hassan insists that is not the goal, let’s get real: it is what the family members want, or they would not have paid Hassan all that money for consultation. According to Hassan, the “cult personality” has taken over and is in control, limiting the person’s free will.

Getting back to the SIA approach and the techniques used, these vary from simple things, like having family members talk about other cults or about experiences parallel to the cult, to more controversial tactics, such as using techniques Hassan learned in the Moonies to match the person’s thinking, feeling, doing, or believing personality style (see p. 289; the page does not explicitly mention that this is a Moonie technique, but a footnote leads to p. 355, where he explains that he first learned it in the Moonies). So again we have the questions of whether it works and whether it is ethical. It appears to work for some people, although the Moonies have a very high failure rate, since the overwhelming majority of the people they attempt to recruit turn them down (see Barker’s study; even though there are people who disagree with her, few would dispute that cult recruiters have a high failure rate, especially at that initial stage). Still, one could argue that the team members know the person much better than a stranger recruiting him or her would, and thus might have more chance of success. But that raises the question of whether it is ethical to do the same thing cultists do to influence a person without his or her consent.

Also, when looking at these two models of exit counseling (the Giambalvos’ and their colleagues’ educational model vs. Hassan’s mental health model), we are really asking whether it is a good idea to medicalize the cult issue. Hassan (and some others) operate on the assumption that cult members have a mental disorder induced by the cult. Is that a helpful assumption? That issue, the medicalization of cult issues both during and after cult involvement, is one my co-author and I plan to discuss in the presentation we have planned for the Denver ICSA conference.