X-Message-Number: 7048
Date: Fri, 18 Oct 1996 10:06:00 -0700
From: Tim Freeman <>
Subject: Terror Management

In cryomsg 6194 (written May 1996), David Stodolsky
() said: 

   Research has shown that mortality salience increases the tendency
   to conform to culturally acceptable systems of thought. See:
   Solomon, S., Greenberg, J., & Pyszczynski, T. (1991).
   A terror management theory of social behavior:
   The psychological functions of self-esteem and cultural worldviews.
   In M. Zanna (Ed.), Advances in Experimental Social Psychology
   (Vol. 24, pp. 93-159). New York: Academic Press.

I recently acquired a copy of this, and I agree with David Stodolsky
that terror management (TM) explains why so few people are interested
in trying to use technology to postpone or prevent death.

Briefly, this is the hypothesis: there is obvious survival value in
trying to avoid death in the short term.  However, avoiding death in
the long term has historically been impossible, so survival has been
promoted by any psychological means (a "buffer") that prevents the
wasted effort of applying the short-term death-avoidance strategies to
the problem of long-term death avoidance.  In practice, the usual
buffer has two parts:

   1. A firmly-held belief about what makes life valuable.  A belief
      can be more firmly held by a group than by an individual, since
      the group consensus obscures the essential arbitrariness of the
      belief.  This belief can reasonably be called "culture".
   2. Behavior that conforms to the values of the culture.  Conformity
      to the cultural value system is a major component of self-esteem.

When people are reminded of death, they must strengthen the buffer, so
they defend their culture.  Applied to life extension: technological
approaches to it remind people of death, and since most cultures don't
include any useful life extension techniques, few people do anything
intelligent to extend their lives.

This theory makes several predictions that have been borne out by
experiment:

   1. Remind the experimental subjects of death, and remind the control
      subjects of something inane like television.  Ask both groups to
      set the appropriate bail bond for a hypothetical arrested
      prostitute.  Among the subjects whose worldview holds that
      prostitution is bad, those who have been reminded of death set a
      significantly larger bond.  (Municipal court judges were used as
      subjects in one run that replicated the result.  It is scary that
      things like this might influence actual court outcomes.)

   2. Prepare the subjects similarly, but then confront the American
      subjects with either an apparently pro-American interviewer or an
      apparently anti-American interviewer.  The subjects who were
      recently reminded of death like the pro-American interviewer more
      and dislike the anti-American interviewer more.

Some time ago I also read Combatting Cult Mind Control by Steven
Hassan (Park Street Press, 1988).  The author is an ex-Moonie who
helps people escape from mind control cults.  He did not have the
concept of TM, but TM explains why his technique works.  It turns out
that mind control cults typically give the victim a belief system
which the victim adopts as the "culture" to be defended a la TM.
Frequently part of the belief system is an exit phobia, a belief that
leaving the cult will cause some disaster to happen.  TM explains why
this is so effective: disasters remind people of death and make people
defend their culture (that is, the belief system of the cult).  TM
also explains why people with low self-esteem tend to be sucked into
cults more often.  If the existing defense mechanism is broken, a
person is more willing to adopt a new one.

The technique used by Hassan to help people leave cults is
particularly interesting.  He confronts the cult member with someone
who is an ex-member of *some other cult*.  For example, suppose we're
trying to save a Moonie; we go find an ex-EST person to talk to him
about cults.  (The Moonie does not believe he is in a cult.)  The
Moonie's belief isn't directly threatened by conversations about how
EST works as a cult, so TM does not prevent the ex-EST person from
explaining to the Moonie how cults work and, in detail, how EST fits
the profile of a cult.  Giving the Moonie this knowledge as
part of an apparently idle conversation lays the groundwork for a
future intervention where his non-Moonie friends and family try to
persuade him that he is in a cult that does not help him achieve any
reasonable goals.

There are many other details too.  Our goal, of course, would be to
pull people out of the prevailing deathist mind-set.  The difference
between this mind-set and a cult is merely one of size, plus the fact
that the deathist consensus does not have a strong exit phobia (except
that some, but not all, Christians will believe they will burn in Hell
if they believe us).

There are also books out there like "Virus of the Mind" that are
explicitly about meme engineering.

At the moment I am reluctant to participate directly in trying to
engineer the optimal propaganda technique, since my past experiences
of accidentally pushing people into TM mode leave me feeling somewhat
nauseated and disgusted.  Now that I understand why that happened, I
may have a different attitude in a few months.  Engineering the
propaganda technique would definitely be a worthwhile exercise.

I don't read the extropians list at the moment, so please cc me in any
dialogue about this.

Tim Freeman

