[Vision2020] MIT Neuroscientists: Moral judgments can be altered ... by magnets

Ted Moffett starbliss at gmail.com
Sun Apr 11 21:08:02 PDT 2010


On 4/11/10, Saundra Lund <v2020 at ssl.fastmail.fm> wrote:
>
>    If I lived in Whitman County, I’d be demanding answers about why those
> with mental illness are held to a higher standard than criminals -- with
> genuine free choice -- without bona fide medical conditions.
>

--------------

For those interested only in what the subject heading references, skip my
musings and read about the MIT research on moral judgments at the bottom...
--------------
Can you define "genuine free choice?"  I can't.

Of course there are mental illnesses that impair logic and perception.  The
brain can malfunction just as any other organ of the body can, and no moral
blame is justified for this malfunction.  But are the actions of a "normal"
person, when all variables are accounted for, any less a result of factors
outside their control than the actions of a psychotic?  The normally
functioning brain runs on a neurochemistry that follows the laws of science,
just as the mentally ill brain does; both involve deterministic variables, as
does the neurochemistry of choice itself.

Are most normal people "normal" because they made a conscious choice to be
so?  One child decides to lead a life of crime, while another chooses a
law-abiding and responsible life.  Something like this example is what
assigning moral choice would require in many cases, because science indicates
that the brain is wired at a young age for conformity to social norms, or the
lack thereof.  Once the brain has been shaped by childhood experiences (up to
about age 18), that wiring largely determines adult personality through
deterministic variables in brain function.  Yet children are generally not
assumed to have full moral decision-making capabilities, so how can they make
a fully informed decision about what sort of person they want to be as
adults?

I'm wading into very deep philosophical and scientific waters, no doubt over
my head; but I am amazed that people so often address "free will" as though
it were a well-understood and agreed-upon concept.

The concept has been tackled by the greatest minds in philosophy,
psychology, modern neuroscience, and religion, and there is no consistent
agreement.  If "free will" were a principle in physics, it would be regarded
as highly speculative, rife with disagreement and discord.  It appears, based
on science, that the person who is kind, loving, responsible, hard-working,
and so forth, full of virtue, has no more control over these qualities
through so-called "free choice" than does the person who is cruel, hateful,
irresponsible, lazy, and so forth.  The virtuous person would find it very
difficult to "choose" to be otherwise, unless compelled by environmental
forces, just as criminals are often driven to crime over and over, even when
wishing deeply to reform.  Genetics and environment, mostly influences during
childhood and adolescence, determine who will be "good" or "bad."  Moral
decision making is genetically and environmentally conditioned.  I'm writing
in oversimplified terms, given that nearly everyone engages in a mixture of
good and bad behavior, however these are defined.

Of course there is a social need to blame and seek retribution against those
who commit crimes.  This no doubt serves society's need for mechanisms of
social control and conformity, with guilt and revenge appearing almost
hard-wired into most people's brains.  Ascribing responsibility for actions
based on the concept of "free choice" appears to act as a psychological
mechanism to enforce virtuous acts, but that is social conditioning
controlling behavior: yet another deterministic factor, grounded in brain
chemistry, that induces most people to conform to the expectations of their
culture.  "Free choice" in this analysis is a socially useful illusion, but
not one based on science.

Of course, regardless of whether blame or punishment is assigned for crimes
based on a concept of free will, some sort of incarceration for criminals is
required to protect society.

Anyway, given our currently limited understanding of the human brain, we are
still learning why human beings make various kinds of moral decisions, a very
complex subject.
-------------------------------
http://web.mit.edu/newsoffice/2010/moral-control-0330.html

Moral judgments can be altered ... by magnets

By disrupting brain activity in a particular region, neuroscientists can
sway people’s views of moral situations.
Anne Trafton, MIT News Office

To make moral judgments about other people, we often need to infer their
intentions — an ability known as “theory of mind.” For example, if one
hunter shoots another while on a hunting trip, we need to know what the
shooter was thinking: Was he secretly jealous, or did he mistake his fellow
hunter for an animal?

MIT neuroscientists have now shown they can influence those judgments by
interfering with activity in a specific brain region — a finding that helps
reveal how the brain constructs morality.

Previous studies have shown that a brain region known as the right
temporo-parietal junction (TPJ) is highly active when we think about other
people’s intentions, thoughts and beliefs. In the new study, the researchers
disrupted activity in the right TPJ by inducing a current in the brain using
a magnetic field applied to the scalp. They found that the subjects’ ability
to make moral judgments that require an understanding of other people’s
intentions — for example, a failed murder attempt — was impaired.

The researchers, led by Rebecca Saxe, MIT assistant professor of brain and
cognitive sciences, report their findings
<http://www.pnas.org/content/early/2010/03/11/0914826107.full.pdf+html> in
the *Proceedings of the National Academy of Sciences* the week of March 29.
Funding for the research came from the National Center for Research
Resources, the MIND Institute, the Athinoula A. Martinos Center for
Biomedical Imaging <http://www.nmr.mgh.harvard.edu/martinos/aboutUs/index.php>,
the Simons Foundation <http://sfari.org/> and the David and Lucile Packard
Foundation.

The study offers “striking evidence” that the right TPJ, located at the
brain’s surface above and behind the right ear, is critical for making moral
judgments, says Liane Young, lead author of the paper. It’s also startling,
since under normal circumstances people are very confident and consistent in
these kinds of moral judgments, says Young, a postdoctoral associate in
MIT’s Department of Brain and Cognitive Sciences.

“You think of morality as being a really high-level behavior,” she says. “To
be able to apply (a magnetic field) to a specific brain region and change
people’s moral judgments is really astonishing.”

*Thinking of others*

Saxe first identified the right TPJ’s role in theory of mind a decade ago —
a discovery that was the subject of her MIT PhD thesis in 2003. Since then,
she has used functional magnetic resonance imaging (fMRI) to show that the
right TPJ is active when people are asked to make judgments that require
thinking about other people’s intentions.

In the new study, the researchers wanted to go beyond fMRI experiments to
observe what would happen if they could actually disrupt activity in the
right TPJ. Their success marks a major step forward for the field of moral
neuroscience, says Walter Sinnott-Armstrong, professor of philosophy at Duke
University.

“Recent fMRI studies of moral judgment find fascinating correlations, but
Young et al usher in a new era by moving beyond correlation to causation,”
says Sinnott-Armstrong, who was not involved in this research.

The researchers used a noninvasive technique known as transcranial magnetic
stimulation (TMS) to selectively interfere with brain activity in the right
TPJ. A magnetic field applied to a small area of the skull creates weak
electric currents that impede nearby brain cells’ ability to fire normally,
but the effect is only temporary.

In one experiment, volunteers were exposed to TMS for 25 minutes before
taking a test in which they read a series of scenarios and made moral
judgments of characters’ actions on a scale of one (absolutely forbidden) to
seven (absolutely permissible).

In a second experiment, TMS was applied in 500-millisecond bursts at the
moment when the subject was asked to make a moral judgment. For example,
subjects were asked to judge how permissible it is for a man to let his
girlfriend walk across a bridge he knows to be unsafe, even if she ends up
making it across safely. In such cases, a judgment based solely on the
outcome would hold the perpetrator morally blameless, even though it appears
he intended to do harm.

In both experiments, the researchers found that when the right TPJ was
disrupted, subjects were more likely to judge failed attempts to harm as
morally permissible. Therefore, the researchers believe that TMS interfered
with subjects’ ability to interpret others’ intentions, forcing them to rely
more on outcome information to make their judgments.

“It doesn’t completely reverse people’s moral judgments, it just biases
them,” says Saxe.

When subjects received TMS to a brain region near the right TPJ, their
judgments were nearly identical to those of people who received no TMS at
all.

While understanding other people’s intentions is critical to judging them,
it is just one piece of the puzzle. We also take into account the person’s
desires, previous record and any external constraints, guided by our own
concepts of loyalty, fairness and integrity, says Saxe.

“Our moral judgments are not the result of a single process, even though
they feel like one uniform thing,” she says. “It’s actually a hodgepodge of
competing and conflicting judgments, all of which get jumbled into what we
call moral judgment.”

Saxe’s lab is now studying the role of theory of mind in judging situations
where the attempted harm was not a physical threat. The researchers are also
doing a study on the role of the right TPJ in judgments of people who are
morally lucky or unlucky. For example, a drunk driver who hits and kills a
pedestrian is unlucky, compared to an equally drunk driver who makes it home
safely, but the unlucky homicidal driver tends to be judged more morally
blameworthy.
------------------------------------------
Vision2020 Post: Ted Moffett