Mirror, mirror

The discovery of mirror systems has helped us understand the planning and imagining of actions

When we move to strike a tennis ball, our actions are guided by the brain’s motor control systems.

Recently, it has become clear that these same systems are also active when we imagine making an action in our head (reliving a perfect cross-court volley, for example). And, remarkably, they also light up when we watch someone performing an action.

The key difference is that the levels of activity are lower than when we actually perform the action, so muscle contraction is not triggered. Because these systems reflect the ‘real’ activity, they are known as mirror systems.

These mirror systems are extraordinarily specific: they fire when someone sees a person making an arm movement, for example, but not when they see a robotic arm make the same movement. It is possible that this activity allows us to put ourselves in others’ positions, experiencing (though to a lesser degree) what they are experiencing. Mirror systems may therefore help us to infer the intentions of others.

Lead image:

Ballerinas dancing at a barre in front of a mirror.

Quinn Dombrowski/Flickr CC BY

About this resource

This resource was first published in ‘Thinking’ in September 2006 and reviewed and updated in August 2014.

Topics:
Cell biology, Neuroscience, Physiology
Issue:
Thinking
Education levels:
16–19, Continuing professional development

Mind the gap

Humans have an uncanny ability to put themselves in the position of others

Young children can be horribly selfish. They want things for themselves and are not interested in sharing. Partly this is because they lack the ability to appreciate what other individuals are thinking and feeling. This develops gradually during childhood.

Being able to understand the feelings and motivations of others, being able to put yourself in other people’s shoes, is known as theory of mind. It is the basis of what we know as empathy – appreciating what others are feeling and how our own behaviour may impact on them.

It is likely that people’s capacity for empathy varies. We can probably all identify people who we feel are particularly empathic (or those who seem to lack empathy).

In some conditions theory of mind seems to be very badly affected. A common feature of autism, for example, is difficulty in appreciating what others are thinking and feeling, or in appreciating the impact of one’s own actions on others. As a result, people with autism often find social situations difficult and may have to be taught explicitly how to behave where most of us rely on unconscious social skills.

Lead image:

A ‘mind the gap’ sign on a Tube platform as a train rushes past.

Leo Marco/Flickr CC BY NC ND

About this resource

This resource was first published in ‘Thinking’ in September 2006 and reviewed and updated in August 2014.

Topics:
Neuroscience, Physiology
Issue:
Thinking
Education levels:
16–19, Continuing professional development

Body talk

When people scratch their nose, does it mean they are lying?

Popular psychology is full of accounts of ‘body language’. If I cross my arms, I’m being defensive; if I pull my ear, I’m likely to be lying; if I avoid your gaze, I’ve got something to hide.

The basis of body language lies in animal communication. Without language, animals need other ways to convey information to one another – and they use parts of their bodies in imaginative ways to do so. Faces are important here too, but so are other signals, such as gestures of submission. Mating relies heavily on signals of intent, receptivity or rejection, often leading to elaborate rituals.

The popularity of studying body language in humans owes much to Desmond Morris, who argued that findings from animal communication could be extrapolated to humans. The scientific value of this approach has been questioned by many neuroscientists.

The neuroscience of body language is much less well studied than our responses to faces. But the brain does appear to recognise particular body postures, and this recognition occurs early in the processing of a scene (as is also true of face recognition). There may be brain modules specifically for body perception.

The body language responses studied to date seem to be closely linked to the brain’s emotional responses. Seeing someone showing signs of distress fires up our amygdala, cueing the behaviour needed to escape a threatening stimulus – such as running away very fast.

We also seem to be particularly sensitive to bodies in motion – though, as artists through the centuries have shown, our emotional responses to still images of bodies in peril are powerful and quick to appear.

Lead image:

A person crossing their arms.

Winjohn/FreeImages CC BY NC ND

About this resource

This resource was first published in ‘Thinking’ in September 2006 and reviewed and updated in August 2014.

Topics:
Neuroscience, Psychology
Issue:
Thinking
Education levels:
16–19, Continuing professional development

Wired

How much of our behaviour is fixed, embedded in the neural networks of our brain? Is it ‘hard-wired’ – set for life – or more flexibly arranged?

Behaviour is complex. No single gene encodes it, nor does any single event or experience determine it. Although we can control some aspects of it through our own willpower or volition, in the end our behaviour arises from an intricate interplay between our environment, our genes and us.

Research suggests that many behavioural patterns and traits – including alcoholism, criminality and sexual orientation – are influenced to some degree by our genes. Genes even shape behaviours we are unaware of, such as hand clasping (people tend to intertwine clasped hands with either the right or the left side uppermost).

In the case of alcoholism, genes may code for certain receptors that bind chemical messengers in the brain, or for enzymes involved in breaking down alcohol. However, our social and cultural upbringing also affects our alcohol consumption – our parents may be teetotal, for example.

There is bound to be interplay between these factors. We may be born with a genetic predisposition to alcoholism but be lucky enough in our family and social life that we never get tipped over the edge into dependency.

Also, the brain itself is not set in stone. It develops through childhood, goes through massive changes at adolescence, and reaches maturity in our early 20s. Even then the brain retains significant plasticity – it learns and adapts. So if we practise tennis we get better at it.

So exactly how much of our behaviour can be modified, and how much is inborn or fixed by our upbringing? It is hard to say. With humans such a debate is risky, as the notion of ‘hard-wiring’ can be used to support racist or sexist views or other forms of bigotry. On the other hand, in Steven Pinker’s famous phrase, we are clearly not ‘blank slates’ either.

Lead image:

A computer circuit board with many large wires.

Speric/Flickr CC BY NC

About this resource

This resource was first published in ‘Thinking’ in September 2006 and reviewed and updated in August 2014.

Topics:
Neuroscience, Psychology
Issue:
Thinking
Education levels:
16–19, Continuing professional development

A brief history of mental illness

Pre-history to the modern day: the progression of understanding mental illness

Pre-history (eg Stone Age)
Trepanning (drilling holes in the skull) is used to get rid of evil spirits.

Approx. 400 BCE
Hippocrates treats mental illness as a problem of the body rather than a punishment sent by the gods.

1377 CE
The Bethlem Royal Hospital in London, later known as ‘Bedlam’, begins to house the mentally ill.

1600s
Chains, shackles and imprisonment are widely used to restrain and control the mentally ill.

1850s
Large public asylums are built to house the mentally ill.

1870s
Normal ovaries are removed to treat ‘mental madness’ and ‘hysterical vomiting’ in some women.

1879
Wilhelm Wundt opens the first experimental psychology lab at the University of Leipzig in Germany.

Early 1900s
Psychoanalysis is developed by Sigmund Freud, Carl Jung and others.

1911
Swiss psychiatrist Eugen Bleuler first uses the term ‘schizophrenia’.

WWI
Patients with shell shock are counselled – the precursor of modern treatment for post-traumatic stress disorder.

1936
Lobotomy (surgery severing connections between the frontal lobes and the rest of the brain).

1938
Electroconvulsive therapy (ECT) for schizophrenia and manic depression (now called bipolar disorder).

1949
Lithium for psychosis and manic depression. 

1952
Chlorpromazine (marketed as Thorazine), the first anti-psychotic drug, for psychosis.

Mid-1950s
Behaviour therapy for phobias.

1960–63
Librium and Valium for nonpsychotic anxiety.

1970s–1980s
A move away from asylums, mental institutions and hospitals to community-based healthcare.

1980s
Selective serotonin reuptake inhibitors (SSRIs) for depression.

1990s
New generation of anti-psychotic drugs for schizophrenia.

2000s
Mindfulness meditation becomes an increasingly important tool in mainstream psychiatric and psychological care.

2012
The link between hearing voices (previously thought to be a symptom of brain disease) and childhood trauma is reported to be stronger than the link between smoking and lung cancer.

2014
LPM570065, a triple reuptake inhibitor (TRI), is found to rapidly reduce depression-like behaviour in rats. This class of drugs is being investigated as a possible new treatment for depression in people.

Lead image:

An illustrated scene at Bethlem Royal Hospital in London – commonly known as ‘Bedlam’ – from Hogarth’s series of paintings ‘A Rake’s Progress’ (1732–34).

Wellcome Library, London CC BY

About this resource

This resource was first published in ‘Thinking’ in September 2006 and reviewed and updated in August 2014.

Topics:
Neuroscience, Psychology, Medicine, History, Health, infection and disease
Issue:
Thinking
Education levels:
16–19, Continuing professional development

Morality tales

Neuroscience is helping us to understand the biological basis of human morality

At a very general level, morality concerns the nature of good and bad and the nature of right and wrong.

What kind of actions are right, and what are wrong? What kind of life is a good life, and what kind is a bad life? What makes a good person good? If we had true answers to these questions, we would have a pretty good understanding of morality.

Unfortunately, we often disagree with one another about what actions are right or wrong, and what kinds of lives are good or bad. We also often disagree about whether morality itself is absolute, universal and unchanging, or whether morality changes depending on what is socially acceptable.

Many ancient Greeks saw slavery as morally acceptable; most people today strongly disagree. Is this a disagreement about morality itself, or was morality simply different for the ancient Greeks? Is it possible for societies to make moral progress?

One way to try to understand the nature of moral disagreement is to study moral judgement – the mental processes that lead to our decisions about what is right or wrong and about what is good or bad. Although making a moral judgement is often an easy thing to do, understanding what happens in the brain during moral judgement is very difficult.

It is possible that a better understanding of the neural and psychological basis of moral judgement will help us to understand better the nature of morality: for example, whether it is universal and absolute, whether it is tied to what is socially acceptable, and why our moral judgements often generate disagreement.

Recently cognitive scientists have begun studying moral judgement in earnest, and the results are very interesting. For example, we know that our moral judgements differ according to our sex, religion and culture. They also change with age. Very young children can’t tell right from wrong. Toddlers base their morality around themselves. With age, morality shifts towards peer-group values, and eventually moves towards consideration of the wider social group.

These results pose difficult questions for philosophers and theologians. Why do we make the moral judgements we do? And should we believe that our moral judgements track the truth about morality? Do our brains come with some sense of morality already built in, or is morality something we have to learn? These questions are the subject of much debate between philosophers, theologians and cognitive scientists.

Lead image:

‘Hear no evil, see no evil, speak no evil.’

Thomas Hawk/Flickr CC BY NC

About this resource

This resource was first published in ‘Thinking’ in September 2006 and reviewed and updated in August 2014.

Topics:
Neuroscience, Psychology, History
Issue:
Thinking
Education levels:
16–19, Continuing professional development

Animal models

Is it really possible to use experimental animals to study human behaviour and decision making?

In many areas of biomedical research, animals such as mice and rats are used to find out about underlying physiological mechanisms. Such experiments cannot be done on people, but the results of experiments on rodents are of relevance to human biology. But can the same be true of complex behaviours, given that the human brain is so different from those of other animals?

While extrapolation from animals to humans should always be made with caution, it is turning out to be possible to study behaviour in animals and to make connections between behaviour, brains, cell biology and genetics.

Different strains of mice, for example, vary in certain behavioural traits. Some strains are particularly neurotic, and possible genetic contributions to this trait have been identified. This does not reveal a ‘gene for neuroticism’, but it does suggest a possible biological mechanism influencing neuroticism in mice and, by extension, in humans.

A similar approach is to use genetic techniques to ‘knock out’ particular mouse genes and see what effect this has on behaviour. A good example is the 2005 knockout of the stathmin gene, which transformed normally timid mice into daredevils. Stathmin is active in the amygdala and seems to be crucial to the animal’s fear responses; without it, mice show much less fear. For humans, the end result could be a better understanding of anxiety and (eventually) new treatments for anxiety disorders or post-traumatic stress disorder.

Studies can also provide clues to the physiological changes associated with behaviour. Mice that are repeatedly bullied, for example, become very reclusive and are reluctant to mix with other mice – a response known as ‘social defeat’. This response can be overcome by long-term treatment with antidepressants. It has been found to depend on a molecule known as brain-derived neurotrophic factor; without this protein, the mice do not become reclusive when bullied. The distinctive social defeat response resembles human anxiety and depression, and studies in mice should help us understand the biochemical nature of these psychological conditions.

Fly sexuality

Even simple animals can provide clues to the mechanisms underlying behaviour. Much research is carried out on the fruit fly, and some studies have identified factors underpinning its sexual orientation.

Fruit fly mating is very stereotyped. Males court females by walking behind them and performing a kind of dance. This behaviour depends on a gene known as fruitless, which comes in male and female forms. If a female fly is genetically engineered to make the male version, she adopts the male courting behaviour, though in other respects she is a typical female. The difference in behaviour has also been linked to very specific changes in neural pathways in the fly brain.

This does not, of course, mean that there is a single human gene that controls human sexual orientation. Human sexuality is far more complex than fruit fly courting. But it does illustrate how gene changes can affect the brain and hence behaviour in an animal with complex behaviours, and suggests new ways to study such behaviours in other animals.

Lead image:

The fruit fly Drosophila melanogaster – wild type.

Audio-visual, London School of Hygiene & Tropical Medicine/Wellcome Images CC BY NC ND

About this resource

This resource was first published in ‘Thinking’ in September 2006 and reviewed and updated in August 2014.

Topics:
Genetics and genomics, Neuroscience, Psychology
Issue:
Thinking
Education levels:
16–19, Continuing professional development

Lobotomy

Lobotomy has a bad name, but it won its inventor a Nobel Prize. Does surgical intervention in the brain have a place in the medical armoury?

Brain surgery has a long history: ancient human skulls have been found with holes drilled in them (a practice known as trepanning or trephining). In the late 19th century, medics were experimenting with brain surgery to treat conditions such as what we would now call schizophrenia. But it was a Portuguese doctor, António Egas Moniz, who really pushed the field forward with his development of the leucotomy in the 1930s.

At the time there was very little that could be done for patients with schizophrenia, depression or other serious mental health problems. Moniz’s approach was to inject alcohol into the nerve fibres connecting the frontal lobes to the rest of the brain, destroying them and so reducing unwanted brain activity. Moniz won the Nobel Prize in Physiology or Medicine in 1949 for this work.

The technique did work in some cases, though not always, and Moniz himself argued it should be used only as a last resort. (Despite the Nobel Prize, Moniz suffered for his science – he was shot by a former patient and left paraplegic.) Lobotomy might have faded away had it not been for some enthusiastic supporters – particularly Walter Freeman in the USA.

Freeman promoted the ‘ice pick’ lobotomy: rather than using the special instruments Moniz favoured, Freeman and his collaborators inserted a standard ice pick through the eye socket and jiggled it about, under local anaesthetic. The whole operation took only a few minutes. Despite many medical misgivings, the lobotomy became wildly popular in the USA and elsewhere. It was also undoubtedly misused – for political ends, or by families embarrassed by mentally ill relatives.

By the 1950s the lobotomy was in decline: its success rate was low, and its side-effects were common and severe, often leaving patients in a zombie-like state. The final nail in its coffin was the development of drugs, beginning with Thorazine, to treat mental disorders.

Brain surgery today

While lobotomies today are rare, surgical intervention in the brain is still carried out for certain disorders, in a much more controlled way. As well as removing brain tumours, surgery can be highly successful in treating epilepsy, preventing unwanted electrical activity from spreading across the brain. Electroconvulsive therapy (ECT) is occasionally used for severe depression when a patient has not responded to other therapies.

Deep-brain stimulation can be an effective treatment for Parkinson’s disease: electrodes inserted deep into the brain deliver electrical pulses that inhibit the nerve signals causing the uncontrollable shaking (tremor) seen in Parkinson’s.

Overall, though, surgeons are much less likely to dig into the brain these days. Partly this is because drugs can often do a similar job without the need for a scalpel, by altering brain chemistry. Even so, there are concerns that such drugs have excessive side-effects or are used too much – the rising use of Ritalin for attention deficit hyperactivity disorder (ADHD) being a case in point.

There is also the potential for drugs to be used to treat socially unacceptable behaviour. There have been calls, for example, for sex offenders to be treated with agents that curb their sexual behaviour. One lesson from the leucotomy episode is that we must be wary of letting what initially seems a prudent medical technique become misused and abused.

Lead image:

Set of Watts–Freeman lobotomy instruments, c.1950.

Wellcome Library, London CC BY

About this resource

This resource was first published in ‘Thinking’ in September 2006 and reviewed and updated in August 2014.

Topics:
Neuroscience, Psychology, Medicine, Health, infection and disease
Issue:
Thinking
Education levels:
16–19, Continuing professional development

Responsible adults?

If a lot of our behaviour is outside our conscious control (or feels as if it is), can we always be held responsible for our actions?

Our legal system (like many other aspects of society) is based on the idea that we are ‘free agents’, able to decide for ourselves how we behave.

But how much freedom do we actually have to control our behaviour? Some brain responses are not under conscious control. Sometimes, even when we think we are making a conscious decision, our brain has already made an unconscious one; at other times, our conscious and unconscious minds wrestle for control of our actions.

Our genetic inheritance will affect our brain and behaviour, as will the environment we experience in the womb, and the way we are brought up. By the time we are adults, our scope to behave in any way we choose is significantly reduced.

On the other hand, belief in genetic or neuroscientific determinism – the idea that we are ‘born’ or ‘hard-wired’ to behave in a particular way – can become a self-fulfilling prophecy. The prefrontal cortex, the ‘thinking brain’, still has plenty of scope to shape our actions.

Legally, courts are more lenient if a defendant can prove ‘diminished responsibility’. Sentencing will also depend to some extent on an assessment of a defendant’s mental health. So far, there has been little evidence that judges are willing to consider biological susceptibilities as a justifiable defence. As we discover more about the links between the brain and behaviour, it is likely that this will become a more common issue.

You are the judge

Read through the following two case studies and decide for yourself where you stand. You might also want to debate and discuss these topics in the classroom.

Case study 1

Defendant X

  • Impulsive behaviour runs in his family.
  • He has a variant in a neurotransmitter receptor gene that may influence behaviour.
  • He hit a bouncer at a nightclub, causing actual bodily harm.

Do any of the factors influence whether he is found guilty or not?

Should any influence the punishment if he is found guilty?

Should any biological factor ever be considered?

Case study 2

Defendant Y

  • She was brought up on a deprived inner-city estate.
  • She was physically abused as a child.
  • She stole a mobile phone to give to her boyfriend.

Do any of the factors influence whether she is found guilty or not?

Should any influence the punishment if she is found guilty?

Lead image:

The view from a nightclub balcony.

Bruce Turner/Flickr CC BY

About this resource

This resource was first published in ‘Thinking’ in September 2006 and reviewed and updated in August 2014.

Topics:
Neuroscience, Psychology
Issue:
Thinking
Education levels:
16–19, Continuing professional development

Better brains?

How should we react to the potential to enhance our brain’s abilities?

New drugs are appearing that act on the brain. Initially developed to tackle medical problems, they also have the potential to be used by the healthy to enhance brain function.

Good examples are the ‘cognitive enhancers’ developed to protect against memory loss in Alzheimer’s disease; these drugs can also boost normal memory.

Some people fear we are heading towards becoming ‘super-humans’, with everyone feeling pressured to enhance themselves or their children for fear of falling behind in a competitive world. The gaps between the haves and have-nots could widen. And what does it all mean for our view of what it is to be human? We all have our flaws – are we chasing an impossible dream of perfection?

On the other hand, the whole point of learning is to expand the mind, and we think nothing of providing extra schooling or educational activities, or pumping children full of vitamins, to boost their IQ. And we use drugs like caffeine all the time to boost mental performance. What is so different about pharmacological approaches, if they are tried and tested?

You are the parent

Read through the following two case studies and decide for yourself where you stand. You might also want to debate and discuss these topics in the classroom.

Case study 1

Your son wants a ‘cognitive detector’ chip implanted in his temple so he can interact better with his immersive virtual reality computer game.

  • Do you let him have the implant?
  • What if it aided learning as well as gaming?
  • Is there any reason to limit the use of such technologies?

Case study 2

You find packets of modafinil, a wakefulness-promoting drug used as a cognitive enhancer, in your daughter’s bedroom. She says she needs them for her exams – everyone else is using them, and she’ll be at a disadvantage without them.

  • Would you allow her to take them?
  • Should people be free to use enhancing drugs or technologies?
  • What limits, if any, should be placed on their use?

Lead image:

A calculator on a maths textbook.

Steven S./Flickr CC BY

About this resource

This resource was first published in ‘Thinking’ in September 2006 and reviewed and updated in August 2014.

Topics:
Neuroscience, Medicine
Issue:
Thinking
Education levels:
16–19, Continuing professional development
