In Defense of Susan Greenfield

Baroness Susan Greenfield has been taking a lot of flak these past few days for her comments about Facebook and computers in general:

If the young brain is exposed from the outset to a world of fast action and reaction, of instant new screen images flashing up with the press of a key, such rapid interchange might accustom the brain to operate over such timescales. Perhaps when in the real world such responses are not immediately forthcoming, we will see such behaviours and call them attention-deficit disorder...
I often wonder whether real conversation in real time may eventually give way to these sanitised and easier screen dialogues, in much the same way as killing, skinning and butchering an animal to eat has been replaced by the convenience of packages of meat on the supermarket shelf
She's taken a lot of flak, and she fully deserves it. Her comments were ill-judged and they bring her position as head of the Royal Institution into disrepute. Her speculations about clinical diagnoses such as ADHD and autism were especially dubious.

Greenfield's statements also display the vacuous obsession with "The Brain" so common today - if she'd simply said that spending hours on the internet might plausibly make kids grow up anti-social, that would be fair enough, but she had to bring the brain into it (several times in her various comments). Hence the headlines to the effect that Facebook could change or damage the brain. Well, Facebook does change the brain - as does everything else - because every experience we have has an influence somewhere in the brain. I'm reminded of Vicky Tuck on boys' and girls' brains; Tuck, however, is not a neuroscientist. Greenfield should know better.

But despite all this, Baroness Greenfield does make an important point.
At the moment I think we're sleepwalking into these technologies and assuming that everything will shake down just fine
These are very wise words. As a society, we are in danger of "sleepwalking" into social and cultural changes which we may end up regretting. Profound changes in the way people live rarely happen overnight, and they are rarely presented to us as a choice that we can either accept or reject. Societies just change, over a span of decades, often without anyone noticing what is happening until the change has happened.
One of my favorite books is Bowling Alone by the sociologist Robert D. Putnam. Putnam assembled data from a wide range of sources to support his theory that a profound change took place in America over the years from about 1960 to 1990; namely, that Americans stopped participating in community life. Union membership, church attendance, charitable giving, league bowling, voter turnout, card-playing, and many other such statistics fell markedly over this period, after a high peak in the 1950s. Meanwhile, solitary or small-group activities such as TV watching, spectator sports, and so on, exploded. Over those three decades, Americans lost interest in "the community" as a whole and turned inward to their immediate circle of friends and family. He also makes a convincing case that this is, in many ways, a bad thing.

I doubt that Putnam's thesis is water-tight; for all I know he may have cherry-picked the statistics that support his theory and ignored those that don't. It wouldn't be the first time someone has done that. Yet what's interesting about Bowling Alone is that even if Putnam's theory is only part of the truth, it's hard to deny that there's something in it - and yet it still took a book published in 2000 to bring it to people's attention. Putnam was writing about profound changes that every American will have felt to some degree. Yet these changes went unnoticed, or at least, few noticed that the various individual changes were part of a larger trend.

Putnam proposes various causes for the fragmentation of American community life, ranging from suburbanization to the increasing time pressures of work to that old favorite, "the breakdown of the family". None of these were deliberate choices. Over a few decades, America sleepwalked into a different way of life. This is hard to deny, even if you don't accept everything Putnam says. Baroness Greenfield, clearly, is no Robert Putnam. But her point about the dangers of sleepwalking is a sound one. Sleepwalking happens. It would be a pity if that message were lost in all the nonsense about Facebook and the brain.


A Very Optimistic Genetics Paper

Saturday saw the Guardian on fine form with a classic piece of bad neuro-journalism which made it all the way onto the front page:

Psychologists find gene that helps you look on the bright side of life
Those unfortunate enough to lack the 'brightside gene' are more likely to suffer from mental health problems such as depression
What the research actually found was nothing to do with looking on the bright side of anything, and was nothing to do with depression either. In fact, it suggests that the gene in question doesn't cause mental health problems. So the headlines are a little misleading, then.

The study comes from Elaine Fox and colleagues from the University of Essex.* They took 111 people, presumably students, and got them to do a "dot-probe" task. Performance on this task was related to the genotype of the 5HTTLPR polymorphism, a variant in the gene which encodes the serotonin transporter protein. Serotonin is "the brain's main feelgood chemical" as the Guardian put it... except it isn't, although it does have something to do with mood.

What's a "dot-probe" task? It's a test which has become popular amongst all kinds of psychologists over the past 10 years or so, having first been used in 1986 by Colin MacLeod et al. The task involves pressing a button whenever a "probe" - a little dot - appears on a screen. The goal is to press the button as quickly as possible, as soon as the dot appears.

The twist is that as well as the dots, there are other things on the screen. In the 1986 version of the test these were words, while in this experiment they were colour pictures. Some of the images were pleasant: smiling faces, flowers, and other nice things. Some were unpleasant - scary dogs, bloody injuries, etc. And some were neutral objects, like furniture. Pairs of these pictures appeared on the screen for a short time (half a second) immediately before each dot appeared, one on the left of the screen and one on the right. The key point is that the dot appeared in the same place as one of the pictures.

The task operates under the assumption that if the viewer's attention is grabbed by one of the pictures, they are likely to be faster to respond to seeing the dot when it appears in the same place as that picture, because they will already be focused on that area of the screen. If, for example, people are on average faster to detect the dot when it appears in the same place as the nice pictures as opposed to the horrible ones, this is described as indicating a "positive attentional bias" i.e. an unconscious tendency to pay attention to pleasant pictures.
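To make the arithmetic concrete, here is a minimal sketch of how such an attentional bias score might be computed from raw reaction times. The trial data, field names, and numbers below are invented for illustration; the actual study's scoring procedure will differ in detail.

```python
# Hypothetical sketch of a dot-probe "attentional bias" score.
# All trial data and field names here are made up for illustration.

def bias_score(trials):
    """Positive score = faster responses when the probe replaces the
    pleasant picture, i.e. a 'positive attentional bias'."""
    def mean_rt(condition):
        rts = [t["rt_ms"] for t in trials if t["probe_at"] == condition]
        return sum(rts) / len(rts)
    # Bias = mean RT when the probe replaces the unpleasant picture
    # minus mean RT when it replaces the pleasant one: being slower
    # in the former case suggests attention was already parked on
    # the pleasant image.
    return mean_rt("unpleasant") - mean_rt("pleasant")

# Invented example trials (RTs in milliseconds):
trials = [
    {"probe_at": "pleasant", "rt_ms": 410},
    {"probe_at": "pleasant", "rt_ms": 430},
    {"probe_at": "unpleasant", "rt_ms": 470},
    {"probe_at": "unpleasant", "rt_ms": 450},
]
print(bias_score(trials))  # 40.0 - attention drawn towards the pleasant pictures
```

In this toy example the viewer is, on average, 40 ms faster when the dot replaces a pleasant picture, which would count as a positive attentional bias.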

Unfortunately, now that you know what a dot-probe task is, you can't take part in any psychology experiment which uses one, because once you know how it's supposed to work there's no point in doing it. Sorry. But on the bright side, you now officially know more about psychology than The Economist, whose write-up of this experiment managed to be even worse than the Guardian's. They not only sensationalized the results, but also misunderstood the whole point of the dot-probe task - it's not about "distraction", it's about selective attention-grabbing.

Anyway, that's the task, and the study found that carriers of two "long" variants of the 5HTTLPR gene showed a strong attention bias towards nice pictures and away from nasty ones, while other people showed no biases. Statistically, the result was highly significant, so let's assume it's true. What does it mean? You could take it to mean that carriers of two long variants were more optimistic in that they tend to pay attention to the good stuff. On the other hand you could equally well say they're so squeamish and wussy that they can't bear to look at the bad stuff and have to avert their eyes from it.

And what's this got to do with depression? Well, to cut a very long story short, the gene in question has previously been linked to depression and also to personality traits such as "neuroticism" - being anxious, worried and generally miserable (see this paper). But in this study they found no such association with neuroticism - despite the fact that it was a report of this very association that got everyone interested in the 5HTTLPR variant in the first place, back in 1996! Brilliantly, they spin their negative finding as a good thing -
The fact that our genotype groups were matched on a range of self-report measures, including neuroticism can be seen as a major strength.
Hope springs eternal. Overall, while this paper is a fine contribution to the psychology literature on the dot-probe task (and the results genuinely do seem to be very significant - there's probably something going on here), it's got nothing to do with optimism and little to do with anything that the average newspaper reader cares about. Luckily, we have journalists to make science interesting on the cheap and on the quick - at the cost of accuracy. There's a lot of really interesting, really thought-provoking popular science writing to be done about the dot-probe, and about the 5HTTLPR gene. But none of it has yet made it into the British papers.


*Fox, my PubMed search reveals, also does work on so-called "electromagnetic sensitivity". The upshot of her work is that lots of people sincerely believe that signals from mobile phones and other sources make them feel unwell, but actually, it's all the placebo effect. Now that really is something that everyone should find fascinating - much more so than this study, anyway.

Elaine Fox, Anna Ridgewell and Chris Ashwin (2009). Looking on the bright side: biased attention and the human serotonin transporter gene. Proc. R. Soc. B.

The Ethics of Junk Science

"Women, the weaker sex... at resisting food, researchers find".
This appeared in the Daily Mail a while back. This headline was based on a neuroimaging experiment, which, predictably, didn't prove anything of the kind. Yawn. I've written about this kind of thing before, and no doubt I will do again. But why do I do it? What's the harm in this kind of thing?

A cynic might say that this kind of thing is harmless fun - or at any rate, harmless. No-one really cares about articles like this, and no-one takes them seriously. No-one's going to read this article and start to think that all women are impulsive and gluttonous - at least not unless they were a sexist pig to begin with.

The opposite view is that this article represents a sexist attack on the rights and status of women (in this particular case), and more generally, that this kind of science writing promotes a reductionist view of life in which all of our problems are ultimately biological ones. This has the dangerous implication that our problems either can't be solved, or can only be solved through the application of some kind of pill or potion. Junk science writing of this kind, then, is actively dangerous.

Those are the two extremes. The truth, I assume, lies somewhere in between. No-one is going to read this one single article and become a sexist pig, just like this spectacularly awful article isn't going to make anyone hate hip-hop and no-one is going to listen to Baroness Susan Greenfield and decide they want to ban Facebook. It's just a couple of hundred words in the Daily Mail.

But while eating one packet of cookies won't make you overweight, scoff a packet every day and your waistline will suffer for it. Every crap science article in the newspapers is another portion of nonsense in the nation's diet; over time, it builds up. One simplistic, vaguely sexist popular science headline isn't going to do much harm; but if that's all people read for years on end, it's going to have an effect. And there are loads of them.

There's another point, though. Even if you don't have strong views on the particular issues at hand, you should care about the abysmal standards of science journalism. I'm open to the idea that women are, on average, less able to restrain their hunger than men. It's probably not true, but I don't see that idea in itself as either implausible or inherently sexist. But what I do know is that it's a fairly difficult question. There are arguments on both sides. More generally, the controversy over human sex differences is a vast one, with a huge amount of evidence to consider, and it isn't going to be settled by one neuroimaging study, or even a hundred.

The implicit message of this kind of junk science writing is that all kinds of complex and difficult scientific questions are in fact really simple. If people were more aware of the difficulties inherent in answering even apparently simple questions, they'd be less willing to settle for the easy, simple, entirely wrong answers when it comes to practical political and social issues.

Maybe that's too simplistic. But it wouldn't hurt.

