
BBC: Something Happened, For Some Reason

According to the BBC, the British recession and spending cuts are making us all depressed.


They found that between 2006 and 2010, prescriptions for SSRI antidepressants rose by 43%. They attribute this to a rise in the rates of depression caused by the financial crisis. OK there are a few caveats, but this is the clear message of an article titled Money woes 'linked to rise in depression'. To get this data they used the Freedom of Information Act.

What they don't do is provide any of the raw data. So we just have to take their word for it. Maybe someone ought to use the Freedom of Information Act to make them tell us? This is important, because while I'll take the BBC's word about the SSRI rise of 43%, they also say that rates of other antidepressants rose - but they don't say which ones, by how much, or anything else. They don't say how many fell, or stayed flat.

Given which it's impossible to know what to make of this. Here are some alternative explanations:

  • This just represents the continuation of the well-known trend, seen in the USA and Europe as well as the UK, for increasing antidepressant use. This is my personal best guess and Ben Goldacre points out that rates rose 36% during the boom years of 2000-2005.
  • Depression has not got more common, it's just that it's more likely to be treated. This overlaps with the first theory. Support for this comes from the fact that suicide rates haven't risen - at least not by anywhere near 40%.
  • Mental illness is no more likely to be treated, but it's more likely to be treated with antidepressants, as opposed to other drugs. There was, and is, a move to get people off drugs like benzodiazepines, and onto antidepressants. However I suspect this process is largely complete now.
  • Total antidepressant use isn't rising but SSRI use is, because doctors increasingly prescribe SSRIs as opposed to other drugs. This was another Ben Goldacre suggestion and it is surely a factor, although again, I suspect that this process was largely complete by 2007.
  • People are more likely to be taking multiple different antidepressants, which would manifest as a rise in prescriptions, even if the total number of users stayed constant. Add-on treatment with mirtazapine and others is becoming more popular.
  • People are staying on antidepressants for longer meaning more prescriptions. This might not even mean that they're staying ill for longer, it might just mean that doctors are getting better at convincing people to keep taking them by e.g. prescribing drugs with milder side effects, or by referring people for psychotherapy which could increase use by keeping people "in the system" and taking their medication. This is very likely. I previously blogged about a paper showing that in 1993 to 2005, antidepressant prescriptions rose although rates of depression fell, because of a small rise in the number of people taking them for very long periods.
  • Mental illness rates are rising, but it's not depression: it's anxiety, or something else. Entirely plausible since we know that many people taking antidepressants, in the USA, have no diagnosable depression and even no diagnosable psychiatric disorder at all.
  • People are relying on the NHS to prescribe them drugs, as opposed to private doctors, because they can't afford to go private. Private medicine in the UK is only a small sector so this is unlikely to account for much but it's the kind of thing you need to think about.
  • Rates of depression have risen, but it's nothing to do with the economy, it's something else which happened between 2007 and 2010: the Premiership of Gordon Brown? The assassination of Benazir Bhutto? The discovery of a 2,100 year old Japanese melon?
Personally, my money's on the melon.
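Incidentally, the two headline growth figures above can be put on a common footing with some back-of-the-envelope arithmetic. A minimal sketch, assuming the 43% rise spans four yearly steps (2006-2010) and Goldacre's 36% spans five (2000-2005) - the exact windows may differ:

```python
# Back-of-the-envelope check, assuming the article's figures:
# prescriptions rose 43% over 2006-2010 (four yearly steps) and,
# per Ben Goldacre, 36% over 2000-2005 (five yearly steps).
def annual_rate(total_rise, years):
    """Compound annual growth rate implied by a total fractional rise."""
    return (1 + total_rise) ** (1 / years) - 1

recession_era = annual_rate(0.43, 4)
boom_era = annual_rate(0.36, 5)
print(f"2006-2010: {recession_era:.1%}/yr, 2000-2005: {boom_era:.1%}/yr")
```

On those assumptions the implied annual growth is roughly 9% per year in the later period versus 6% in the earlier one - faster, but an upward trend in both, which is at least consistent with the "continuation of a long-running trend" reading.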

Amy Bishop, Neuroscientist Turned Killer

Across at Wired, Amy Wallace has a long but riveting article about Amy Bishop, the neuroscience professor who shot her colleagues at the University of Alabama last year, killing three.

It's a fascinating article because of the picture it paints of a killer and it's well worth the time to read. Yet it doesn't really answer the question posed in the title: "What Made This University Scientist Snap?"

Wallace notes the theory that Bishop snapped because she was denied tenure at the University, a serious blow to anyone's career and especially to someone who, apparently, believed she was destined for great things. However, she points out that the timing doesn't fit: Bishop was denied tenure several months before the shooting. And she shot at some of the faculty who voted in her favor, ruling out a simple "revenge" motive.

But even if Bishop had snapped the day after she found out about the tenure decision, what would that explain? Thousands of people are denied tenure every year. This has been going on for decades. No-one except Bishop has ever decided to pick up a gun in response.

Bishop had always displayed a streak of senseless violence; in 1986, she killed her 18-year-old brother with a shotgun in her own kitchen. She was 21. The death was ruled an accident, but probably wasn't. It's not clear what it was, though: Bishop had no clear motive.

Amy had said something that upset her father. That morning they’d squabbled, and at about 11:30 am, Sam, a film professor at Northeastern University, left the family’s Victorian home to go shopping... Amy, 21, was in her bedroom upstairs. She was worried about “robbers,” she would later tell the police. So she loaded her father’s 12-gauge pump-action shotgun and accidentally discharged a round in her room. The blast struck a lamp and a mirror and blew a hole in the wall...

The gun, a Mossberg model 500A, holds multiple rounds and must be pumped after each discharge to chamber another shell. Bishop had loaded the gun with number-four lead shot. After firing the round into the wall, she could have put the weapon aside. Instead, she took it downstairs and walked into the kitchen. At some point, she pumped the gun, chambering another round.

...[her mother] told police she was at the sink and Seth was by the stove when Amy appeared. “I have a shell in the gun, and I don’t know how to unload it,” Judy told police her daughter said. Judy continued, “I told Amy not to point the gun at anybody. Amy turned toward her brother and the gun fired, hitting him.”

Years later Bishop, possibly with the help of her husband, sent a letter-bomb to a researcher who'd sacked her, Paul Rosenberg. Rosenberg avoided setting off the suspicious package and police disarmed it; Bishop was questioned, but never charged.

Wallace argues that Bishop's "eccentricity", or instability, was fairly evident to those who knew her but that in the environment of science, it went unquestioned because science is full of eccentrics.

I'm not sure this holds up. It's certainly true that science has more than its fair share of oddballs. The "mad scientist" trope is a stereotype but it has its basis in fact and it has done at least since Newton; many say that you can't be a great scientist and be entirely 'normal'.

But the problem with this, as a theory for why Bishop wasn't spotted sooner, is that she was spotted sooner, as unhinged, albeit not as a potential killer, by a number of people. Rosenberg sacked her, in 1993, on the grounds that her work was inadequate and said that "Bishop just didn’t seem stable". And in 2009, the reason Bishop was denied tenure in Alabama was partially that one of her assessors referred to her as "crazy", more than once; she filed a complaint on that basis.

Bishop also published a bizarre paper in 2009 written by herself, her husband, and her three children, of "Cherokee Lab Systems", a company which was apparently nothing more than a fancy name for their house. There may be a lot of eccentrics in science, but that's really weird.

So I think that all of these attempts at an explanation fall short. Amy Bishop is a black swan; she is the first American professor to do what she did. Hundreds of thousands of scientists have been through the same academic system and only one ended up shooting their colleagues. If there is an explanation, it lies within Bishop herself.

Whether she was suffering from a diagnosable mental illness is unclear. Her lawyer has said so, but he would; it's her only defence. Maybe we'll learn more at the trial.

H/T: David Dobbs for linking to this.

The Mystery of "Whoonga"


According to a disturbing BBC news story, South African drug addicts are stealing medication from HIV+ people and using it to get high:

'Whoonga' threat to South African HIV patients

"Whoonga" is, allegedly, the street name for efavirenz (aka Stocrin), one of the most popular antiretroviral drugs. The pills are apparently crushed, mixed with marijuana, and smoked for their hallucinogenic effects.

This is not, in fact, a new story; Scientific American covered it 18 months ago and the BBC themselves did in 2008 (although they didn't name efavirenz).

Edit 16:00: In fact the picture is even messier than I first thought. Some sources, e.g. Wikipedia and the articles it links to, mostly from South Africa, suggest that "whoonga" is actually a 'brand' of heroin and that the antiretrovirals may not be the main ingredient, if they're an ingredient at all. If this is true, then the BBC article is misleading. Edit: see the Comments for more on this...

Why would an antiviral drug get you high? This is where things get rather mysterious. Efavirenz is known to enter the brain, unlike most other HIV drugs, and psychiatric side-effects including anxiety, depression, altered dreams, and even hallucinations are common in efavirenz use, especially with high doses (1,2,3), but they're usually mild and temporary. But what's the mechanism?

No-one knows, basically. Blank et al found that efavirenz causes a positive result on urine screening for benzodiazepines (like Valium). This makes sense given the chemical structure:
Efavirenz is not a benzodiazepine, because it doesn't have the defining diazepine ring (the one with two Ns). However, as you can see, it has a lot in common with certain benzos such as oxazepam and lorazepam.

However, while this might well explain why it confuses urine tests, it doesn't by itself go far to explaining the reported psychoactive effects. Oxazepam and lorazepam don't cause hallucinations or psychosis, and they reduce anxiety, rather than causing it.

They also found that efavirenz caused a false positive for THC, the active ingredient in marijuana; this was probably caused by the glucuronide metabolite. Could this metabolite have marijuana-like effects? No-one knows at present.

Beyond that there's been little research on the effects of efavirenz in the brain. This 2010 paper reviewed the literature and found almost nothing. There were some suggestions that it might affect inflammatory cytokines or creatine kinase, but these are not obvious candidates for the reported effects.

Could the liver be responsible, rather than the brain? Interestingly, the 2010 paper says that efavirenz inhibits three liver enzymes: CYPs 2C9, 2C19, and 3A4. All three are involved in the breakdown of THC, so, in theory, efavirenz might boost the effects of marijuana by this mechanism - but that wouldn't explain the psychiatric side effects seen in people who are taking the drug for HIV and don't smoke weed.

Drugs that cause hallucinations generally either agonize 5HT2A receptors or block NMDA receptors. Off the top of my head, I can't see any similarities between efavirenz and drugs that target those systems like LSD (5HT2A) or ketamine or PCP (NMDA), but I'm no chemist and anyway, structural similarity is not always a good guide to what drugs do.

If I were interested in working out what's going on with efavirenz, I'd start by looking at GABA, the neurotransmitter that's the target of benzos. Maybe the almost-a-benzodiazepine-but-not-quite structure means that it causes some unusual effects on GABA receptors? No-one knows at present. Then I'd move on to 5HT2A and NMDA receptors.

Finally, it's always possible that the users are just getting stoned on cannabis and mistakenly thinking that the efavirenz is making it better through the placebo effect. Stranger things have happened. If so, it would make the whole situation even more tragic than it already is.

Cavalcante GI, Capistrano VL, Cavalcante FS, Vasconcelos SM, Macêdo DS, Sousa FC, Woods DJ, & Fonteles MM (2010). Implications of efavirenz for neuropsychiatry: a review. The International Journal of Neuroscience, 120(12), 739-745. PMID: 20964556

The Web of Morgellons

A fascinating new paper: Morgellons Disease, or Antipsychotic-Responsive Delusional Parasitosis, in an HIV Patient: Beliefs in The Age of the Internet

“Mr. A” was a 43-year-old man...His most pressing medical complaint was worrisome fatigue. He was not depressed...had no formal psychiatric history, no family psychiatric history, and he was a successful businessman.

He was referred to the psychiatry department by his primary-care physician (PCP) because of a 2-year-long complaint of pruritus [itching] accompanied by the belief of being infested with parasites. Numerous visits to the infectious disease clinic and an extensive medical work-up...had not uncovered any medical disorder, to the patient’s great frustration.

Although no parasites were ever trapped, Mr. A caused skin damage by probing for them and by applying topical solutions such as hydrogen peroxide to “bring them to the surface.” After reading about Morgellons disease on the Internet, he “recalled” extruding particles from his skin, including “dirt” and “fuzz.”

During the initial consultation visit with the psychiatrist, Mr. A was apprehensive but cautiously optimistic that a medication could help. The psychiatrist had been forewarned by the PCP that the patient had discovered a website describing Morgellons and “latched onto” this diagnosis.

However, it was notable that the patient allowed the possibility (“30%”) that he was suffering from delusions (and not Morgellons), mostly because he trusted his PCP, “who has taken very good care of me for many years.”

The patient agreed to a risperidone [an antipsychotic] trial of up to 2 mg per day. [i.e. a lowish dose]. Within weeks, his preoccupation with being infested lessened significantly... Although not 100% convinced that he might not have Morgellons disease, he is no longer pruritic and is no longer damaging his skin or trying to trap insects. He remains greatly improved 1 year later.
(Mr A. had also been HIV+ for 20 years, but he still had good immune function and the HIV may have had nothing to do with the case.)

"Morgellons" is, according to people who say they suffer from it, a mysterious disease characterised by the feeling of parasites or insects moving underneath the skin, accompanied by skin lesions out of which emerge strange, brightly-coloured fibres or threads. Other symptoms include fatigue, aches and pains, and difficulty concentrating.

According to almost all doctors, there are no parasites, the lesions are caused by the patient's own scratching or attempts to dig out the non-existent critters, and the fibres come from clothes, carpets, or other textiles which the patient has somehow inserted into their own skin. It may seem unbelievable that someone could do this "unconsciously", but stranger things have happened.

As the authors of this paper, Freudenreich et al, say, Morgellons is a disease of the internet age. It was "discovered" in 2002 by Mary Leitao, with Patient Zero being her own 2-year-old son. Since then its fame, and the reported number of cases, have grown steadily - especially in California.

Delusional parasitosis is the opposite of Morgellons: doctors believe in it, but the people who have it, don't. It's seen in some mental disorders and is also quite common in abusers of certain drugs like methamphetamine. It feels like there are bugs beneath your skin. There aren't, but the belief that there are is very powerful.

This then is the raw material in most cases; what the concept of "Morgellons" adds is a theory, a social context and a set of expectations that helps make sense of the otherwise baffling symptoms. And as we know, expectations, whether positive or negative, tend to become experiences. The diagnosis doesn't create the symptoms out of nowhere but rather takes them and reshapes them into a coherent pattern.

As Freudenreich et al note, doctors may be tempted to argue with the patient - you don't have Morgellons, there's no such thing, it's absurd - but the whole point is that mainstream medicine couldn't explain the symptoms, which is why the patient turned to less orthodox ideas.

Remember the extensive tests that came up negative "to the patient’s great frustration." And remember that "delusional parasitosis" is not an explanation, just a description, of the symptoms. To diagnose someone with that is saying "We've no idea why but you've imagined this". True, maybe, but not very palatable.

Rather, they say, doctors should just suggest that maybe there's something else going on, and should prescribe a treatment on that basis. Not rejecting the patient's beliefs but saying, maybe you're right, but in my experience this treatment makes people with your condition feel better, and that's why you're here, right?

Whether the pills worked purely as a placebo or whether there was a direct pharmacological effect, we'll never know. Probably it was a bit of both. It's not clear that it's important, really. The patient improved, and it's unlikely that it would have worked as well if they'd been given in a negative atmosphere of coercion or rejection - if indeed he'd agreed to take them at all.

Morgellons is a classic case of a disease that consists of an underlying experience filtered through the lens of a socially-transmitted interpretation. But every disease is that, to a degree. Even the most rigorously "medical" conditions like cancer also come with a set of expectations and a social meaning; psychiatric disorders certainly do.

I guess Morgellons is too new to be a textbook case yet - but it should be. Everyone with an interest in the mind, everyone who treats diseases, and everyone who's ever been ill - everyone really - ought to be familiar with it because while it's an extreme case, it's not unique. "All life is here" in those tangled little fibres.

Freudenreich O, Kontos N, Tranulis C, & Cather C (2010). Morgellons disease, or antipsychotic-responsive delusional parasitosis, in an HIV patient: beliefs in the age of the Internet. Psychosomatics, 51(6), 453-457. PMID: 21051675

WMDs vs MDD

Weapons of Mass Destruction. Nuclear, chemical and biological weapons. They're really nasty, right?

Well, some of them are. Nuclear weapons are Very Destructive Indeed. Even a tiny one, detonated in the middle of a major city, would probably kill hundreds of thousands. A medium-sized nuke could kill millions. The biggest would wipe a small country off the map in one go.

Chemical and biological weapons, on the other hand, while hardly nice, are just not on the same scale.

Sure, there are nightmare scenarios - a genetically engineered supervirus that kills a billion people - but they're hypothetical. If someone does design such a virus, then we can worry. As it is, biological weapons have never proven very useful. The 2001 US anthrax letters killed 5 people. Jared Loughner killed 6 with a gun he bought from a chain store.

Chemical weapons are little better. They were used heavily in WW1 and the Iran-Iraq War against military targets and killed many but never achieved a decisive victory, and the vast majority of deaths in these wars were caused by plain old bullets and bombs. Iraq's use of chemical weapons against Kurds in Halabja killed perhaps 5,000 - but this was a full-scale assault by an advanced air force, lasting several hours, on a defenceless population.

When a state-of-the-art nerve agent was used in the Tokyo subway attack, after much preparation by the cult responsible, who had professional chemists and advanced labs, 13 people died. In London on the 7th July 2005, terrorists killed 52 people with explosives made from haircare products.

Nuclear weapons aside, the best way to cause mass destruction is just to make an explosion, the bigger the better; yet conventional explosives, no matter how big, are not "WMDs", while chemical and biological weapons are.

So it seems to me that the term and the concept of "WMDs" is fundamentally unhelpful. It lumps together the apocalyptically powerful with the much less destructive. If you have to discuss everything except guns and explosives in one category, terms like "Unconventional weapons" are better as they avoid the misleading implication that all of these weapons are very, and equivalently, deadly; but grouping them together at all is risky.

That's WMDs. But there are plenty of other unhelpful concepts out there, some of which I've discussed previously. Take the concept of "major depressive disorder", for example. At least as the term is currently used, it lumps together extremely serious cases requiring hospitalization with mild "symptoms" which 40% of people experience by age 32.

Boy Without A Cerebellum...Has No Cerebellum

A reader pointed me to this piece:

Boy Without a Cerebellum Baffles Doctors
Argh. This is going to be a bit awkward. So I'll just say at the outset that I have nothing against kids struggling with serious illnesses and I wish them all the best.


The article's about Chase Britton, a boy who apparently lacks two important parts of the brain: the cerebellum and the pons. Despite this, the article says, Chase is a lovely kid and is determined to be as active as possible.

As I said, I am all in favor of this. However, the article runs into trouble when it starts to argue that "doctors are baffled" by this:

When he was 1 year old, doctors did an MRI, expecting to find he had a mild case of cerebral palsy. Instead, they discovered he was completely missing his cerebellum -- the part of the brain that controls motor skills, balance and emotions.

"That's when the doctor called and didn't know what to say to us," Britton said in a telephone interview. "No one had ever seen it before. And then we'd go to the neurologists and they'd say, 'That's impossible.' 'He has the MRI of a vegetable,' one of the doctors said to us."

Chase is not a vegetable, leaving doctors bewildered and experts rethinking what they thought they knew about the human brain.

They don't say which doctor made the "vegetable" comment but whoever it was deserves to be hit over the head with a large marrow because it's just not true. The cerebellum is more or less a kind of sidekick for the rest of the brain. Although it actually contains more brain cells than the rest of the brain put together (they're really small ones), it's not required for any of our basic functions such as sensation or movement.

Without it, you can still move, because movement commands are initiated in the motor cortex. Such movement is clumsy and awkward (ataxia), because the cerebellum helps to coordinate things like posture and gait, getting the timing exactly right to allow you to move smoothly. Like how your mouse makes it easy and intuitive to move the cursor around the screen.

Imagine if you had no mouse and had to move the cursor with a pair of big rusty iron levers to go left and right, up and down. It would be annoying, but eventually, maybe, you could learn to compensate.

From the footage of Chase alongside the article it's clear that he has problems with coordination, albeit he's gradually learning to be able to move despite them.

Lacking a pons is another kettle of fish however. The pons is part of your brainstem and it controls, amongst other things, breathing. In fact you (or rather your body) can survive perfectly well if the whole of your brain above the pons is removed; only the brainstem is required for vital functions.

So it seems very unlikely that Chase actually lacks a pons. The article claims that scans show that "There is only fluid where the cerebellum and pons should be" but as Steven Novella points out in his post on the case, the pons might be so shrunken that it's not easily visible - at least not in the place it normally is - yet functional remnants could remain.

As for the idea that the case is bafflingly unique, it's not really. There are no less than 6 known types of pontocerebellar hypoplasia caused by different genes; Novella points to a case series of children whose cerebellums seemed to develop normally in the womb, but then degenerated when they were born prematurely, which Chase was.

The article has had well over a thousand comments and has attracted lots of links from religious websites amongst others. The case seems, if you believe the article, to mean that the brain isn't all that important, almost as if there was some kind of immaterial soul at work instead... or at the very least suggesting that the brain is much more "plastic" and changeable than neuroscientists suppose.

Unfortunately, the heroic efforts that Chase has been required to make to cope with his disability suggest otherwise and as I've written before, while neuroplasticity is certainly real it has its limits.

The Social Network and Anorexia

Could social networks be more important than the media in the spread of eating disorders?

There's a story about eating disorders roughly like this: eating disorders (ED) are about wanting to be thin. The idea that thinness is desirable is something that's spread by Western media, especially visual media i.e. TV and magazines. Therefore, Western media exposure causes eating disorders.

It's a nice simple theory. And it seems to fit with the fact that eating disorders, hitherto very rare, start to appear in a certain country in conjunction with the spread of Westernized media. A number of studies have shown this. However, a new paper suggests that there may be rather more to it: Social network media exposure and adolescent eating pathology in Fiji.

Fiji is a former British colony, a tropical island nation of less than a million. Just over half the population are ethnic native Fijian people. Until recently, these Fijians were relatively untouched by Western culture, but this is starting to change.

The authors of this study surveyed 523 Fijian high school girls. Interviews took place in 2007. They asked them various questions relating to, on the one hand, eating disorder symptoms, and on the other hand, their exposure to various forms of media.

They looked at both individual exposure - hours of TV watched, electronic entertainment in the home - and "indirect" or "social network" exposure, such as TV watched by the parents, and the amount of electronic entertainment their friends owned. On top of this they measured Westernization/"globalization", such as the amount of overseas travel by the girls or their parents.

So what happened? Basically, social network media exposure, urbanization, and Westernization correlated with ED symptoms, but when you controlled for those variables, personal media exposure didn't correlate. Here's the data; the column I've highlighted is the data where each variable is controlled for the others. The correlations are pretty small (0 is none, 1.0 would be perfect) but significant.
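To see why the controlling step matters, here's a minimal simulation - illustrative only, with made-up numbers, not the Fiji data - of how a raw correlation between personal media exposure and ED symptoms can largely vanish once a shared driver like social-network exposure is partialled out:

```python
# Illustrative simulation: personal media exposure (P) and ED symptoms (E)
# are both driven by social-network exposure (S), so P and E correlate
# "raw" but not once S is controlled for via regression residuals.
import random

random.seed(42)
n = 500

def pearson(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def residuals(y, x):
    """Residuals of y after a simple linear regression on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    beta = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
           sum((a - mx) ** 2 for a in x)
    return [b - (my + beta * (a - mx)) for a, b in zip(x, y)]

S = [random.gauss(0, 1) for _ in range(n)]            # confound
P = [0.7 * s + random.gauss(0, 1) for s in S]          # driven by S
E = [0.8 * s + random.gauss(0, 1) for s in S]          # also driven by S

raw = pearson(P, E)                                    # P appears to "predict" E
partial = pearson(residuals(P, S), residuals(E, S))    # S controlled for
print(f"raw r = {raw:.2f}, partial r = {partial:.2f}")
```

The raw correlation is sizeable while the partial one hovers near zero - the same qualitative pattern as the study's finding that personal exposure stops mattering once social-network exposure and Westernization are accounted for.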


They conclude that:

Although consistent with the prevailing sociocultural model for the relation between media exposure and disordered eating... our finding, that indirect exposure to media content may be even more influential than direct exposure in this particular social context, is novel.
The idea that eating disorders are simply a product of a culture which values thinness as attractive has always seemed a bit shaky to me because people with anorexia frequently starve themselves far past the point of being attractive even by the unrealistic standards of magazines and movies.

In fact, if eating disorders were just an attempt to "look good", they wouldn't be nearly so dangerous as they are, because no matter how thin-obsessed our culture may be, no-one thinks this is attractive, or normal, or sane. But this, or worse, is what a lot of anorexics end up as.

On the other hand, eating disorders are associated with modern Western culture. There must be a link, but maybe it's more complicated than just "thin = good" causes anorexia. What if you also need the idea of "eating disorders"?

This was the argument put forward by Ethan Watters in Crazy Like Us (my review)... in his account of the rise of anorexia in Hong Kong. Essentially, he said, anorexia was vanishingly rare in Hong Kong until after the much-publicized death of a 14 year old girl, Charlene Chi-Ying, in the street. As he put it:
In trying to explain what happened to Charlene, local reporters often simply copied out of American diagnostic manuals. The mental-health experts quoted in the Hong Kong papers and magazines confidently reported that anorexia in Hong Kong was the same disorder that appeared in the United States and Europe...

As the general public and the region's mental-health professionals came to understand the American diagnosis of anorexia, the presentation of the illness in [Hong Kong psychiatrist] Lee's patient population appeared to transform into the more virulent American standard. Lee once saw two or three anorexic patients a year; by the end of the 1990s he was seeing that many new cases each month.
Now it's important not to see this as trivializing the condition or as a way of blaming the victim; "they're just following a trend!". You only have to look at someone with anorexia to see that there is nothing trivial about it. However, that doesn't mean it's not a social phenomenon.

It's a long way from the data in this study to Watters' conclusions, but maybe not an impossible leap. Part of Westernization, after all, is exposure to Western ideas about what is healthy eating and what's an eating disorder...

Becker, A., Fay, K., Agnew-Blais, J., Khan, A., Striegel-Moore, R., & Gilman, S. (2011). Social network media exposure and adolescent eating pathology in Fiji. The British Journal of Psychiatry, 198(1), 43-50. DOI: 10.1192/bjp.bp.110.078675

Psychoanalysis: So Bad It's Good?

Many of the best things in life are terrible.


We all know about the fun to be found in failure, as exemplified by Judge A Book By Its Cover and of course FailBlog. The whole genre of B-movie appreciation is based on the maxim of: so bad, it's good.

But could the same thing apply to psychotherapies?

Here's the argument. Freudian psychoanalysis is a bit silly. Freud had pretensions to scientific respectability, but never really achieved it, and with good reason. You can believe Freud, and if you do, it kind of makes sense. But to anyone else, it's a bit weird. If psychoanalysis were a person, it would be the Pope.

By contrast, cognitive-behavioural therapy is eminently reasonable. It relies on straightforward empirical observations of the patient's symptoms, and on trying to change people's beliefs by rational arguments and real-life examples ("behavioural experiments"). CBT practitioners are always keen to do randomized controlled trials to provide hard evidence for their success. CBT is Richard Dawkins.

But what if the very irrationality of psychoanalysis is its strength? Mental illness is irrational. So's life, right? So maybe you need an irrational kind of therapy to deal with it.

This is almost the argument advanced by Robert Rowland Smith in a short piece In Defence of Psychoanalysis:

...The irony is that in becoming more “scientific”, CBT becomes less therapeutic. Now, Freud himself liked to be thought of as a scientist (he began his career in neurology, working on the spinal ganglia), but it’s the non-scientific features that make psychoanalysis the more, not the less, powerful.

I’m referring to the therapeutic relationship itself. Although like psychoanalysis largely a talking cure, CBT prefers to set aside the emotions in play between doctor and patient. Psychoanalysis does the reverse. To the annoyance no doubt of many a psychoanalytic patient, the very interaction between the two becomes the subject-matter of the therapy.

The respected therapist and writer Irvin Yalom, among others, argues that depression and associated forms of sadness stem from an inability to make good contact with others. Relationships are fundamental to happiness. And so a science that has the courage to include the doctor’s relationship with the patient within the treatment itself, and to work with it, is a science already modelling the solution it prescribes. What psychoanalysis loses in scientific stature, it gains in humanity.

Rowland Smith's argument is that psychoanalysis offers a genuine therapeutic relationship complete with transference and countertransference, while CBT doesn't. He also suggests that analysis is able to offer this relationship precisely because it's unscientific.

Human relationships aren't built on rational, scientific foundations. They can be based on lots of stuff, but reason and evidence ain't high on the list. Someone who agrees with you on everything, or helps you to discover things, is a colleague, but not yet a friend unless you also get along with them personally. Working too closely together on some technical problem can indeed prevent friendships forming, because you never have time to get to know each other personally.

Maybe CBT is just too sensible: too good at making therapists and patients into colleagues in the therapeutic process. It provides the therapist with a powerful tool for understanding and treating the patient's symptoms, at least on a surface level, and involving the patient in that process. But could this very rationality make a truly human relationship impossible?

I'm not convinced. For one thing, there can be no guarantee that psychoanalysis does generate a genuine relationship in any particular case. But you might say that you can never guarantee that, so that's a general problem with all such therapy.

More seriously, psychoanalysis still tries to be scientific, or at least technical, in that it makes use of a specialist vocabulary and ideas ultimately derived from Sigmund Freud. Few psychoanalysts today agree with Freud on everything, but, by definition, they agree with him on some things. That's why they're called "psychoanalysts".

But if psychoanalysis works because of the therapeutic relationship, despite, or even because, Freud was wrong about most things... why not just chat about the patient's problems with the minimum of theoretical baggage? Broadly speaking, counselling is just that. Rowland Smith makes an interesting point, but it's far from clear that it's an argument for psychoanalysis per se.

Note:
A truncated version of this post briefly appeared earlier because I was a wrong-button-clicking klutz this morning. Please ignore that if you saw it.

Left Wing vs. Right Wing Brains

So apparently: Left wing or right wing? It's written in the brain

People with liberal views tended to have increased grey matter in the anterior cingulate cortex, a region of the brain linked to decision-making, in particular when conflicting information is being presented...

Conservatives, meanwhile, had increased grey matter in the amygdala, an area of the brain associated with processing emotion.

This was based on a study of 90 young adults using MRI to measure brain structure. Sadly that press release is all we know about the study at the moment, because it hasn't been published yet. The BBC also have no fewer than three radio shows about it here, here and here.

Politics blog Heresy Corner discusses it...
Subjects who professed liberal or left-wing opinions tended to have a larger anterior cingulate cortex, an area of the brain which, we were told, helps process complex and conflicting information. (Perhaps they need this extra grey matter to be able to cope with the internal contradictions of left-wing philosophy.)

This kind of story tends to attract chuckle-some comments.

In truth, without seeing the full scientific paper, we can't know whether the differences they found were really statistically solid, or whether they were voodoo or fishy. The authors, Geraint Rees and Ryota Kanai, have both published a lot of excellent neuroscience in the past, but that's no guarantee.

In fact, however, I suspect that the brain is just the wrong place to look if you're interested in politics, because most political views don't originate in the individual brain; they originate in the wider culture and are absorbed and regurgitated without much thought. This is a real shame, because all of us, left or right, have a brain, and it's really quite nifty.

But when it comes to politics we generally don't use it. The brain is a powerful organ designed to help you deal with reality in all its complexity. For a lot of people, politics doesn't take place there, it happens in fairytale kingdoms populated by evil monsters, foolish jesters, and brave knights.

Given that the characters in this story are mindless stereotypes, there's no need for empathy. Because the plot comes fully-formed from TV or a newspaper, there's no need for original ideas. Because everything is either obviously right or obviously wrong, there's not much reasoning required. And so on. Which is why this happens amongst other things.

I don't think individual personality is very important in determining which political narratives and values you adopt: your family background, job, and position in society is much more important.

Where individual differences matter, I think, is in deciding how "conservative" or "radical" you are within whatever party you find yourself. Not in the sense of left or right, but in terms of how keen you are on grand ideas and big changes, as opposed to cautious, boring pragmatism.

In this sense, there are conservative liberals (e.g. Obama) and radical conservatives (e.g. Palin), and that's the kind of thing I'd be looking for if I were trying to find political differences in the brain.

Links: If right wingers have bigger amygdalae, does that mean patient SM, the woman with no amygdalae at all, must be a communist? Then again, Neuroskeptic readers may remember that the brain itself is a communist...

XMRV - Innocent on All Counts?

A bombshell has just gone off in the continuing debate over XMRV, the virus that may or may not cause chronic fatigue syndrome. Actually, 4 bombshells.

A set of papers out today in Retrovirology (1,2,3,4) claim that many previous studies claiming to have found the virus haven't actually been detecting XMRV at all.

Here's the rub. XMRV is a retrovirus, a class of bugs that includes HIV. Retroviruses are composed of RNA, but they can insert themselves into the genetic material of host cells as DNA. This is how they reproduce: once their DNA is part of the host cell's chromosomes, that cell ends up making more copies of the virus.

But there are lots of retroviruses out there, and there used to be yet others that are now extinct. So bits of retroviral DNA are scattered throughout the genome of animals. These are called endogenous retroviruses (ERVs).

XMRV is extremely similar to certain ERVs found in the DNA of mice. And mice are the most popular laboratory mammals in the world. So you can see the potential problem: laboratories all over the world are full of mice, but mouse DNA might show up as "XMRV" DNA on PCR tests.

Wary virologists take precautions against this by checking specifically for mouse DNA. But most mouse-contamination tests are targeted at mouse mitochondrial DNA (mtDNA). In theory, a test for mouse mtDNA is all you need, because mtDNA is found in all mouse cells. In theory.

Now the four papers (or are they the Four Horsemen?) argue, in a nutshell, that mouse DNA shows up as "XMRV" on most of the popular tests that have been used in the past, that mouse contamination is very common - even some of the test kits are affected! - and that tests for mouse mtDNA are not good enough to detect the problem.

  • Hue et al say that "Taqman PCR primers previously described as XMRV-specific can amplify common murine ERV sequences from mouse suggesting that mouse DNA can contaminate patient samples and confound specific XMRV detection." They go on to show that some human samples previously reported as infected with XMRV are actually infected with a hybrid of XMRV and a mouse ERV which we know can't infect humans.
  • Sato et al report that PCR testing kits from Invitrogen, a leading biotech company, are contaminated with mouse genes including an ERV almost identical to XMRV, and that this shows up as a false positive using commonly used PCR primers "specific to XMRV".
  • Oakes et al say that in 112 CFS patients and 36 healthy controls, they detected "XMRV" in some samples but all of these samples were likely contaminated with mouse DNA because "all samples that tested positive for XMRV and/or MLV DNA were also positive for the highly abundant IAP long terminal repeat [found only in mice] and most were positive for murine mitochondrial cytochrome oxidase sequences [found only in mice]"
  • Robinson et al agree with Oakes et al: they found "XMRV" in some human samples, in this case prostate cancer cells, but they then found that all of the "infected" samples were contaminated with mouse DNA. They recommend that in future, samples should be tested for mouse genes such as the IAP long terminal repeat or cytochrome oxidase, and that researchers should not rely on tests for mouse mtDNA.
They're all open-access so everyone can take a peek. For another overview see this summary published alongside them in Retrovirology.
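To make the logic of the recommendation concrete, here's a hypothetical sketch (the function and its flags are my own invention, not taken from the papers) of how a lab might interpret a "positive" XMRV PCR result along the lines Robinson et al suggest: don't trust it unless the sample is also negative for highly abundant mouse-specific sequences like the IAP long terminal repeat and mouse cytochrome oxidase, rather than relying on mouse mtDNA alone.

```python
# Hypothetical screening logic, sketched from the papers' recommendation.
# A sample that tests "XMRV positive" by PCR is only a candidate true
# positive if it also tests negative for abundant mouse-specific markers.

def xmrv_call(pcr_xmrv_positive, iap_ltr_positive, mouse_cox_positive):
    """Return a hedged interpretation of one sample's PCR results."""
    if not pcr_xmrv_positive:
        return "negative"
    if iap_ltr_positive or mouse_cox_positive:
        # Mouse-specific sequences present: the "XMRV" signal is suspect
        return "likely mouse contamination"
    return "possible true positive"

# Example: a sample positive for "XMRV" and for the mouse IAP LTR
print(xmrv_call(True, True, False))  # likely mouse contamination
```

The point of the sketch is simply that a negative mouse-mtDNA test (the traditional check) doesn't appear anywhere in it: on these papers' account, it isn't sensitive enough to rule contamination out.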

I lack the technical knowledge to evaluate these claims, no doubt plenty of people will be rushing to do that before long. (Update: The excellent virologyblog has a more technical discussion of these studies.) But there are a couple of things to bear in mind.

Firstly, these papers cast doubt on tests using PCR to detect XMRV DNA. However, they don't have anything to say about studies which have looked for antibodies against XMRV in human blood, at least not directly. There haven't been many of these, but the paper which started the whole story, Lombardi et al (2009), did look for, and found, anti-XMRV immunity, and also used various other methods to support the idea that XMRV is present in humans. So this isn't an "instant knock-out" of the XMRV theory, although it's certainly a serious blow.

Secondly, if the 'mouse theory' is true, it has serious implications for the idea that XMRV causes chronic fatigue syndrome and also for the older idea that it's linked to prostate cancer. But it still leaves a mystery: why were the samples from CFS or prostate cancer patients more likely to be contaminated with mouse DNA than the samples from healthy controls?

Robert A. Smith (2010). Contamination of clinical specimens with MLV-encoding nucleic acids: implications for XMRV and other candidate human retroviruses. Retrovirology. DOI: 10.1186/1742-4690-7-112

The Almond of Horror

Remember the 90s, when No Fear stuff was cool, and when people still said "cool"?

Well, a new paper has brought No Fear back, by reporting on a woman who has no fear - due to brain damage. The article, The Human Amygdala and the Induction and Experience of Fear, is brought to you by a list of neuroscientists including big names such as Antonio Damasio (of Phineas Gage fame).

The basic story is nice and simple. There's a woman, SM, who lacks a part of the brain called the amygdala. They found that she can't feel fear. Therefore, it's reasonable to assume that the amygdala's required for fear. But there's a bit more to it than that...

The amygdala is a small nugget of the brain nestled in the medial temporal lobe. The name comes from the Greek for "almond" because apparently it looks like one, though I can't say I've noticed the resemblance myself.

What does it do? Good question. There are two main schools of thought. Some think that the amygdala is responsible for the emotion of fear, while others argue that its role is much broader and that it's responsible for measuring the "salience" or importance of stimuli, which covers fear but also much else.

That's where this new paper comes in, with the patient SM. She's not a new patient: she's been studied for years, and many papers have been published about her. I wonder if her acronym doesn't stand for "Scientific Motherlode"?

She's one of the very few living cases of Urbach-Wiethe disease, an extremely rare genetic disorder which causes selective degeneration of the amygdala as well as other symptoms such as skin problems.

Previous studies on SM mostly focussed on specific aspects of her neurological function e.g. memory, perception and so on. However there have been a few studies of her "everyday" experiences and personality. Thus we learned that:

Two experienced clinical psychologists conducted "blind" interviews of SM (the psychologists were not provided any background information)... Both reached the conclusion that SM expressed a normal range of affect and emotion... However, they both noted that SM was remarkably dispassionate when relating highly emotional and traumatic life experiences... To the psychologists, SM came across as a "survivor", as being "resilient" and even "heroic".

These observations were based on interviews under normal conditions; what would happen if you actually went out of your way to try and scare her? So they did.

First, they took her to an exotic pet store and got her to meet various snakes and spiders. She was perfectly happy picking up the various critters and had to be prevented from getting too closely acquainted with the more dangerous ones.

What's fascinating is that before she went to the store, she claimed to hate snakes and spiders! Why? Before she developed Urbach-Wiethe disease, she had a normal childhood up to about the age of 10. Presumably she used to be afraid of them, and just never updated this belief, a great example of how our own narratives about our feelings can clash with our real feelings.

They subsequently confirmed that SM was fearless by taking her to a "haunted asylum" (check it out, even the website is scary) and showing her various horror movie clips, as well as through interviews with SM herself and with her son. They also describe an incredible incident from several years ago: SM was walking home late at night when she saw
A man, whom SM described as looking “drugged-out.” As she walked past the park, the man called out and motioned for her to come over. SM made her way to the park bench. As she got within arm’s reach of the man, he suddenly stood up, pulled her down to the bench by her shirt, stuck a knife to her throat, and exclaimed, “I’m going to cut you, bitch!”

SM claims that she remained calm, did not panic, and did not feel afraid. In the distance she could hear the church choir singing. She looked at the man and confidently replied, “If you’re going to kill me, you’re gonna have to go through my God’s angels first.” The man suddenly let her go. SM reports “walking” back to her home. On the following day, she walked past the same park again. There were no signs of avoidance behavior and no feelings of fear.

All this suggests that the amygdala has a key role in the experience of fear, as opposed to other emotions: there is no evidence to suggest that SM lacks the ability to experience happiness or sadness in the same way.

So this is an interesting contribution to the debate on the role of the amygdala, although we really need someone to do equally detailed studies on other Urbach-Wiethe patients to make sure that it's not just that SM happens to be unusually brave for some other reason. What's doubly interesting, though, is that Ralph Adolphs, one of the authors, has previously argued against the view of the amygdala as a "fear center".

Links: I've previously written about the psychology of horror movies and I've reviewed quite a lot of them too.

Justin S. Feinstein, Ralph Adolphs, Antonio Damasio, & Daniel Tranel (2010). The Human Amygdala and the Induction and Experience of Fear. Current Biology

Wikileaks: A Conversation

"Wikileaks is great. It lets people leak stuff."

"Hang on, so you're saying that no-one could leak stuff before? They invented it?"

"Well, no, but they brought leaking to the masses. Sure, people could post documents to the press before, but now anyone in the world can access the leaks!"

"Great, but isn't that just the internet that did that? If it weren't for Wikileaks, people could just upload their leaks to a blog. Or email them to 50 newspapers. Or put them on the torrents. Or start their own site. If it's good, it would go viral, and be impossible to take down. Just like Wikileaks, with all their mirrors, except even more secure, because there'd be literally no-one to arrest or cut off funding to."

"OK, but Wikileaks is a brand. It's not about the technical stuff - it's the message. Like one of their wallpapers says, they're synonymous with free speech."

"So you think it's a good thing that one organization has become synonymous with the whole process of leaking? With the whole concept of openness? What will happen to the idea of free speech, then, if that brand image suddenly gets tarnished - like, say, if their founder and figurehead gets convicted of a serious crime, or..."

"He's innocent! Justice for Julian!"

"Quite possibly, but why do you care? Is he a personal friend?"

"It's an attack on free speech!"

"So you agree that one man has become synonymous with free speech? Doesn't that bother you?"

"Erm... well. Look, fundamentally, we need Wikileaks. Before, there was no centralized system for leaking. Anyone could do it. It was a mess! Wikileaks put everything in one place, and put a committee of experts in a position to decide what was worth leaking and what wasn't. It brought much-needed efficiency and respectability to the idea of leaking. Before Wikileaks, it was anarchy. They're like... the government."

"..."

Edit: See also The Last Psychiatrist's take.

Online Comments: It's Not You, It's Them

Last week I was at a discussion about New Media, and someone mentioned that they'd been put off from writing content online because of a comment on one of their articles accusing them of being "stupid".

I found this surprising - not the comment, but that anyone would take it so personally. It's the internet. You will get called names. Everyone does. It doesn't mean there's anything wrong with you.

I suspect this is a generational issue. People who 'grew up online' know, as Penny Arcade explained, that

The sad fact is that there are millions of people whose idea of fun is to find people they disagree with, and mock them. And they're right, it can be fun - why else do you think people like Jon Stewart are so popular? - but that's all it is, entertainment. If you're on the receiving end, don't take it seriously.

If you write something online, and a lot of people read it, you will get slammed. Someone, somewhere, will disagree with you and they'll tell you so, in no uncertain terms. This is true whatever you write about, but some topics are like a big red rag to the herds of bulls out there.

Just to name a few, if you say anything vaguely related to climate change, religion, health, the economy, feminism or race, you might as well be holding a placard with a big arrow pointing down at you and "Sling Mud Here" on it.

The point is - it's them, not you. They are not interested in you, they don't know you, it's not you. True, they might tailor their insults a bit; if you're a young woman you might be, say, a "stupid girl" where a man would merely get called an "idiot". But this doesn't mean that the attacks are a reflection on you in any way. You just happen to be the one in the line of fire.

What do you do about this? Nothing.

Trying to enter into a serious debate is pointless. Insulting them back can be fun, just remember that if you find it fun, you've become one of them: "he who stares too long into the abyss...", etc. Complaining to the moderators might help, but unless the site has a rock solid zero-tolerance-for-fuckwads policy, probably not. Where the blight has taken root, like Comment is Free, I'd not waste your time complaining. Just ignore it and carry on.

The most important thing is not to take it personally. Do not get offended. Do not care. Because no-one else cares. Especially the people who wrote the comments. They presumably care about whatever "issue" prompted their attack, but they don't care about you. If anything, you should be pleased, because on the internet, the only stuff that doesn't attract stupid comments is the stuff that no-one reads.

I've heard these attacks referred to as "policing" existing hierarchies or "silencing" certain types of people. This seems to me to be granting them far more respect than they deserve. With the actual police, if you break the rules, they will physically arrest you. They have power. Internet trolls don't: if they succeed in policing or silencing anybody, it's because their targets let them boss them around. They're nobody; they're not your problem.

If you can't help being offended by such comments, don't read them, but ideally you shouldn't need to resort to that. For one thing, it means you miss the sensible comments (and there's always a few). But fundamentally, you shouldn't need to do this, because you really shouldn't care what some anonymous joker from the depths of the internet thinks about you.

The Town That Went Mad

Pont St. Esprit is a small town in southern France. In 1951 it became famous as the site of one of the most mysterious medical outbreaks of modern times.

As Drs Gabbai, Lisbonne and Pourquier wrote to the British Medical Journal, 15 days after the "incident":

The first symptoms appeared after a latent period of 6 to 48 hours. In this first phase, the symptoms were generalized, and consisted in a depressive state with anguish and slight agitation.

After some hours the symptoms became more clearly defined, and most of the patients presented with digestive disturbances... Disturbances of the autonomic nervous system accompanied the digestive disorders-gusts of warmth, followed by the impression of "cold waves", with intense sweating crises. We also noted frequent excessive salivation.

The patients were pale and often showed a regular bradycardia (40 to 50 beats a minute), with weakness of the pulse. The heart sounds were rather muffled; the extremities were cold... Thereafter a constant symptom appeared - insomnia lasting several days... A state of giddiness persisted, accompanied by abundant sweating and a disagreeable odour. The special odour struck the patient and his attendants.

In most patients, these symptoms, including the total insomnia, persisted for several days. In some of the patients, these symptoms progressed to full-blown psychosis:

Logorrhoea [speaking a lot], psychomotor agitation, and absolute insomnia always presaged the appearance of mental disorders. Towards evening visual hallucinations appeared, recalling those of alcoholism. The particular themes were visions of animals and of flames. All these visions were fleeting and variable.

In many of the patients they were followed by dreamy delirium. The delirium seemed to be systematized, with animal hallucinations and self-accusation, and it was sometimes mystical or macabre. In some cases terrifying visions were followed by fugues, and two patients even threw themselves out of windows... Every attempt at restraint increased the agitation.

In severe cases muscular spasms appeared, recalling those of tetanus, but seeming to be less sustained and less painful... The duration of these periods of delirium was very varied. They lasted several hours in some patients, in others they still persist.

In total, about 150 people suffered some symptoms. About 25 severe cases developed the "delirium". 4 people died "in muscular spasm and in a state of cardiovascular collapse"; three of these were old and in poor health, but one was a healthy 25-year-old man.

At first, the cause was assumed to be ergotism - poisoning caused by chemicals produced by a fungus which can infect grain crops. Contaminated bread was, therefore, thought to be responsible. Ergotism produces symptoms similar to those reported at Pont St. Esprit, including hallucinations, because some of the toxins are chemically related to LSD.

However, there have been other theories. Some (including Albert Hofmann, the inventor of LSD) attribute the poisoning to pesticides containing mercury, or to the flour bleaching agent nitrogen trichloride.

More recently, journalist Hank Albarelli claimed that it was in fact a CIA experiment to test out the effects of LSD as a chemical weapon, though this is disputed. What really happened is, in other words, still a mystery.

Link: The Crazies (2010) is a movie about a remarkably similar outbreak of mass insanity in a small town.

Gabbai, Lisbonne, & Pourquier (1951). Ergot poisoning at Pont St. Esprit. British Medical Journal, 2 (4732), 650-1. PMID: 14869677

Brain Scans Prove That The Brain Does Stuff

According to the BBC (and many others)...

Libido problems 'brain not mind'

Scans appear to show differences in brain functioning in women with persistently low sex drives, claim researchers.

The US scientists behind the study suggest it provides solid evidence that the problem can have a physical origin.

The research in question (which hasn't been published yet) has been covered very well over at The Neurocritic. Basically the authors took some women with a diagnosis of "Hypoactive Sexual Desire Disorder" (HSDD), and some normal women, put them in an fMRI scanner and showed them porn. Different areas of the brain lit up.

So what? For starters we have no idea if these differences are real or not, because the study included just 7 normal women, although strangely, it included a full 19 women with HSDD. Maybe they had difficulty finding women with healthy appetites in Detroit?

Either way, a study is only as big as its smallest group so this was tiny. We're also not told anything about the stats they used so for all we know they could have used the kind that give you "results" if you use them on a dead fish.
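As a rough back-of-the-envelope illustration of why the smallest group dominates (assuming, for simplicity, equal standard deviations in both groups): the standard error of a group mean shrinks with the square root of the sample size, so the uncertainty in a comparison between groups is driven mostly by the tiny control group, and bulking up the other group helps surprisingly little.

```python
import math

def se_mean(sd, n):
    # Standard error of a single sample mean: sd / sqrt(n)
    return sd / math.sqrt(n)

def se_diff(sd, n1, n2):
    # Standard error of the difference between two group means,
    # assuming (for illustration only) equal SDs in both groups
    return sd * math.sqrt(1 / n1 + 1 / n2)

# With sd = 1: the n=7 control group contributes most of the noise
print(round(se_diff(1, 7, 19), 2))   # ~0.44
print(round(se_diff(1, 19, 19), 2))  # ~0.32, even with both groups at 19
```

In other words, taking 19 patients instead of 7 barely helps while the control group stays at 7; the study is, effectively, a 7-person study.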

But let's grant that the results are valid. This doesn't tell us anything we didn't already know. We know the women differ in their sexual responses - because that's the whole point of the study. And we know that this must be something to do with their brain, because the brain is where sexual responses, and every other mental event, happen.

So we already know that HSDD "has a physical origin", but only in the sense that everything does; being a Democrat or a Republican has a physical origin; being Christian or Muslim has a physical origin; speaking French as opposed to English has a physical origin; etc. etc.

None of which is interesting or surprising in the slightest.

The point is that the fact that something is physical doesn't stop it being also psychological. Because psychology happens in the brain. Suppose you see a massive bear roaring and charging towards you, and as a result, you feel scared. The fear has a physical basis, and plenty of physical correlates like raised blood pressure, adrenaline release, etc.

But if someone asks "Why are you scared?", you would answer "Because there's a bear about to eat us", and you'd be right. Someone who came along and said, no, your anxiety is purely physical - I can measure all these physiological differences between you and a normal person - would be an idiot (and eaten).

Now sometimes anxiety is "purely physical" i.e. if you have a seizure which affects certain parts of the temporal lobe, you may experience panic and anxiety as a direct result of the abnormal brain activity. In that case the fear has a physiological cause, as well as a physiological basis.

Maybe "HSDD" has a physiological cause. I'm sure it sometimes does; it would be very weird if it didn't in some cases because physiology can cause all kinds of problems. But fMRI scans don't tell us anything about that.

Link: I've written about HSDD before in the context of flibanserin, a drug which was supposed to treat it (but didn't). Also, as always, British humour website The Daily Mash hit this one on the head.