by David Tuller, DrPH
David Tuller is academic coordinator of the concurrent master's degree program in public health and journalism at the University of California, Berkeley.
‘The PACE trial is a fraud.’ Ever since Virology Blog posted my 14,000-word investigation of the PACE trial last October, I’ve wanted to write that sentence. (I should point out that Dr. Racaniello has already called the PACE trial a sham, and I’ve already referred to it as doggie-poo. I’m not sure that fraud is any worse. Whatever word you use, the trial stinks.)
Let me be clear: I don’t mean fraud in the legal sense (I’m not a lawyer) but in the sense that it’s a deceptive and morally bankrupt piece of research. The investigators made dramatic changes from the methodology they outlined in their protocol, which allowed them to report purported results that were much, much better than those they would have been able to claim under their originally planned methods. Then they reported only the better-looking results, with no sensitivity analyses to assess the impact of the changes, which is the standard statistical approach in such circumstances.
This is simply not allowed in science. It means the reported benefits for cognitive behavior therapy and graded exercise therapy were largely illusory–an artifact of the huge shifts in outcome assessments the authors introduced mid-trial. (That’s putting aside all the other flaws, like juicing up responses with a mid-trial newsletter promoting the interventions under investigation, failing to obtain legitimate informed consent from the participants, etc.)
That PACE suffered from serious methodological deficiencies should have been obvious to anyone who read the studies. That includes the reviewers for The Lancet, which published the PACE results for improvement in 2011 after what editor Richard Horton has called “endless rounds of peer review,” and the journal Psychological Medicine, which published results for recovery in 2013. Certainly the deficiencies should have been obvious to anyone who read the trenchant letters and commentaries that patients routinely published in response to the egregious errors committed by the PACE team. Even so, the entire U.K. medical, academic and public health establishments refused to acknowledge what was right before their eyes, finding it easier instead to brand patients as unstable, anti-science, and possibly dangerous.
Thanks to the efforts of the incredible Alem Matthees, a patient in Perth, Australia, the U.K.’s First-Tier Tribunal last month ordered the liberation of the PACE trial data he’d requested under a freedom-of-information request. (The brief he wrote for the April hearing, outlining the case against PACE in great detail, was a masterpiece.) Instead of appealing, Queen Mary University of London, the home institution of lead PACE investigator Peter White, made the right decision. On Friday, September 9, the university announced its intention to comply with the tribunal ruling, and sent the data file to Mr. Matthees. The university has a short window of time before it has to release the data publicly.
I’m guessing that QMUL forced the PACE team’s hand by refusing to allow an appeal of the tribunal decision. I doubt that Dr. White and his colleagues would ever have given up their data willingly, especially now that I’ve seen the actual results. Perhaps administrators had finally tired of the PACE shenanigans, recognized that the study was not worth defending, and understood that continuing to fight would further harm QMUL’s reputation. It must be clear to the university now that its own reputational interests diverge sharply from those of Dr. White and the PACE team. I predict that the split will become more apparent as the trial’s reputation and credibility crumble; I don’t expect QMUL spokespeople to be out there vigorously defending the unacceptable conduct of the PACE investigators.
Last weekend, several smart, savvy patients helped Mr. Matthees analyze the newly available data, in collaboration with two well-known academic statisticians, Bruce Levin from Columbia and Philip Stark from Berkeley. Yesterday, Virology Blog published the group’s findings of the single-digit, non-statistically significant recovery rates the trial would have been able to report had the investigators adhered to the methods they outlined in the protocol. That’s a remarkable drop from the original Psychological Medicine paper, which claimed that 22 percent of those in the favored intervention groups achieved recovery, compared to seven percent for the non-therapy group.
Now it’s clear: The PACE authors themselves are the anti-science faction. They tortured their data and ended up producing sexier results. Then they claimed they couldn’t share their data because of alleged worries about patient confidentiality and sociopathic anti-PACE vigilantes. The court dismissed these arguments as baseless, in scathing terms. (It should be noted that their ethical concerns for patients did not extend to complying with a critical promise they made in their protocol, to tell prospective participants about any possible conflicts of interest in obtaining informed consent. Given this omission, they have no legitimate informed consent for any of their 641 participants and therefore should not be allowed to publish any of their data at all.)
The day before QMUL released the imprisoned data to Mr. Matthees, the PACE authors themselves posted a pre-emptive re-analysis of results for the two primary outcomes of physical function and fatigue, according to the protocol methods. In the Lancet paper, they had revised and weakened their own definition of what constituted improvement. With this revised definition, they could report in The Lancet that approximately 60 percent of participants in the cognitive behavior and graded exercise therapy arms improved to a clinically significant degree on both fatigue and physical function.
The re-analysis the PACE authors posted last week sought to put the best possible face on the very poor data they were required to release. Yet patients examining the new numbers quickly noted that, under the more stringent definition of improvement outlined in the protocol, only about 20 percent in the two groups could be called overall improvers. Solely by introducing a more relaxed definition of improvement, the PACE team, enabled by The Lancet’s negligence and an apparently inadequate endless review process, was able to triple the trial’s reported success rate.
So now it’s time to ask what happens to the papers already published. The editors have made their feelings clear. I have written multiple e-mails to Lancet editor Richard Horton since I first contacted him about my PACE investigation, almost a year before it ran. He never responded until September 9, the day QMUL liberated the PACE data. Given that the PACE authors’ own re-analysis yielded significantly less impressive results than those published in The Lancet, I sent Dr. Horton a short e-mail asking when we could expect some sort of addendum or correction to the 2011 paper. He responded curtly: “Mr. Tuller–We have no such plans.”
The editors of Psychological Medicine are Kenneth Kendler of Virginia Commonwealth University and Robin Murray of Kings College London. After I wrote to the journal last December, pointing out the problems, I received the following from Dr. Murray, whose home base is KCL’s Department of Psychosis Studies: “Obviously the best way of addressing the truth or otherwise of the findings is to attempt to replicate them. I would therefore like to encourage you to initiate an attempted replication of the study. This would be the best way for you to contribute to the debate… Should you do this, then Psychological Medicine will be most interested in the findings either positive or negative.”
This was not an appropriate response. I told Dr. Murray it was disgraceful, given that the paper was so obviously flawed. This week, I wrote again to Dr. Murray and Dr. Kendler, asking if they now planned to deal with the paper’s problems, given the re-analysis by Matthees et al. In response, Dr. Murray suggested that I submit a re-analysis, based on the released data, and Psychological Medicine would be happy to consider it. “We would, of course, send it out to referees for scientific scrutiny in the same manner as we did for the original paper,” he wrote.
I explained that it was his and the journal’s responsibility to address the problems, whether or not anyone submitted a re-analysis. I also noted that I could not improve on the Matthees re-analysis, which completely rebutted the results reported in Psychological Medicine’s paper. I urged Dr. Murray to contact either Dr. Racaniello or Mr. Matthees to discuss republishing it, if he truly wished to contribute to the debate. Finally, I noted that the peer reviewers for the original paper had okayed a study in which participants could be disabled and recovered simultaneously, so I wasn’t sure the journal’s assessment process could be trusted.
(By the way, Kings College London, where Dr. Murray is based, is also the home institution of PACE investigator Trudie Chalder as well as Simon Wessely, a close colleague of the PACE authors and president of the Royal College of Psychiatrists*. That could explain Dr. Murray’s inability or reluctance to acknowledge that the recovery paper his journal peer-reviewed and published is meaningless.)
Earlier today, the PACE authors posted a blog post on The BMJ site, their latest effort to salvage their damaged reputations. They make no mention of their massive research errors and focus only on their supposed fears that releasing even anonymous data will frighten away future research participants. They have provided no evidence to back up this unfounded claim, and the tribunal flatly rejected it. They also state that only researchers who present pre-specified analysis plans should be able to obtain trial data. This is laughable, since Dr. White and his colleagues abandoned their own pre-specified analyses in favor of analyses they decided they preferred much later on, long after the trial started.
They have continued to refer to their reported analyses, deceptively, as pre-specified, even though these methods were revised mid-trial. The following point has been stated many times before, but bears repeating: In an open label trial like PACE, researchers are likely to know very well what the outcome trends are before they review any actual data. So the PACE team’s claim that the changes they made were pre-specified because they were made before reviewing outcome data is specious. I have tried to ask them about this issue multiple times, and have never received an answer.
Dr. White, his colleagues, and their defenders don’t yet seem to grasp that the intellectual construct they invented and came to believe in (the PACE paradigm, the PACE enterprise, the PACE cult; take your pick) is in a state of collapse. They are used to saying whatever they want about patients (“Internet abuse! Knife-wielding! Death threats!!”) and having it be believed. In responding to legitimate concerns and questions, they have covered up their abuse of the scientific process by providing non-answers, evasions and misrepresentations, the academic publishing equivalent of “the dog ate my homework.” Amazingly, journal editors, health officials, reporters and others have accepted these non-responsive responses as reasonable and sufficient. I do not.
Now their work is finally being scrutinized the way it should have been by peer reviewers before this damaging research was ever published in the first place. The fallout is not going to be pretty. If nothing else, they have provided a great gift to academia with their $8 million** disaster: for years to come, graduate students in the U.S., the U.K. and elsewhere will be dissecting PACE as a classic case study of bad research and mass delusion.
*Correction: The original version of the post mistakenly called the organization the Royal Society of Psychiatrists.
**Correction: The original version of the post stated that PACE cost $8 million. In fact, PACE cost five million pounds, so the cost in dollars depends on the exchange rate used. The $8 million figure is based on the exchange rate from last October, when Virology Blog published my PACE investigation. But the pound has fallen since the Brexit vote in June, so the cost in dollars at the current exchange rate is closer to $6.4 million.