But one piece is real simple. BMJ makes only its full research reports open access, when what's freely available to the public should probably be the social and political analysis and commentary. This time, however, the policy works in our favor, with this report from RD Smyth and colleagues, PR Williamson senior author, Liverpudlians all.
To try to put this in a pistachio shell, they tracked down investigators whose published clinical trials seemed not to report all of the results they had obtained or specified in the trial protocol. We all know about publication bias -- negative findings (in this context, meaning the medication didn't work) tend not to get published, partly because of drug company perfidy and partly because of the biases of journal reviewers and editors. But these folks wanted to find out from the horse's mouth (where does that expression come from, anyway?) why these investigators published only some of what they found, or didn't publish on their protocol-specified outcomes at all.
I will digress to remind readers that it is very important to a) report your pre-specified outcomes and b) report negative findings. Not doing so can bias the overall weight of evidence. Pre-specified outcomes matter because they represent true hypothesis tests: you committed to the question before seeing the data, so the associated p values mean what they claim to mean and the evidence is strong. Reporting findings you weren't originally looking for invites spurious observations to be accepted as convincing evidence; measure enough outcomes and something will cross p < 0.05 by chance alone. Negative findings matter because a bias toward positive findings obviously makes interventions look better than they really are. And even more obviously, evidence of harm must be revealed.
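If you want to see that multiple-outcomes point in numbers, here is a minimal simulation. It is my own sketch, not anything from the BMJ paper, and the trial size, outcome count, and function names are all made up for illustration. It runs trials where the drug truly does nothing and compares the false positive rate of a single pre-specified outcome against what you get by quietly reporting the best-looking of 20 outcomes:

# Minimal sketch (illustrative numbers, not from the BMJ paper):
# why post hoc outcome picking breaks the p value guarantee.
import math
import random
import statistics

random.seed(42)

def two_sample_p(n_per_arm):
    """Simulate one outcome of a null trial (drug does nothing) and
    return an approximate two-sided p value from a z test."""
    treated = [random.gauss(0, 1) for _ in range(n_per_arm)]
    control = [random.gauss(0, 1) for _ in range(n_per_arm)]
    diff = statistics.mean(treated) - statistics.mean(control)
    se = math.sqrt(2 / n_per_arm)  # both arms have known variance 1
    z = abs(diff) / se
    # two-sided p value via the normal CDF
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

def run_trials(n_trials=2000, n_outcomes=20, n_per_arm=50):
    pre_specified_hits = 0  # "significant" on outcome #1, fixed in advance
    cherry_picked_hits = 0  # "significant" on the best of all 20 outcomes
    for _ in range(n_trials):
        p_values = [two_sample_p(n_per_arm) for _ in range(n_outcomes)]
        if p_values[0] < 0.05:
            pre_specified_hits += 1
        if min(p_values) < 0.05:
            cherry_picked_hits += 1
    print(f"Pre-specified outcome 'significant': {pre_specified_hits / n_trials:.1%}")
    print(f"Best-of-{n_outcomes} outcome 'significant': {cherry_picked_hits / n_trials:.1%}")

run_trials()

The pre-specified outcome comes up "significant" about 5% of the time, just as advertised. Shop among 20 independent outcomes and the rate jumps to roughly 64% (that's 1 - 0.95^20), even though the drug does nothing at all. That is what selective outcome reporting does to the evidence base.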
There was a very high refusal rate for this study, and you can reasonably presume that people were more likely to refuse when they thought they had something to hide. Also, industry funding was associated with frequent claims that trial protocols were confidential. So we can bet that the situation is worse than what the Liverpudlians could uncover.
Mostly, respondents said things like: negative and non-significant findings just weren't interesting; the word limits for journal articles made it hard to talk about everything; they just didn't think one or another result was important; or it turned out they couldn't recruit the sample size they needed within their budget. Somewhat shockingly, about half of the protocols -- which managed to get funding and ethical approval -- did not specify primary outcomes at all. Hmm. I'd like the contact info for those funders.
So this is not good news. For those who are suspicious of how evidence-based evidence-based medicine really is, your antennae should be twitching, because it looks like a lot of investigators aren't being trained properly and don't understand what they need to be doing. But there is also this quote:
When we looked at that data, it actually showed an increase in harm amongst those who got the active treatment, and we ditched it because we weren’t expecting it and we were concerned that the presentation of these data would have an impact on people’s understanding of the study findings. It wasn’t a large increase but it was an increase. I did present the findings on harm at two scientific meetings, with lots of caveats, and we discussed could there be something harmful about this intervention, but the overwhelming feedback that we got from people was that there was very unlikely to be anything harmful about this intervention, and it was on that basis that we didn’t present those findings. The feedback from people was, look, we don’t, there doesn’t appear to be a kind of framework or a mechanism for understanding this association and therefore you know people didn’t have faith that this was a valid finding, a valid association, essentially it might be a chance finding. I was kind of keen to present it, but as a group we took the decision not to put it in the paper. The argument was, look, this intervention appears to help people, but if the paper says it may increase harm, that will, it will, be understood differently by, you know, service providers. So we buried it. I think if I was a member of the public I would be saying ‘what you are promoting this intervention you thought it might harm people—why aren’t you telling people that?’
Res ipsa loquitur. And I'd sure like to know what that shit is so I can make sure not to take it.