The Story of Chad Kerksick and the OU IRB

Wow!  Just wow!  I’ve been reading the background on the story from last November’s Huffington Post article on University of Oklahoma Assistant Professor Chad Kerksick.  As I mentioned in the previous post, I did some work with his lab a few years ago, and it was my first encounter with muscle punches or muscle cores in exercise physiology.  They were a nice group, lots of laughter and smiles and good questions, and everybody seemed to be friends with everyone else.  Paging through the Institutional Review Board’s investigative findings paints a very different picture.  One of Dr. Kerksick’s grad students/study managers blew the whistle on his research practices with regard to the way he enrolled volunteers, performed biopsies of muscle and fat tissue, subsequently handled the material, and even how he documented the results.

I might have a slightly different perspective on this than the average person.  I’ve mentioned before that what most people would find shocking in a research lab is how young everyone is.  Most research in academic labs is done by 20- to 26-year-olds (look at the picture above!), and rarely will you find anyone over 35 in the lab proper.  This is different from medical centers and institutes, as well as pharma and biotech companies, where the research staff tend to be older and better trained.

Imagine a company run entirely by one CEO and an army of interns, and you have an idea of what the academic research lab is like.  The “CEO,” or principal investigator, in this case is an exercise physiologist, and if my brief exposure is any guide, a good one.  He studies exercise, bodybuilding, and athletes.  He fits the image of the jock-scientist, and his group are mostly weightlifters, student athletes, and future sports medicine folks… not your classic molecular biology or medical group.

A needle biopsy device

They paid volunteers to exercise on a treadmill and then collected a sample of muscle tissue from the thigh using a hollow needle, larger than a syringe needle; alternatively, a sample of fat/adipose tissue was taken from the abdomen.  The volunteers received a lidocaine shot to prevent local pain, but the tissue is certainly traumatized, resulting in bruising and some light bleeding.  I’m sure it’s a very low-risk procedure, but it still makes me a bit queasy to imagine what is essentially minor invasive surgery being done in a gym area on a sweaty student by another student.  My own misgivings aside, this is not unlike what goes on at medical schools and med tech programs around the world.  It’s probably about as risky as having your ears pierced or getting a tattoo.

On to the accusations.  The review came about as the result of video and research records submitted by one of Chad’s grad students, Patrick Dib.  Patrick, going by statements Chad made in response, has been the instigator of other, similar investigations at USC and UCLA.  I’m very suspicious that I’m not getting the full story on his background, and I’m having a hard time making sense of the “he said, she said” but three things are clear:

1.  Chad Kerksick didn’t adhere to the highest standards of human research in terms of record keeping and minimizing human risk or suffering.  He was sloppy and irresponsible in supervising his grad students, probably as a result of all that laughter and all those smiles.  People were probably TOO chummy to really bring the axe down on misconduct.  “Hindsight is 20/20.”

2.  The kind of things the Institutional Review Board found were typical of many labs at primarily undergraduate institutions.  It’s like any other audit; if you look hard enough, you’ll find something.  Some criticisms included leaving lidocaine in an unlocked drawer in a locked room, or submitting his own muscle tissue to a study (which I didn’t know wasn’t allowed either… I’ve done it myself).

3.  The story in the local media and the HuffPo doesn’t accurately convey the magnitude of the infractions committed.  Emails with jokes like “bring me Dib’s head in a box, or at least his thumbs” are represented as serious death threats.  Muscle cores sound very scary, but they are routine in this field of study and have been performed hundreds of times without adverse event.

The resolution of the story:  Chad received a one-year leave of absence, agreed to leave OU during that time, and received 18 months’ salary as part of an arbitration.  His existing research has been halted by the IRB.

One note:  the most serious allegations were data tampering, which I don’t think was ever substantiated, and forcing students to submit to fat biopsies, which Kerksick insists was part of the training.  I find his explanation makes sense and I’m inclined to give him the benefit of the doubt.  Medical techs and ER techs often work on each other or themselves as practice, and it sounds like that’s what happened here: the procedures were part of education, not research.  I also can’t speak to his private funding sources.

What I would like the message from Chad’s story to be is that IRBs (the institutional review boards that govern human research) really need to start conducting regular audits with the same scrutiny they apply to investigations of incidents.  It protects the researcher and the institution from these kinds of “shocking revelations.”

For the rest of the story and further reading:

Read the IRB decision here.

Read Kerksick’s responses.

News9 story.

6 comments on “The Story of Chad Kerksick and the OU IRB”

  1. Firstly, I love your site and YouTube clips. They are excellent!
    I would like to comment on this though:
    “the most serious allegations were data tampering, which I don’t think was ever substantiated”
    It is very difficult to provide conclusive evidence that data fabrication has occurred. Even the most unlikely data set is still possible, and all a researcher needs to do is maintain that the data are true. The best data fabricators (in all likelihood the most experienced and competent scientists) will be careful – they will know how to avoid the obvious tell-tale signs of data fabrication/tampering (essentially the same thing, in my opinion). It’s when people are sloppy, or just poor or inexperienced researchers, that signs of data fabrication are more obvious. But even then it is very difficult to ‘prove beyond reasonable doubt’.
    I am peripherally involved in investigating a case of suspected data fabrication. Over 60 questionable papers have appeared in about 20 journals between 2001 and 2009. Initially I was the whistle-blower (simply after reading a number of articles with data that appeared to be too good to be true) and contacted the journal editors at the end of 2009. Only 4 editors responded to my email, of which 3 seemed concerned, and one decided to follow it up. I am now compiling evidence to show that fraud has taken place. In this case the evidence is quite clear. Obvious indicators are:
    – Implausible hypotheses confirmed with very clean data
    – Data too good to be true
    – Experiments that are extremely unlikely to attract participants
    – Unlikely participants
    – A single intervention successfully treating many physical conditions (about half the papers)
    – A single physical condition treated with many different (and implausible) interventions (the other half of the papers)
    – Incredible consistency between results of experimental studies with entirely different IVs and DVs
    – Standard errors increasing 10-fold (from 0.05 to 0.5) after a specific date (as if a reviewer might have said “these standard errors look too small to be correct”), yet means remaining the same.
    – Many papers are short reports or research letters (less likely to be peer reviewed).
    – A lone researcher
    – Email address not linked to an institution
    – Institutions difficult to track down (and internet searches of institutions often only link back to author’s publications)
    – Author not responding to editor’s requests
    – Institutions that were tracked down not responding to editor’s requests
    – No more publications from this author since the investigation started
    – And on and on …

    A researcher working in a similar field with the same name did produce legitimate, quality research (as far as I can tell – it’s not my area of expertise), but stopped publishing in 1998. The researcher in question emerges just over 2 years later with quite different studies.
    So how is this progressing? Not so well. The case has been with COPE (Committee on Publication Ethics) since early 2010 and it is still “on-going”. None of the papers have been retracted, no other editors appear to be interested in following this up (and some could get quite burnt if this went any further, not only for their poor editorial decisions but also for the lack of action regarding this issue), but I fear that little action will be taken. This particular example of misconduct appeared in a recent BMJ article illustrating how difficult it is to do anything about scientific misconduct.
    I should add that I am not saying that Kerksick is guilty of data tampering; I’m just pointing out how difficult it is to ‘prove’ and how easy it is to get away with it. Particularly if you are a smart and experienced researcher, but even if you are not so smart.
    Perhaps an interesting video on scientific misconduct would be a nice project.
    Contact me if you like – I am “Ag8MrE” on youtube (was meant to be ‘Agr8MrE’ but never saw the typo – duh!)
    Oh, and keep up the great work with the clips on youtube!

  2. Many years ago, when I was more of an athlete, I agreed to have a muscle biopsy taken from my thigh as part of a sports research project (without payment). The biopsy was performed at a research center under what I can only describe as hospital settings. And while the needle, or whatever it is called, was about ½ a centimeter in diameter, it went well.
    I only had some muscle soreness for about 2 weeks, and I have never had any trouble with it since.

    It hurts when it is done, but besides that I could not care less. For all I know, I helped science in a very small but personal way, without any long term effects on my body. Ta daa.

    And hey, almost everyone I know has no problem with doing such things, and has been part of every kind of research, be it psychology tests, blood tests, biopsies, etc.

    Danes and the expansion of human knowledge FTW, because we almost all do it for free. That is altruism incarnate, so to speak.

  3. Don’t you think submitting your own samples as a control would violate a double-blind randomized trial design, because you would know that you are not receiving the treatment, thereby eliminating the placebo response from the control group? And it would violate an observational study design because of selection bias.

    • Blind and double-blind designs are used when the conscious or subconscious can bias the results when certain information is known. So if you are testing depressants, or something like taste, then you have to do blind tests. If the study will not be biased by the subject knowing they are a control, it doesn’t need to be blind… I’m pretty sure, anyway. It’s been a while since I took statistics.
