A report obtained by Nature under the US Freedom of Information Act (FOIA) sheds light on the handling of a case of alleged misconduct at Duke University.
Duke continues to investigate the work of Anil Potti, a cancer genetics researcher whose publications formed the basis for three clinical trials that were closed in July 2010, after an investigative report in Cancer Letter revealed that Potti had inflated his resume, including falsely claiming to be a Rhodes scholar.
Potti claimed to have developed “predictors”: computer algorithms that convert gene expression data from patients’ cancer cells into yes/no answers about whether a cancer will be sensitive to particular drugs. The clinical trials involved more than 300 patients with lung or breast cancer, who were given one of several existing front-line therapies depending on the results of the predictors run on data from their cancer cells.
Potti resigned from Duke in November 2010, and took responsibility for errors in his published data. A major issue in the university’s handling of the case has been that concerns over the trials were raised publicly more than a year before they were terminated, when biostatisticians Keith Baggerly and Kevin Coombes of the University of Texas MD Anderson Cancer Center in Houston challenged the replicability of the predictors and questioned their experimental use in patients. In response, Duke briefly suspended the trials, only to restart them after a review panel charged by the university’s Institutional Review Board looked into the issue.
The newly released report sheds light on that panel’s thinking. It shows that the panel members found they were able to validate Potti’s work using original data he provided. But the report also reveals that the panel did not verify that the data provided matched the original raw data. Baggerly says it did not: “they had numbers with labels that the Duke group said applied, but the labels were wrong,” he says. Baggerly is particularly concerned by this because of events on November 9, 2009, while the panel was still deliberating. That day, he obtained data that Potti had placed online, noticed it contained errors relative to the public databases it was supposedly sourced from, and sent a document pointing that out to Duke’s Vice-President for Research, Sally Kornbluth, and Duke’s Vice-President for Medical Affairs, Michael Cuffe. He alleges that this information was never forwarded to the panel by the Duke administration. “We think the outside experts would have had a better chance of detecting the error if they’d been told that we’d already found it,” he says.
Kornbluth responds that the review was conducted under the auspices of the Duke Institutional Review Board, which did receive a copy of the document from her. But, she explains in a statement sent together with Cuffe, the board, in consultation with Duke’s leadership, decided not to forward it to the reviewers: “it was determined that it would be best to let the data, publications, etc., speak for themselves and not bias the independent investigation for or against any party. In retrospect, we did not realize that the data provided by our investigators were flawed (as the public record now shows), rendering an outside review addressing the methodology flawed as well. In hindsight, we would have ensured that the IRB provided all communication with Dr. Baggerly, recognizing the risk of bias. We’ve learned considerably from this process and are introducing key changes in the way we deal with research that will be translated to the clinical arena as a result,” they say.
In November, Potti’s co-author Joseph Nevins admitted that the data appeared to suffer from labeling problems and retracted a paper published in the Journal of Clinical Oncology.
A redacted copy of the Duke-commissioned report was previously obtained under the FOIA by Cancer Letter. The redactions left some parts of the panel’s thinking ambiguous. One redacted section dealt with the review panel’s use of a reference set, a dataset intended to help establish baseline levels of expression for the genes that were the subject of the predictor. Potti and co-authors did not previously use a reference set for their published work, and the report skates over the question of how they obtained good results without it, says Baggerly.
Duke’s second investigation into the matter is ongoing, and is expected to report to the Office of Research Integrity at the Department of Health and Human Services, which funded some of Potti’s work through NIH grants.
Image: West Campus / Duke Photography