University of Iowa Audit Reveals Shortcomings in its Human Subjects Research Program

The University of Iowa (UI) generates $467 million in external research funding, an all-time high for the academic medical research center. However, those dollars could be at risk, along with a prestigious research program. Why? UI recently concluded an audit that surfaced troubling findings about administrative efficiency, productivity and business prioritization, or the lack thereof. Consequently, UI's competitive position among its academic medical research center peers may be weaker than desired. Full Institutional Review Board (IRB) application processes can take materially longer than the national median, making UI a candidate for transformation of at least some aspects of its research program: its culture, leadership, and way of conducting the business of research.

UI Takes a Month Longer

This audit of UI's Human Subjects Research division revealed that the IRB application, review and approval process can take a full month longer than the national median, a disturbingly large deviation from the norm. The audit team found that at UI the median duration for full IRB review and approval was 77 days, compared with the national median of 45 days.

Competitive Disadvantage

These findings, among others, put UI at a potential competitive disadvantage relative to other major academic research centers. Because the auditors also found that UI does not distinguish or prioritize research applications, the university is at real risk: sponsors of clinical research, whether big pharma, emerging biopharma, non-profit networks or government, may seek faster turnaround and more efficient research operations elsewhere. Ultimately, as reported recently in The Gazette, this reality forces UI research leadership and management to look in the collective mirror; its ability to support and advance potentially lifesaving or life-changing research, a core mission, is compromised.

The Audit

In November, an audit of the UI Human Subjects Office, Institutional Review Board and other committees revealed, among other things, that a bloated and bureaucratic culture was on the verge of stifling efficiency and expediency, leading to unnecessary duplication, deficient resources and overall research inefficiencies, as reported in The Gazette.

The auditors specifically reviewed trial proposals from the UI Holden Comprehensive Cancer Center, which are most often subject to greater scrutiny and involve more reviews and committee interface time, adding, according to the audit findings, "a significant number of days" in the lead-up to a clinical trial launch.

The audit findings revealed that cancer trials at the UI Holden Comprehensive Cancer Center take "between four and six months to review," and that "If (the cancer center) can decrease the amount of time it takes to review trials, it may be able to compete for more clinical trials, including early phase trials, which are often more complicated and more lucrative."

A High Impact Flag

Given the extent and severity of the audit findings, the auditors assigned a "high" impact flag, indicating that the university's shortcomings could have material effects on its ability to operate a clinical research program.

Audit Quote

According to the audit findings, “There is currently no prioritization of studies reviewed by the (IRB), resulting in the increased risk that high dollar, high impact, and high risk research proposals are not being timely processed.”

Soaring Workloads

The auditors found "soaring workloads" when assessing the business processes and workflows of key groups, including 1) the Investigational Drug Services committee, 2) Pharmacy and 3) the Therapeutic Resources committee, yet the number of staff dedicated to these functions has remained the same. What we see here is commonplace at many academic research centers: the amount of bureaucratic work increases while institutions hold the line on hires, creating an unsustainable situation.

UI Management Responses

To UI management's credit, they are ready and willing to listen, learn, step back, reevaluate and take action to improve the situation. According to the recent article in The Gazette, UI management will actively evaluate strategies to expedite and streamline reviews for relevant cancer center trials, for example. Their targeted responses to the audit findings are planned for July 2020.

Basic Auditor Recommendations

Some of the findings and recommendations include greater prioritization of research applications. For example, UI is including an assessment of financial viability earlier in the application review process so that it can expeditiously vet and cull studies that don't contribute to its strategic mission. UI also has an initiative underway "to improve budgeting and budget negotiations for clinical research."

Audit Evidences Administrative Issues, Not Safety & Quality Issues

The good news for the university, as highlighted by UI spokeswoman Jeneane Beck, is that the audit uncovered administrative challenges in carrying out its IRB programs but generally did not include findings on safety or quality; she noted that the university takes pride in the quality of its work and its emphasis on human subject protection.

Moreover, none of the audit findings point to non-compliance with federal or state regulations, international standards (e.g. ICH) or UI policies and procedures. UI appears quite keen to take this 'mid-life crisis' head on and undertake the internal assessment and improvement program necessary for the pursuit of research excellence. The Gazette noted throughout its report that UI seeks proactive resolution of the audit findings.

Bolster Committee Supports, Re-evaluate Processes & Review Peer Practices

UI is committed to improving its position in the market for clinical research services, with an emphasis on select areas such as the IRB application and review process. Hence, UI management responded that "The (Human Subjects Office) will survey Big Ten Academic Alliance Group and Council on Government Relations colleagues to identify current prioritization practices among peer institutions." This was in response to the audit finding that the UI review board indiscriminately processes unfunded study applications alongside high-impact, high-dollar or high-risk applications.

A Research Transformational Movement: A Culture of Excellence Key

Human research programs at major academic medical centers have long been hampered by bureaucratic and often stagnant organizational cultures. TrialSite News discussed bureaucratization at major academic research centers in our article "Is Bureaucracy Strangling Clinical Research: A Quality Guru Chimes In," which examined a provocative BMJ article on the growing "bureaucratization of clinical research."

A Movement

Larry Kennedy, a quality and productivity guru and co-founder of the Site Accreditation and Standards Institute (SASI), has worked with Dr. Greg Koski at the forefront of applying quality-based fundamentals, systems theory and pragmatic back-to-basics principles to help research sites truly get back to what they are good at: research.

Regarding research bureaucratization, Mr. Kennedy wrote that "the burden and weight of bureaucracy that has formed over clinical research is real and simply put, a good idea, namely inspecting the product of a work process, has gone bad by overemphasizing its importance."

As was conveyed in that TrialSite News article, "it's a familiar story of demanding more and better output from the clinical research team but at the same time adding metrics and inspection routines that do virtually nothing to improve the process. Metrics only tell us how far from the desired target we have wandered and inspections centered around those metrics only confirm the error rates."

He continued, “It’s a given that we know ‘what’ we want as outcomes and we know ‘how’ to measure our failure rates. Beyond our efforts to codify GCP and vigorously demand compliance to it, has anyone gotten beyond the ‘what’ and provided the ‘why’ and the ‘how’ for doing things right that will produce reliable data from the people, processes and tools that produce trial data?”

Mr. Kennedy and the highly impressive SASI team are "accrediting" research sites based on the Alliance for Clinical Research Excellence and Safety Site Accreditation and Standards standard, as published in the New England Journal of Medicine.