The Medical College of Wisconsin provides physician services at three major affiliates — Froedtert Hospital, Children’s Hospital of Wisconsin and the Zablocki VA Medical Center — and at many other hospitals and clinics in the Milwaukee area. Each year, its physicians, physician assistants, nurse practitioners and psychologists care for more than 425,000 patients, representing more than 1.6 million patient visits.
In this article, we talk with Greg Holl, radiology informatics system manager at the Medical College of Wisconsin, about how workflow intelligence software can drive performance improvement and help enhance patient safety in a busy radiology department.
Q: How has workflow orchestration software helped your radiology department and emergency department work more closely together?
A: The radiology department provides “wet reads” and after-hours reads for our emergency department and trauma service. Residents do the preliminary interpretations overnight and meet with staff radiologists in the morning to review their work. The ED was looking for a quality metric to show that the reads provided by our resident trainees are of high quality, and the solution gave us the ability to create this workflow. Conserus Workflow Intelligence also allows us to monitor recall rates and compare them to national averages, and we’ve been able to show that our numbers are significantly better than average.
Q: How are you using this technology?
A: Peer and resident review are frequently used workflows, but we’ve found that we’re constantly growing with the product and finding new ways to apply it. For example, in addition to the standard peer review workflow, we created discrepancy reviews for educational purposes. We gather discrepancies and document them in a database, from which we generate reports for stakeholders; those reports are then used for quality improvement.
The tool is tightly integrated with our PACS, and it was very convenient to configure a workflow that collects this data and files it so we can go back and report on it later. We’re able to document the correlation between findings to show how often discrepancies occur.
Another situation where we track discrepancies is when an exam is performed at another facility and sent here for a second opinion. Tracking these discrepancies helps us show the value our radiologists bring to the table.
Q: What quality metrics do you track with this solution?
A: We look at certain quality metrics for compliance purposes, like participation rates for the American College of Radiology’s (ACR) National Radiology Data Registry. We collect the data on participation rates for random review generation, the number of discrepant reviews received and other metrics to build a custom report card for our physicians. The report card is searchable by physician and date range and is used for periodic reviews with the department chair. Some of the other metrics we collect include relative value units (RVUs) by individual, compared to section and national averages; report turnaround time for emergency department studies; status of certifications in basic life support; maintenance of certification (MOC); internal and external committee memberships; and more.
Q: Has it helped the college and its individual radiologists improve performance?
A: Participation in peer reviews has increased significantly, with nearly everyone meeting the requirement to randomly review colleagues’ exams within their subspecialty at a rate equal to 2 percent of their own study volumes.
Q: How have these processes affected peer review and resident review at the college?
A: The resident trainees asked for a tool that would let them receive feedback on their unattended wet reads, and the resident review workflow has filled that need nicely. We are able to route the feedback to the resident for evaluation, and it is ready for them when they come back on shift. The feedback is timely enough that the resident can still recall the details of the case. The attending radiologist also appreciates being able to send that feedback without having to note it somewhere and remember to follow up the next day.
Q: Do radiologists view it as a positive tool?
A: We don’t just collect discrepancies; we also use the tool for positive event reporting for both our radiologists and our technologists. We’re collecting events that had a very positive impact on patient outcomes, and we provide accolades to our technologists when they do a really good job with an exam. We can go back and congratulate them on a job well done. That has really helped acceptance of the tool. We don’t want this to be viewed as a punitive system. We want it to be a teaching tool — something people can learn and grow from through these reviews.
Q: What additional processes are you planning to add to your workflow?
A: We’re looking forward to the ability to fully anonymize reviews — to hide all identifying details on the exam including the author of the report and location of the exam. Right now it’s a very manual process to export images out of the PACS and drop them into a network share, and then redact the identifying information from reports. This process can now be automated with the workflow solution. We can clean up and display the images within the tool. It’s going to save a lot of time.
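As a rough illustration of the kind of de-identification step Holl describes — not the college’s or Conserus Workflow Intelligence’s actual implementation — the sketch below uses the open-source pydicom library to blank common identifying DICOM tags before images are shared for a blinded review. The tag list and file paths are assumptions for the example; a production workflow would follow a full de-identification profile.

```python
# Minimal sketch of automated image de-identification, assuming pydicom
# and hypothetical source/destination directories. Not the vendor's API.
from pathlib import Path
import pydicom

# Identifying attributes commonly blanked for a blinded review (illustrative subset).
TAGS_TO_BLANK = ["PatientName", "PatientID", "InstitutionName",
                 "ReferringPhysicianName", "AccessionNumber"]

def anonymize_study(src_dir: str, dst_dir: str) -> None:
    """Copy a study's DICOM files with identifying fields blanked."""
    out = Path(dst_dir)
    out.mkdir(parents=True, exist_ok=True)
    for src in Path(src_dir).glob("*.dcm"):
        ds = pydicom.dcmread(src)
        for tag in TAGS_TO_BLANK:
            if tag in ds:
                setattr(ds, tag, "")    # blank identifying attributes
        ds.remove_private_tags()        # drop vendor-specific private tags
        ds.save_as(out / src.name)

if __name__ == "__main__":
    # Hypothetical paths standing in for the PACS export and review share.
    anonymize_study("/data/exports/study123", "/data/anonymized/study123")
```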
Case study supplied by McKesson Corp.