Methods: To assess how accurate this reported rate actually was, we closely examined the 30-day readmission rates for our department over an entire calendar year. Since its inception, the Crimson software has compiled data from more than 200 hospitals, accounting for over 25,000 physicians. Using this software, we performed two separate searches, one by “Attending Physician” and one by “Performing Physician.”
Results: Both searches revealed a 30-day readmission rate for our department that exceeded the hospital average. The software reported a departmental 30-day readmission rate of 9.30%, compared with the hospital average of 7.28%. However, manual review of each patient chart identified cases that were not readmissions attributable to the patient’s plastic surgery (e.g., several readmissions for planned staged procedures were included in this calculation, as were emergency room visits for unrelated, new orthopedic complaints). Excluding these cases from the original calculations brought our department’s readmission rate (7.02%) slightly below the hospital average for each search.
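The correction described above amounts to removing mis-classified encounters from the numerator before recomputing the rate. A minimal sketch of that arithmetic (all counts, variable names, and categories here are illustrative placeholders, not the study’s actual data):

```python
# Illustrative recalculation of a 30-day readmission rate after chart review.
# None of these counts are the study's real figures; they are toy values
# chosen only to show the shape of the adjustment.
flagged_readmissions = 9   # encounters the software counted as readmissions
staged_procedures = 1      # planned, staged reconstructions (not true readmissions)
unrelated_visits = 1       # e.g., ER visits for new, unrelated complaints
total_discharges = 100     # illustrative denominator

raw_rate = flagged_readmissions / total_discharges
true_readmissions = flagged_readmissions - staged_procedures - unrelated_visits
adjusted_rate = true_readmissions / total_discharges

print(f"raw: {raw_rate:.2%}  adjusted: {adjusted_rate:.2%}")
# raw: 9.00%  adjusted: 7.00%
```

The key point is that the denominator is unchanged; only numerator encounters that fail a clinical definition of “readmission related to the index plastic surgery” are removed.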
Discussion and Conclusion: This discrepancy highlights the margin of error of such automated physician-performance programs and brings to light pitfalls that physicians should be aware of in similar programs at their own institutions, which are increasingly being used to “rate” surgeon performance. As plastic surgeons are increasingly employed by hospitals, it behooves them to be cognizant of this issue and to feel empowered to question the data used to assess their performance.