Saturday, October 24, 2009 - 1:05 PM

The Use of Standardized Patients in Plastic Surgery Residency Curriculum: Teaching Core Competencies with OSCEs

Drew J. Davis, MD and Gordon K. Lee, MD.

INTRODUCTION: Historically, the evaluation and promotion of plastic surgery residents have been based upon subjective faculty feedback through global evaluation rating scales and multiple-choice exams. As of 2006, the Accreditation Council for Graduate Medical Education (ACGME) has defined six “core competencies” that all residency programs in the country must address and include in the teaching and evaluation of resident physicians: interpersonal communication skills (IPC), medical knowledge (MK), patient care (PC), professionalism (PROF), practice-based learning and improvement (PBLI), and systems-based practice (SB).

PURPOSE: The purpose of our study was to evaluate the utility of Objective Structured Clinical Examinations (OSCEs) and standardized patients as educational tools in plastic surgery residency. OSCEs have become a mandatory part of the United States Medical Licensing Examination (USMLE) but have not been widely adopted in residency training. We developed a novel use of the OSCE in plastic surgery residency education that assesses all six ACGME competencies.

METHODS: Six plastic surgery residents, two from each of the PGY-4 through PGY-6 levels, participated in a plastic surgery-specific OSCE focused on melanoma. The OSCE included a 30-minute videotaped encounter with a standardized patient actor, who provided feedback on the resident's ability to explain the diagnosis of melanoma (PC), answer questions (IPC), and provide appropriate informed consent and treatment recommendations (PROF). In addition, a post-encounter written examination (MK) and a set of tasks that included submitting a bill to insurance (SB) and reviewing the literature on the controversial use of Mohs surgery (PBLI) completed the exercise. Faculty experts scored the residents' performance in all six core competencies on a 3-point scale (1 = novice, 2 = moderately skilled, 3 = proficient). A debriefing session was held one week later with the faculty and residents to review the results.

RESULTS: Overall, residents scored well in interpersonal communication skills (84%), patient care (83%), professionalism (86%), and practice-based learning and improvement (84%). Scores in medical knowledge showed a positive correlation with level of training. All residents scored comparatively lower in systems-based practice (65%). The residents unanimously reported that the OSCE was realistic, educational, and well worthwhile.

CONCLUSIONS: The addition of the OSCE to our plastic surgery residency curriculum has had a tremendously positive impact on the educational experience of our trainees. It provides comprehensive and meaningful feedback in a dynamic and interactive format, and it identifies areas of strength and weakness for both the residents and the teaching program. Based upon the results of our study, our residency program has made significant curricular changes: increasing didactic sessions on skin cancer and melanoma management and increasing faculty emphasis on systems-based practice with respect to insurance, billing, and CPT coding. The OSCE as an assessment tool for the core competencies is a valuable adjunct to residency training and may have application to certifying examinations and Maintenance of Certification (MOC) in plastic surgery.