
GME Evaluation Task Force Recommendation

July 1, 2008

PRACTICE-BASED LEARNING AND IMPROVEMENT



PBLI: TEACHING SKILLS
Overview

Per the ACGME Common Program Requirements, teaching skills are a component of PBLI (“participate in the education of patients, families, students, residents and other health professionals”) and ICS (“communicate effectively with physicians, other health professionals, and health related agencies”). If respect for learners or patients is included, teaching evaluations also provide evidence of a component of Professionalism: “compassion, integrity and respect for others.”

Core Measure for UCSF GME

Observation is the primary method by which clinical educators are rated. The SOM Clinical Educator Teaching Evaluation Form was developed at UCSF as a global assessment conducted at the end of a clinical rotation to assess the quality of medical student teaching by residents and faculty. The form consists of 19 items: 11 items rated on a 5-point Likert-type scale, 4 narrative/open-ended items, and 4 items that are triggered only if low scores are received on certain critical items. However, experience and internal studies of the form indicate that it can be shortened without losing reliability. We recommend this shorter form as our core measure of clinical teaching.

Reliability and Validity

Many instruments have been developed to measure clinical teaching effectiveness. Most of these instruments measure a global teaching effectiveness score along with interpersonal and clinical teaching/pedagogic practice dimensions, and they have high internal consistency. Each item in the SOM Clinical Educator form includes detailed anchors illustrating each point on the 5-point scale (1 = poor, 5 = excellent). Because of the internal consistency of these forms, a shorter item set has adequate reliability and content validity. Our proposed core measure includes: conveyance of information, teaching enthusiasm, direction and feedback, promotion of critical thinking, treating the learner with respect, treating others with respect, and overall teaching effectiveness. Research has recommended that scales be tailored to the learner (medical student vs. resident) and the setting (e.g., outpatient vs. inpatient); hence, additional items may be included but should be similar in format to the other items and include clear anchors.

Current Procedures by the SOM
Administration: Practices and evaluation frequency for faculty/residents as clinical teachers vary by (SOM) clerkship. Some rotations require a certain number or type of interactions to occur before a form is assigned; others ask learners to designate the faculty/residents they worked with, and the form is then assigned.
Dissemination: The forms are disseminated by each department; however, the (SOM) Office of Medical Education centrally oversees policies surrounding the use of a standardized form, user management, reporting, and procedural record keeping.
Reporting: Faculty and residents are able to view their own evaluation in real time. The SOM Office of Medical Education annually reports on aggregate faculty and resident teaching scores for each clerkship by site.

Administration by GME programs
Frequency: It is recommended that teaching be evaluated after a designated number of interactions between teacher and learner. The number of interactions depends on the length of the rotation and should be set accordingly.
Who Performs: The learners (students, more junior residents).
Scoring Criteria and Training: It is recommended that the form be publicly visible and that evaluators know the scoring/rating criteria in advance, so that they know what aspects of their instructors' teaching they will be rating. There is no training associated with the use of this evaluation.
Documentation: Twice annually as part of semi-annual review meetings.

Use of Data

How assessment results are used depends on the program. Timely feedback, both written and oral, between teachers and program directors will help encourage good teachers as well as remediate and improve weaker teaching. It is recommended that certain critical items on the form (e.g., teaching effectiveness, respect) create low-score triggers. These triggers should prompt additional evaluation items, closed-ended or narrative, so that the evaluator can elaborate on the low scores. A low score on any of the items, particularly the critical items, should trigger remediation.
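
As a minimal illustrative sketch only (not a description of how E*Value implements triggers), the low-score trigger logic could look like the following Python fragment; the item names, the threshold of 2 on the 5-point scale, and the wording of the follow-up prompt are assumptions:

    # Illustrative sketch only: follow-up items triggered by low scores on critical items.
    # Item names, the threshold of 2, and the prompt wording are assumptions.
    CRITICAL_ITEMS = {"overall_teaching_effectiveness",
                      "treats_learner_with_respect",
                      "treats_others_with_respect"}
    LOW_SCORE_THRESHOLD = 2  # on the 5-point scale (1 = poor, 5 = excellent)

    def follow_up_items(ratings):
        """Return narrative prompts for any critical item rated at or below the threshold."""
        return [f"Please elaborate on the low score for '{item}'."
                for item, score in ratings.items()
                if item in CRITICAL_ITEMS and score <= LOW_SCORE_THRESHOLD]

    # Example: one low critical item triggers one follow-up prompt.
    print(follow_up_items({"overall_teaching_effectiveness": 2,
                           "teaching_enthusiasm": 4,
                           "treats_learner_with_respect": 5}))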

Formative uses: Most important use; as part of mentored review of progress and to guide individualized learning plans
Summative uses: Usually not, unless low scores contribute to a pattern of difficulty in one or more competency areas
Program benchmarking: Yes, as the percentage of residents and faculty achieving a criterion goal or standard for direct teaching performance
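
As a minimal sketch of the benchmarking calculation, assuming a hypothetical criterion of a mean overall teaching effectiveness score of 4.0 or higher per teacher:

    # Illustrative sketch: percentage of teachers meeting a criterion standard.
    # The criterion (mean overall effectiveness >= 4.0) is an assumed example.
    def percent_meeting_criterion(mean_scores, criterion=4.0):
        # mean_scores: one mean overall-effectiveness score per resident or faculty member
        if not mean_scores:
            return 0.0
        met = sum(1 for score in mean_scores if score >= criterion)
        return 100.0 * met / len(mean_scores)

    print(percent_meeting_criterion([4.5, 3.8, 4.2, 4.9]))  # -> 75.0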


Optional Items

The GME Evaluation TF recommends that all GME programs use the SOM short form as a core item set to facilitate benchmarking for individual programs and the School. Review of the other evaluation tools revealed potentially useful items that programs may choose to add to the basic form. We have included these as “Item Bank Recommendations.” Remember that, ultimately, what matters most is overall teaching effectiveness and the comments.

Item Bank for Optional Use:
    1. During this time I personally interacted with or observed the resident and base this evaluation on (very concrete item based on hours of contact)
    2. Refers learner to pertinent references for further reading
    3. Reviews exam findings with learner
    4. Discusses differential diagnosis and work-up with learner
    5. Reviews treatment options with learner
    6. Provides follow-up to learners on interesting cases
    7. Takes time to stress proper surgical technique
    8. Discusses rationale for surgical judgment
    9. Please rate how well this resident emphasized problem solving (i.e. thought process leading to decisions)

Other Notes:
  • Neurological Surgery categorized each of its questions by ACGME competency; this made them easy to track later.

  • The IM Cardiology UC Consult form was nicely tailored to the specialty and type of education.

  • The LEAH Fellowship form was appropriately brief, although we would recommend a five-point scale and spelling out teaching effectiveness.

Specific Assessment of Lecturing Skills
Programs may choose to assess the development of residents’ didactic teaching skills, either alone or in conjunction with assessment of critical appraisal skills (e.g., after a lecture reviewing the evidence on a specific clinical question). The Teaching Observation Form developed by the Academy of Medical Educators is an excellent source of tailored, structured feedback from a trained peer or faculty evaluator. The resident giving the lecture would meet in advance with the trained observer to prioritize components of the lecture for feedback. Following the lecture, structured feedback would be shared, including plans for future improvement.

Quantitative measures of lecture quality also exist. An example is provided that asks students to rate 8 attributes of the instructor on a 5-point scale from strongly disagree to strongly agree. This sort of measure may be more appropriate for following the development of didactic teaching skills than the Clinical Educator Teaching Evaluation Form, which addresses global teaching performance in the clinical context.


E*Value Forms
(pending)


