APHA Scientific Session and Event Listing
Jing Tian, MD1, Nancy L. Atkinson, PhD1, Barry Portnoy, PhD2, and Robert S. Gold, PhD, DrPH1. (1) Department of Public and Community Health, Public Health Informatics Research Laboratory, University of Maryland, College Park, Suite 2387 Valley Drive, College Park, MD 20742, 301-405-9626, tianjing@mail.umd.edu, (2) Office of Disease Prevention, Office of the Director, National Institutes of Health, Senior Advisor for Disease Prevention, 6100 Executive Boulevard, Bethesda, MD 20892
The objective of continuing medical education (CME) is to help physicians keep abreast of advances in patient care, adopt more beneficial care, and discontinue less beneficial diagnostic and therapeutic strategies. Physicians report spending a considerable amount of time in CME activities to maintain their medical licenses. Evaluation of the effects of CME has been rare and has most often assessed physicians' satisfaction with the lectures and, occasionally, changes in physicians' medical knowledge and attitudes. Changes in physician clinical practice and in patient health outcomes have been evaluated less often. Previous literature reviews of CME evaluations have focused on randomized controlled trials (RCTs) only. These have suggested that interactive CME sessions that enhance participant activity and provide the opportunity to practice skills can improve professional practice and, on occasion, health care outcomes. Further study is needed to take advantage of the full range of CME evaluation studies. The goal of this review was to examine the effectiveness of current CME tools and techniques in changing physician clinical practices and improving patient health outcomes across RCTs, quasi-experimental trials, and nonequivalent pre-/post-test studies.
Data sources were identified through a search of MEDLINE from January 2000 to January 2006 for English-language studies. Studies were included in the analysis if they were randomized controlled trials or well-designed quasi-experimental trials. Nonequivalent pre-/post-test studies were also included to investigate the general trend of CME program effects. Formal didactic, interactive, and Internet-based CME programs were all included in the review. Programs in which at least 50 percent of the participants were practicing physicians were selected for the review. The review compared study designs, intervention techniques, established measurement instruments, evaluation strategies, changes in physician practice, and changes in health care outcomes across the different CME projects. Effect sizes were calculated for studies in which sufficient data were available. A discussion of variables affecting the impact of CME projects will be included. Suggestions for designing and evaluating effective CME projects that aim to change physician clinical practices and improve patient health outcomes will be provided. Established evaluation instruments and scales will be mentioned briefly.
Learning Objectives: At the conclusion of the session, the participant will be able to
Keywords: Professional Training, Evaluation
Presenting author's disclosure statement:
Not Answered
The 134th Annual Meeting & Exposition (November 4-8, 2006) of APHA