Background: Evaluations of training programs are often limited, with many focusing on aspects that are easy to measure (e.g., trainee reactions) without addressing the important outcomes of training, such as how trainees applied their new knowledge, skills, and attitudes. Many evaluations fail to measure training's effect on job performance because few effective methods are available to do so. Evaluating multisite training programs that vary considerably in structure and implementation from one site to another is particularly difficult.
Research Design: We devised a consensus expert review method to evaluate the quality of conference abstracts submitted by participants in Field Epidemiology Training Programs, an approach that can provide useful information on how well trainees apply the knowledge and skills gained in training, complementing data obtained from other sources and methods. The method is practical, minimally intrusive, and resource-efficient, and it may prove useful for evaluation practice in diverse fields that require training.
Data Collection and Analysis: NA
Copyright and Permissions
Authors retain full copyright for articles published in JMDE. JMDE publishes under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0). Users are allowed to copy, distribute, and transmit the work in any medium or format for noncommercial purposes, provided that the original authors and source are credited accurately and appropriately. Only the original authors may distribute the article for commercial or compensatory purposes. To view a copy of this license, visit creativecommons.org.