Background: In multi-case study program evaluations, the large volume of qualitative data generated from interviews can be difficult to use, particularly when evaluators must infer why some cases succeed and others fail.
Purpose: This paper presents a method for comparing multiple evaluation sites that uses a rubric to define ratings for relevant factors and an Ishikawa fishbone diagram to model the relationships among those factors. We show how this technique identified reasons for differences in outcomes among the sites.
Setting: The evaluation setting was a large-scale safety innovation in the U.S. railroad industry. Four cases were considered—two passenger railroads and two freight railroads.
Intervention: The Confidential Close Call Reporting System (C3RS) program allowed railroad workers to confidentially submit “close calls,” which were reviewed by a team made up of labor, management, and the Federal Railroad Administration to determine ways to improve safety.
Research design: Multiple comparative case study, Ishikawa root and contributing cause modeling, evaluative rubric scoring, and data visualization techniques.
Data collection & analysis: Interview data were collected from four pilot railroad sites, each of which participated in a five-year test of C3RS. Testing periods overlapped, with the entire evaluation lasting about 12 years.
Findings: Using Ishikawa fishbone diagrams with ratings from an evaluative rubric proved effective for summarizing, analyzing, and presenting large quantities of qualitative data. The approach succeeded in explaining degrees of success and failure across the sites. The sponsor and industry stakeholders were able to understand the analysis and the findings, and to develop deep insight into how to promote successful implementation.
Keywords: Multiple comparative case studies; qualitative methodology; qualitative coding; data visualization; fishbone diagrams; Ishikawa diagrams; evaluative rubrics; close calls; near miss.
Copyright 2016 Journal of MultiDisciplinary Evaluation, Western Michigan University.