A Proposed Remedy for Grievances about Self-Report Methodologies

Phil Winne

Abstract

This special issue’s editors invited discussion of three broad questions. Slightly rephrased, they are: How well do self-report data represent theoretical constructs? How should analyses of data be conditioned by properties of self-report data? In what ways do interpretations of self-report data shape interpretations of a study’s findings? To approach these issues, I first recap the kinds of self-report data gathered by researchers reporting in this special issue. With that background, I take up a fundamental question: What are self-report data? I foreshadow later critical analysis by listing facets I observe in operational definitions of self-report data: nature of the datum, topic, property, setting or context, response scale, and assumptions that set a stage for analyzing data. Discussion of these issues leads to a proposal that ameliorates some of them: Help respondents become better at self-reporting.

Article Details

How to Cite
Winne, P. (2020). A Proposed Remedy for Grievances about Self-Report Methodologies. Frontline Learning Research, 8(3), 164–173. https://doi.org/10.14786/flr.v8i3.625

References

Berger, J.-L., & Karabenick, S. A. (2016). Construct validity of self-reported metacognitive learning strategies. Educational Assessment, 21(1), 19–33. https://doi.org/10.1080/10627197.2015.1127751

Chauliac, M., Catrysse, L., Gijbels, D., & Donche, V. (2020). It is all in the surv-eye: Can eye tracking data shed light on the internal consistency in self-report questionnaires on cognitive processing strategies? Frontline Learning Research, 8(3), 26–39. https://doi.org/10.14786/flr.v8i3.489

Cronbach, L. J., Gleser, G. C., Nanda, H., & Rajaratnam, N. (1972). The dependability of behavioral measurements. Wiley.

Durik, A. M., & Jenkins, J. S. (2020). Variability in certainty of self-reported interest: Implications for theory and research. Frontline Learning Research, 8(3), 85–103. https://doi.org/10.14786/flr.v8i3.491

Ericsson, K. A., & Simon, H. A. (1993). Protocol analysis: Verbal reports as data (Rev. ed.). MIT Press.

Fox, M. C., Ericsson, K. A., & Best, R. (2011). Do procedures for verbal reporting of thinking have to be reactive? A meta-analysis and recommendations for best reporting methods. Psychological Bulletin, 137, 316–344. https://doi.org/10.1037/a0021663

Fryer, L. K., & Nakao, K. (2020). The future of survey self-report: An experiment contrasting Likert, VAS, slide, and swipe touch interfaces. Frontline Learning Research, 8(3), 10–25. https://doi.org/10.14786/flr.v8i3.501

Gitelman, L. (Ed.). (2013). “Raw data” is an oxymoron. MIT Press. https://doi.org/10.7551/mitpress/9302.001.0001

Van Halem, N., van Klaveren, C., Drachsler, H., Schmitz, M., & Cornelisz, I. (2020). Tracking patterns in self-regulated learning using students’ self-reports and online trace data. Frontline Learning Research, 8(3), 140–163. https://doi.org/10.14786/flr.v8i3.497

Iaconelli, R., & Wolters, C. A. (2020). Insufficient effort responding in surveys assessing self-regulated learning: Nuisance or fatal flaw? Frontline Learning Research, 8(3), 104–125. https://doi.org/10.14786/flr.v8i3.521

Karabenick, S. A., Woolley, M. E., Friedel, J. M., Ammon, B. V., Blazevski, J., Bonney, C. R., De Groot, E., Gilbert, M. C., Musu, L., Kempler, T. M., & Kelly, K. L. (2007). Cognitive processing of self-report items in educational research: Do they think what we mean? Educational Psychologist, 42, 139–151. https://doi.org/10.1080/00461520701416231

Liddell, T. M., & Kruschke, J. K. (2018). Analyzing ordinal data with metric models: What could possibly go wrong? Journal of Experimental Social Psychology, 79, 328–348. https://doi.org/10.1016/j.jesp.2018.08.009

Moeller, J., Viljaranta, J., Kracke, B., & Dietrich, J. (2020). Disentangling objective characteristics of learning situations from subjective perceptions thereof, using an experience sampling method design. Frontline Learning Research, 8(3), 63–84. https://doi.org/10.14786/flr.v8i3.529

Robinson, D. H., Levin, J. R., Thomas, G. D., Pituch, K. A., & Vaughn, S. (2007). The incidence of “causal” statements in teaching-and-learning research journals. American Educational Research Journal, 44(2), 400–413. https://doi.org/10.3102/0002831207302174

Rogiers, A., Merchie, E., & Van Keer, H. (2020). Opening the black box of students’ text-learning processes: A process mining perspective. Frontline Learning Research, 8(3), 40–62. https://doi.org/10.14786/flr.v8i3.527

Veenman, M. V. J., & van Cleef, D. (2018). Measuring metacognitive skills for mathematics: Students’ self-reports versus on-line assessment methods. ZDM, 51(4), 691–701. https://doi.org/10.1007/s11858-018-1006-5

Vriesema, C. C., & McCaslin, M. (2020). Experience and meaning in small-group contexts: Fusing observational and self-report data to capture self and other dynamics. Frontline Learning Research, 8(3), 126–139. https://doi.org/10.14786/flr.v8i3.493

Winne, P. H. (1983). Distortions of construct validity in multiple regression analysis. Canadian Journal of Behavioural Science, 15, 187–202. https://doi.org/10.1037/h0080736

Winne, P. H. (2018). Paradigmatic issues in state-of-the-art research using process data. Frontline Learning Research, 6, 250–258. https://doi.org/10.14786/flr.v6i3.551

Winne, P. H., & Belfry, M. J. (1982). Interpretive problems when correcting for attenuation. Journal of Educational Measurement, 19, 125–134. https://www.jstor.org/stable/1434905

Winne, P. H., & Jamieson-Noel, D. L. (2002). Exploring students’ calibration of self-reports about study tactics and achievement. Contemporary Educational Psychology, 27, 551–572. https://doi.org/10.1016/S0361-476X(02)00006-1