Advances in Design-Based Research


Vanessa Svihla

University of New Mexico, USA


Article received 26 May 2014 / revised 1 November 2014 / accepted 11 November 2014 / available online 23 December 2014

Abstract

Design-based research (DBR) is a core methodology of the Learning Sciences. Rooted historically in a movement away from the methods of experimental psychology, it is a means to develop “humble” theory that takes numerous contextual effects into account in order to understand how and why a design supported learning. DBR involves iterative refinement of both designs for learning and theory; this process is illustrated with a retrospective analysis of six DBR cycles. Calls for educational research to parallel medical research have led learning scientists to strive for more specific standards about what constitutes DBR and what makes it desirable, especially regarding robustness and rigor. A recent trend in DBR involves efforts to extend its reach through scalability. These developments potentially endanger the designerly nature of DBR by orienting focus toward generalizability, meaning researchers must be vigilant in their pursuit of understanding how and why learning occurs in complex contexts.

Keywords: Design-Based Research; Learning Sciences; Research Methods

Corresponding author: Vanessa Svihla, Organization, Information & Learning Sciences, University of New Mexico, Albuquerque NM 87131, USA. Email: vsvihla@unm.edu

Doi: http://dx.doi.org/10.14786/flr.v2i4.114

1. Overview of Design-Based Research

Design-Based Research (DBR) is a core methodology of the learning sciences. Begun as a movement away from experimental psychology, DBR was proposed as a means to study learning amidst the “blooming, buzzing confusion” of classrooms (Brown, 1992, p. 141). It is a way to develop theory that takes numerous contextual effects into account in order to understand how and why a design supports learning; these theories are “humble in that they target domain-specific learning processes” (Cobb, Confrey, diSessa, Lehrer, & Schauble, 2003, p. 9). DBR involves iterative refinement of both designs for learning and theory (Brown, 1992; Collins, 1992; The Design-Based Research Collective, 2003).

This paper outlines methodological standards for conducting DBR, illustrates them with an example, and describes recent advances.

2. Methodological Standards for Conducting Design-Based Research

DBR is sometimes conflated with mixed methods or action research; this, paired with calls for educational research to parallel medical research, has led learning scientists to strive for more specific standards about what constitutes DBR and what makes it desirable, especially regarding robustness and rigor. This section details current methodological standards for conducting DBR.

2.1. A collaborative effort conducted in context

DBR is typically conducted by a team of researchers, designers, and practitioners, with intensive planning and debriefing sessions throughout the process. Rather than being a wholly researcher-driven process, DBR generally gives practitioners greater ownership of the process. Working collaboratively, the team identifies a practical problem (Reeves, 2006) that is then investigated through literature review, learning theory, and question posing. The intervention instantiates this learning theory in the design.

Because learning is understood to be a process, and because DBR seeks understanding of how learning occurs, process data, such as video records and artifacts of student work, are prioritized in DBR. This approach allows researchers to be opportunistic when something surprising or emergent occurs. The notion that emergence plays a central role in DBR is a shift away from its more positivistic origins, in which variables are well known a priori (Collins, 1992).

DBR allows for intervention while still valuing the importance of social interaction rather than social isolation (Collins, Joseph, & Bielaczyc, 2004). This resonates with the basic belief among learning scientists that learning is a fundamentally social, interactional process. By occurring in classrooms rather than in laboratories, DBR also allows for testing of designs and theory that address “the complexity that is a hallmark of educational settings” (Cobb et al., 2003, p. 9). The challenge is to apply lessons learned in context to a broader range of settings (Barab & Squire, 2004). Because DBR may not be replicated in the classical sense, given its strong ties to context, it is critical to share the design along with thick description (Barab & Squire, 2004).

2.2. Iterative cycles refine the design and the theory

Because of the contextual nature of DBR, some view DBR as a means to generate, but not validate, conjectures about learning (Sandoval, 2004); however, because such conjectures are made visible in designs for learning, they become testable through iterative refinement. Simply conducting one study in the field does not qualify as DBR, although it may be reported as one cycle in a longer DBR effort. Iterative refinement across contexts allows conjectures to become robust (diSessa & Cobb, 2004) by placing theory “in harm’s way” (Cobb et al., 2003, p. 10).

The development of Interactive Learning Assessments (ILAs) illustrates the iterative refinement process (McKay, Cantarero, Svihla, Yakes Jimenez, & Castillo, 2014; Phillips et al., 2009; Svihla et al., 2010; Svihla, Phillips, et al., 2009; Svihla, Vye, et al., 2009; Svihla et al., 2013; Yakes et al., 2013). ILAs place the learner in an authentic, professional role giving advice to virtual clients.

ILAs were first developed in response to a call for high school biology assessments that did not pause learning, but instead assessed students as they learned; more specifically, we aimed to assess how students used resources to solve problems that were new to them. Because this call came from an organization interested in using our designs for all schools in one state, we faced early challenges; our design decisions were driven by the need for scalability. This led us to seek school partners to test our designs, but meant that we neglected some of the contextual influences that are typical of DBR. Initially, we did not involve instructors in the design process extensively, but we did debrief with them to inform redesign. We partnered with subject matter experts (e.g., a genetic counselor or dietitian) who helped ensure the problems reflected authentic professional practices, as this was central to our humble theory.

Our designs for and theory of learning evolved through six iterations (Figure 1 & Table 1). We initially provided authentic, real-world problems posed by virtual clients and access to resources as a way to support students to solve complex problems. Students took on the role of interns and gave counsel to virtual (avatar) clients. Our first design succeeded in supporting learning, but was too open-ended to be a useful assessment at scale. Beginning with iteration 2, we designed more specified sequences of questions and provided feedback from a virtual supervisor. We found the ILAs supported learning and provided useful data for assessment, but the student experience was too linear and scaffolded.

We moved to a new setting—a university nutrition program seeking to provide students with ways to learn about professional practices prior to internships as a means to recruit and retain diverse students (Svihla et al., 2013). With this different motivation driving our work, we sought to bring instructors more centrally into the role of designers of cases. To offset the linear feel of the cases, we sought to support greater agency, providing opportunities to make choices among story-like branches. Instructors found it cumbersome to design such cases. Rather than distancing the instructors from the design process, we changed how we instantiated agency in the design, creating short story-like loops; students could explore as many or as few of the loops as they liked. In these versions, students learned content and professional practices, and they enjoyed the opportunity to explore further according to their level of interest.

Figure 1. Refinement of the humble theory of learning instantiated in Interactive Learning Assessments

Retrospective analysis of DBR cycles provides an opportunity to “see order, pattern, and regularity” in messy, complex settings (diSessa & Cobb, 2004, p. 84) and supports the development of “useful, generalizable theories” (Edelson, 2002, p. 112). This analysis includes considering the conditions for success (Dede, 2004) and highlights the need to report failures (O'Neill, 2012).

Retrospective analysis of the six iterations – across varied contexts (rural, urban; high school, university; biology, nutrition) – highlights areas where our theory is robust: students consistently learned by taking on real roles and solving challenges posed by virtual clients. This hinged on our ability to place students in roles they could understand; when the role was further from their experience, the addition of vignettes of the virtual supervisor explaining the role bridged this gap. The distance between student experience and professional role also affected feedback given to students. For high school students, it was hard to design feedback that did not seem schoolish, lowering the authenticity. In contrast, the university students found the opportunity to see an expert answer and compare it to their own answers to be an authentic learning activity.

Table 1. Design-based iterations in the development of Interactive Learning Assessments

Iteration 1. Participants and setting: Biology students (n=34) at a rural southern US high school. Role and problem: As a conservation geneticist, the student advises developers on conservation of two bird species. Implementation: One case completed as a think-aloud task with a researcher. Main findings: Students learned about genetics and saw what they were learning as relevant. Sample design decisions: Increase scaffolding; create diagnostic yet authentic multiple-choice questions.

Iteration 2. Participants and setting: Biology students (n=24) at a rural southern US high school. Role and problem: As an intern genetic counselor, the student counsels a couple worried about the potential for having a baby with sickle cell disease. Implementation: One case completed as a think-aloud task with a researcher. Main findings: Students did not understand what an internship was, but did learn genetics content from the case. Sample design decisions: Provide explicit guidance about the internship.

Iteration 3. Participants and setting: Biology students (n=48) at an affluent west coast US suburban high school. Role and problem: As an intern genetic counselor, the student counsels a couple worried about the potential for having a baby with sickle cell disease. Implementation: One case completed in a class session. Main findings: Students who moved quickly from reading the problem to searching for information struggled; the teacher was unsure how and when to use the case. Sample design decisions: Add a generate-ideas step and reflective prompts; make the case less linear; add teacher-as-designer.

Iteration 4. Participants and setting: Undergraduate nutrition students (n=15) at a southwestern US research university. Role and problem: As an intern dietitian, the student counsels a family about the nutrition needs of a child with Down syndrome. Implementation: One case completed as an online assignment. Main findings: Students learned and retained content; designing branches was burdensome for the instructor. Sample design decisions: Replace branches with loops.

Iteration 5. Participants and setting: Graduate nutrition students (n=14) at a southwestern US research university. Role and problem: As an intern dietitian, the student counsels a pregnant woman about gestational diabetes. Implementation: One case completed as an online assignment, plus one in-class discussion session. Main findings: Students learned and retained content; the instructor could design and teach with the case. Sample design decisions: Create more cases.

Iteration 6. Participants and setting: Undergraduate nutrition students (n=25) at a southwestern US research university. Role and problem: As an intern dietitian, the student counsels a range of clients on various nutrition topics. Implementation: Seven cases completed in place of class meetings, plus seven in-class discussion sessions. Main findings: Students learned and retained content; the instructor developed more student-centered practice. Sample design decisions: Investigate ways to make the branching design feasible.

3. Extensibility of DBR: Design-Based Implementation Research (DBIR)

In the earlier example of ILAs, the initial goal was to help bring about statewide systemic change by providing a new way to embed assessment within learning. This driver necessitated changes to traditional DBR. When we changed settings, we also changed the role of the instructors from informants and consumers to designers of cases; this shift reflected our goal to help bring about smaller scale yet systematic change within a university program. In the first set of high school iterations, instructors were uncertain about how to use the cases. In the first iterations in the university setting in which the cases were designed by instructors, the cases were treated as homework, supplemental to in-class lectures. In the most recent iteration, the same instructors replaced lectures with the cases and further supplemented them with discussion (McKay et al., 2014). What we first viewed as a better assessment tool evolved into a tool for instructors to test their ideas about learning, resulting in more learner-centered teaching.

3.1. Design-Based Implementation Research

Recently, others have similarly sought ways to expand the reach of DBR, for instance through “implementation paths” that could lead the way to scaling a design (Bielaczyc, 2013), through learning theory that can be adapted to new contexts (Barab & Squire, 2004), and through Design-Based Implementation Research (DBIR; Fishman, Penuel, Allen, Cheng, & Sabelli, 2013; Penuel & Fishman, 2012). DBIR includes “(a) a focus on persistent problems of practice from multiple stakeholders’ perspectives; (b) a commitment to iterative, collaborative design; (c) a concern with developing theory related to both classroom learning and implementation through systematic inquiry; and (d) a concern with developing capacity for sustaining change in systems” (Penuel, Fishman, Haugan Cheng, & Sabelli, 2011, p. 331).

In one example of DBIR, researchers partnered with four districts to develop a theory of action around improving mathematics instruction (Cobb, Jackson, Smith, Sorum, & Henrick, 2013); the partnership lasted four years, through cycles of data collection and analysis focused on the strategies as implemented. In each cycle, they documented the intended strategies, recorded how the strategies were actually enacted, and made recommendations based on analysis. To support and maintain the relationship between researchers and practitioners, the team used two means of data collection and analysis: first, they prioritized providing usable evidence for the districts to evaluate the impact of their policies; second, they iteratively tested their theory of action in order to refine it. In addition to being guided by and refining a theory of action, they created an interpretive framework, a tool used to evaluate and guide design decisions before, during, and after implementation. Following the four cycles of implementation, they began retrospective analysis to further test and refine their theory of action.

This example highlights many parallels with DBR, including collaborative and contextual work focused on refining design and theory through iterative refinement and retrospective analysis. It also highlights the different scale at which DBIR is conducted, involving many districts, schools, and classrooms, and a focus on creating sustainable change. Working at this scale makes the research more readily generalizable; by testing conjectures across four districts, the team was able to learn which strategies were effective across districts under specific conditions. Because the target of their design was tied to how districts could support improved mathematics instruction, they were also able to identify ineffective strategies and replace them. For instance, school leaders had been receiving content-independent professional development to guide their feedback to mathematics teachers; the process uncovered that these leaders were not able to distinguish between high and low quality enactments of the mathematics. By recommending that school leaders instead receive content-based professional development, the team was able to design a sustainable, lasting change.

DBIR researchers emphasize the practical nature of their work, from problem to design to theory (Dolle, Gomez, Russell, & Bryk, 2013). This approach takes a broader view of the context and attends to usability by jointly considering how to change larger entities or systems (e.g., school districts) and how to support their ability to sustainably adapt designs (Penuel & Fishman, 2012). DBIR has only begun to be taken up; it brings a focus on scalability and sustainability while respecting teachers and avoiding attempts to “teacher-proof” the materials of reform, for instance through productive adaptation.

3.2. Productive Adaptation

One approach within DBIR centers on teachers’ productive adaptations of curricula: staying faithful to the original intent of the design and reproducing “invariant principles” across sites while being responsive to local contexts (Kirshner & Polman, 2013). In particular, focusing on maintaining or increasing – rather than reducing – the complexity and students’ engagement can support productive adaptations (DeBarger, Choppin, Beauvineau, & Moorthy, 2013).

Dialogic interactions between teachers and researchers can support productive adaptations (Kirshner & Polman, 2013), but deliberate support – and spaces – are needed to ensure these interactions are frequent enough and sustained (Donovan, Snow, & Daro, 2013). Relatedly, it is also important to attend to power relationships and ownership of problems of practice; researchers bring different cultural norms and may have status not afforded to practitioners. Deliberately framing the work as a cultural exchange, in which researchers and practitioners trade ideas, can mitigate these challenges (Penuel, Coburn, & Gallagher, 2013).

In some cases, district support for a particular program, professional development, or curriculum may be taken as another in a sequence of top-down mandates, and therefore meet with resistance at school sites (Borko & Klingner, 2013). This highlights the importance of attending to influences across levels of the system in which the research is occurring. Because of this systems approach, not all DBIR occurs within schools or formal settings; though less common, DBIR has been conducted in communities as a means to identify issues that might prevent youth from being successful and to address them in creative, cross-institutional ways (McLaughlin & London, 2013). Such approaches are important because DBR has been critiqued for not sufficiently attending to equity and social justice (Confrey, 2005), though some work has sought this out (e.g., Barab, Dodge, Thomas, Jackson, & Tuzun, 2007).

4. Are DBR and DBIR Designerly?

Although design-based, not all DBR and DBIR appear to be designerly (Cross, 2001), in the sense of explicitly applying a design process: seeking out needs, optimizing the design, and evaluating the solution in light of identified needs (Edelson, 2002). Because the targets of DBR are designs for learning and theories of learning, potential needs may be found both in reviews of research and in the world. Needs are sometimes left implicit and the design process left to the reader’s imagination (e.g., “the tool was designed to scaffold learning of argumentation”). Aiming at scalability can strip the contextualist, designerly aspects from DBR, but committing to usability, and therefore to a focus on context, can mitigate this. DBIR focuses on design at scale, which would suggest a less designerly approach; yet the emphasis on working in partnership with practitioners to support sustained change has helped focus DBIR research on worldly needs.

As these methods continue to evolve and incorporate bigger systems and big data, there are many opportunities for looking across streams of related data, such as logfiles and videos. These offer ways to evaluate the influence and refinement of designs for learning and of learning theories that are contextual and adaptive to the systems in which they reside.

4.1. Credibility of Design-Based Research

Concerns have been raised about the credibility of educational research in general (Levin & O'Donnell, 1999; National Research Council, 2002), with critics urging researchers to employ methodologies influenced by medical research. In such approaches, tests of efficacy (whether the treatment works under optimal conditions) and effectiveness (whether the treatment works under real-world conditions) “are often conflated” (Sloane, 2008, p. 625). Influenced by this, discussions about DBR have focused on robustness, rigor, and validity, grounded in experimental perspectives, an odd choice given the contextual, qualitative work commonly done in DBR. However, trustworthiness and credibility – as applied in qualitative methods – have also been considered (Barab & Squire, 2004), resulting in other ways to evaluate DBR. Methodological alignment means that the “research methods we use actually test what we think they are testing” (Hoadley, 2004, p. 203). Edelson holds that DBR should not be evaluated by the same standards as traditional approaches because the goals differ; instead, the “novelty and usefulness” of the theory developed should be the criteria (2002, p. 118).

4.2. New types of data

With the increasing popularity of big data and the now relatively common use of technology-enhanced learning, some have included these new types of data in DBR studies. For instance, some researchers have recently used complex statistical modeling in place of traditional qualitative approaches (Markauskaite, 2010; Markauskaite & Reimann, 2008), arguing that this avoids selection and confirmation bias.

Others remain skeptical about finding usable evidence of learning in big data, citing examples of contextual, interactional “in-room” events that are not logged automatically (Stevens, 2013); such events may explain the successes and failures of designs in important ways. For example, a long period of activity in a logfile might reflect a range of activities: a student spending a long time diligently reading the screen; a student absent from the activity, wandering the class out of boredom; or a teacher interaction in response to a reflective question from the student. These possibilities tell us very different things about how the design is or is not supporting learning, and averaging across them provides little useful design information.
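
To make this limitation concrete, the minimal sketch below computes per-step dwell times from a simple, hypothetical logfile format (the step names, timestamps, and event structure are illustrative, not drawn from any actual system). The computation itself is trivial; the point is that the two students' nearly identical dwell times are consistent with any of the in-room activities described above.

```python
# A minimal sketch under assumed conventions: compute per-step dwell time
# from timestamped navigation events in a hypothetical logfile. Nearly
# identical dwell times cannot distinguish a student reading diligently,
# a student away from the screen, or a teacher conversation prompted by a
# reflective question; none of these in-room events are logged.
from collections import defaultdict
from datetime import datetime

# Hypothetical log entries: (student_id, step_id, timestamp of arriving at the step)
log = [
    ("s01", "intro",     "2013-03-04 09:00:05"),
    ("s01", "visualize", "2013-03-04 09:03:10"),
    ("s01", "reflect",   "2013-03-04 09:18:40"),  # ~15 min logged on "visualize"
    ("s02", "intro",     "2013-03-04 09:00:12"),
    ("s02", "visualize", "2013-03-04 09:02:50"),
    ("s02", "reflect",   "2013-03-04 09:18:30"),  # similar dwell time, unknown activity
]

def dwell_times(entries):
    """Return {student: {step: seconds}}, using the gap to the next event.

    The last step a student visits has no following event, so it gets no dwell time.
    """
    per_student = defaultdict(list)
    for student, step, ts in entries:
        per_student[student].append((step, datetime.fromisoformat(ts)))
    result = defaultdict(dict)
    for student, visits in per_student.items():
        visits.sort(key=lambda v: v[1])
        for (step, start), (_, end) in zip(visits, visits[1:]):
            result[student][step] = (end - start).total_seconds()
    return dict(result)

print(dwell_times(log))
```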

To deal with this issue, others rely on a combination of video and logfiles. For instance, researchers first analyzed classroom and video data to redesign a feedback feature that students rarely used (Segedy, Kinnebrew, & Biswas, 2012). They then analyzed data from students’ interactions with the technology using hidden Markov modeling to evaluate the impact of their design decisions, leading to further refinement of both the design and the theory guiding their work. Similarly, in our research, we have leveraged data from logfiles, field notes, student performance, and videos of implementations to test and inform design decisions (Svihla & Linn, 2012a, 2012b). For instance, based on review of video, logfiles, and student performance, we chose to add a step to an instructional unit to support students in interpreting interactive visualizations, but we feared students might use a guess-and-check approach as a result. By examining logfiles, we found that most students revisited an earlier step seeking information, rather than guessing. This led us to examine logfiles more closely for particular patterns of activity, such as revisiting steps from earlier activities. Though the primary theory guiding that work was well developed, its instantiation in the particular context and for the particular curricular goals was not, resulting in a much more humble, localized version that incorporated new ideas about how students revisit prior curricula to support their learning.
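
The sketch below illustrates the kind of logfile pattern analysis described, under assumed, hypothetical conventions (the event types, step names, and ordering are invented for the example). It distinguishes students who navigate back to an earlier step, which we would read as information seeking, from students who repeatedly resubmit answers on the same step without navigating away, a guess-and-check signature.

```python
# A minimal sketch under assumed conventions: flag navigation back to earlier
# steps (read as information seeking) versus repeated answer submissions on
# the same step without navigating away (a guess-and-check signature).
from collections import defaultdict

# Hypothetical events: (student_id, event_type, step_id), in chronological order.
events = [
    ("s01", "visit", "step1"), ("s01", "visit", "step4"),
    ("s01", "visit", "step1"),   # returns to an earlier step: information seeking
    ("s01", "visit", "step4"), ("s01", "submit", "step4"),
    ("s02", "visit", "step1"), ("s02", "visit", "step4"),
    ("s02", "submit", "step4"), ("s02", "submit", "step4"),
    ("s02", "submit", "step4"),  # repeated submissions: guess-and-check signature
]

def classify(events, step_order=("step1", "step2", "step3", "step4")):
    """Count revisits to earlier steps and repeated submissions per student."""
    rank = {step: i for i, step in enumerate(step_order)}
    per_student = defaultdict(list)
    for student, kind, step in events:
        per_student[student].append((kind, step))
    summary = {}
    for student, sequence in per_student.items():
        revisits = 0
        repeat_submits = 0
        max_rank_seen = -1
        previous = None
        for kind, step in sequence:
            if kind == "visit":
                if rank[step] < max_rank_seen:
                    revisits += 1        # navigated back to an earlier step
                max_rank_seen = max(max_rank_seen, rank[step])
            elif kind == "submit" and previous == ("submit", step):
                repeat_submits += 1      # resubmitted without navigating away
            previous = (kind, step)
        summary[student] = {"revisits": revisits, "repeat_submits": repeat_submits}
    return summary

print(classify(events))
```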


Acknowledgments

The author would like to acknowledge support from the USDA/NIFA Hispanic-Serving Institutions (HSI) Education Grants Program (#2012-38422-19836).

References

Barab, S. A., Dodge, T., Thomas, M. K., Jackson, C., & Tuzun, H. (2007). Our Designs and the Social Agendas They Carry. The Journal of the Learning Sciences, 16(2), 263-305. doi: 10.1080/10508400701193713

Barab, S. A., & Squire, K. (2004). Design-Based Research: Putting a Stake in the Ground. Journal of the Learning Sciences, 13(1), 1-14. doi: 10.1207/s15327809jls1301_1

Bielaczyc, K. (2013). Informing design research: Learning from teachers' designs of social infrastructure. Journal of the Learning Sciences, 22(2), 258-311. doi: 10.1080/10508406.2012.691925

Borko, H., & Klingner, J. (2013). Supporting teachers in schools to improve their instructional practice. National Society for the Study of Education Yearbook, 112(2), 274-297.

Brown, A. L. (1992). Design Experiments: Theoretical and Methodological Challenges in Creating Complex Interventions in Classroom Settings. The Journal of the Learning Sciences, 2(2), 141-178. doi: 10.1207/s15327809jls0202_2

Cobb, P., Confrey, J., diSessa, A. A., Lehrer, R., & Schauble, L. (2003). Design Experiments in Educational Research. Educational Researcher, 32(1), 9-13. doi: 10.3102/0013189X032001009

Cobb, P., Jackson, K., Smith, T., Sorum, M., & Henrick, E. (2013). Design research with educational systems: Investigating and supporting improvements in the quality of mathematics teaching and learning at scale. National Society for the Study of Education Yearbook, 112(2).

Collins, A. (1992). Toward a Design Science of Education. In E. Scanlon & T. O’Shea (Eds.), New directions in educational technology (pp. 15-22). Berlin: Springer-Verlag.

Collins, A., Joseph, D., & Bielaczyc, K. (2004). Design Research: Theoretical and Methodological Issues. Journal of the Learning Sciences, 13(1), 15-42. doi: 10.1207/s15327809jls1301_2

Confrey, J. (2005). The evolution of design studies as methodology. The Cambridge handbook of the learning sciences, 135-151.

Cross, N. (2001). Designerly Ways of Knowing: Design Discipline Versus Design Science. Design Issues, 17(3), 49-55. doi: 10.1162/074793601750357196

DeBarger, A. H., Choppin, J., Beauvineau, Y., & Moorthy, S. (2013). Designing for productive adaptations of curriculum interventions. National Society for the Study of Education Yearbook, 112(2).

Dede, C. (2004). If Design-Based Research is the Answer, What is the Question? A Commentary on Collins, Joseph, and Bielaczyc; diSessa and Cobb; and Fishman, Marx, Blumenthal, Krajcik, and Soloway in the JLS Special Issue on Design-Based Research. Journal of the Learning Sciences, 13(1), 105-114. doi: 10.1207/s15327809jls1301_5

diSessa, A. A., & Cobb, P. (2004). Ontological Innovation and the Role of Theory in Design Experiments. Journal of the Learning Sciences, 13(1), 77-103. doi: 10.1207/s15327809jls1301_4

Dolle, J., Gomez, L. M., Russell, J., & Bryk, A. S. (2013). More than a network: Building professional communities for educational improvement. National Society for the Study of Education Yearbook, 112(2), 443-463.

Donovan, M. S., Snow, C., & Daro, P. (2013). The SERP approach to problem-solving research, development, and implementation. National Society for the Study of Education Yearbook, 112(2), 400-425.

Edelson, D. (2002). Design Research: What We Learn When We Engage in Design. Journal of the Learning Sciences, 11(1), 105-121. doi: 10.1207/S15327809JLS1101_4

Fishman, B., Penuel, W. R., Allen, A., Cheng, B. H., & Sabelli, N. (2013). Design-based implementation research: An emerging model for transforming the relationship of research and practice. National Society for the Study of Education Yearbook, 112(2), 136-156.

Hoadley, C. M. (2004). Methodological alignment in design-based research. Educational Psychologist, 39(4), 203-212. doi: 10.1207/s15326985ep3904_2

Kirshner, B., & Polman, J. L. (2013). Adaptation by design: A context-sensitive, dialogic approach to interventions. National Society for the Study of Education Yearbook, 112(2), 215-236.

Levin, J. R., & O'Donnell, A. M. (1999). What to do about educational research's credibility gaps? Issues in Education, 5(2), 177-229. doi: 10.1016/S1080-9724(00)00025-2

Markauskaite, L. (2010). Digital media, technologies and scholarship: Some shapes of eResearch in educational inquiry. The Australian Educational Researcher, 37(4), 79-101. doi: 10.1007/BF03216938

Markauskaite, L., & Reimann, P. (2008). Enhancing and scaling-up design-based research: The potential of e-research. In Proceedings of the 8th International Conference for the Learning Sciences (ICLS 2008), Volume 2.

McKay, T., Cantarero, A., Svihla, V., Yakes Jimenez, E., & Castillo, T. (2014, June 23-27). Becoming a Professional through Virtual Practice. Paper presented at the 11th International Conference of the Learning Sciences (ICLS2014), Boulder, CO.

McLaughlin, M., & London, R. A. (2013). Taking a societal sector perspective on youth learning and development. National Society for the Study of Education Yearbook, 112(2), 192-214.

National Research Council. (2002). Scientific Research in Education. Washington, DC: The National Academies Press.

O'Neill, D. K. (2012). Designs that fly: what the history of aeronautics tells us about the future of design-based research in education. International Journal of Research & Method in Education, 35(2), 119-140. doi: 10.1080/1743727X.2012.683573

Penuel, W. R., Coburn, C. E., & Gallagher, D. J. (2013). Negotiating Problems of Practice in Research–Practice Design Partnerships. National Society for the Study of Education Yearbook, 112(2), 237-255.

Penuel, W. R., & Fishman, B. J. (2012). Large-scale science education intervention research we can use. Journal of Research in Science Teaching.

Penuel, W. R., Fishman, B. J., Haugan Cheng, B., & Sabelli, N. (2011). Organizing research and development at the intersection of learning, implementation, and design. Educational Researcher, 40(7), 331-337. doi: 10.3102/0013189X11421826

Phillips, R., Gawel, D. J., Svihla, V., Brown, M., Vye, N. J., & Bransford, J. D. (2009). New technology supports for authentic science inquiry, practice, and assessment in the classroom. Paper presented at the AERA, San Diego.

Reeves, T. C. (2006). Design research from a technology perspective. Educational design research, 1(3), 52-66.

Sandoval, W. A. (2004). Developing learning theory by refining conjectures embodied in educational designs. Educational Psychologist, 39(4), 213-223. doi: 10.1207/s15326985ep3904_3

Segedy, J., Kinnebrew, J., & Biswas, G. (2012). Supporting student learning using conversational agents in a teachable agent environment. In J. van Aalst, K. Thompson, M. J. Jacobson & P. Reimann (Eds.), The future of learning: Proceedings of the 10th international conference of the learning sciences (ICLS 2012) – Volume 2, short papers, symposia, and abstracts (pp. 251-255). Sydney, Australia: ISLS.

Sloane, F. C. (2008). Randomized Trials in Mathematics Education: Recalibrating the Proposed High Watermark. Educational Researcher, 37(9), 624-630. doi: 10.3102/0013189X08328879

Stevens, R. (2013, June 12-14). Big data, interaction analysis, and everything in between. Paper presented at Games, Learning, Society 9.0, Madison, WI.

Svihla, V., Gawel, D. J., Brown, M., Moore, A., Vye, N. J., & Bransford, J. D. (2010). 21st Century Assessment: Redesigning to Optimize Learning. In K. Gomez, L. Lyons & J. Radinsky (Eds.), Learning in the Disciplines: Proceedings of the 9th International Conference of the Learning Sciences (ICLS) (Vol. 2, pp. 474-475). Chicago, IL: International Society of the Learning Sciences.

Svihla, V., & Linn, M. C. (2012a). A Design-based Approach to Fostering Understanding of Global Climate Change. International Journal of Science Education, 34(5), 651-676. doi: 10.1080/09500693.2011.597453

Svihla, V., & Linn, M. C. (2012b). Distributing practice: Challenges and opportunities for inquiry learning. In J. van Aalst, K. Thompson, M. J. Jacobson & P. Reimann (Eds.), The future of learning: Proceedings of the 10th international conference of the learning sciences (ICLS 2012) – Volume 1, Full Papers (pp. 371-378). Sydney, Australia: ISLS.

Svihla, V., Phillips, R., Gawel, D. J., Vye, N. J., Brown, M., & Bransford, J. D. (2009). A tool for 21st century learning and assessment. In A. Dimitracopoulou, C. O'Malley, D. Suthers & P. Reimann (Eds.), CSCL Practices: Proceedings of the 8th international conference on Computer supported collaborative learning (CSCL 09) (Vol. 2, pp. 46-48). Rhodes, Greece: International Society of the Learning Sciences.

Svihla, V., Vye, N. J., Brown, M., Phillips, R., Gawel, D. J., & Bransford, J. D. (2009). Interactive Learning Assessments for the 21st Century. Education Canada, 49(3), 44-47.

Svihla, V., Yakes, E., Castillo, T., Cantarero, A., Valdez, I., & Dominguez, N. (2013). Interactive Learning Assessment: Providing Context and Simulating Professional Practices. In Proceedings of Games, Learning, Society 9.

The Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5–8. doi: 10.3102/0013189X032001005

Yakes, E., Cantarero, A., McKay, T., Svihla, V., Castillo, T., Valdez, I., & Hertel, J. (2013). Interactive Learning Assessment: Simulating Professional Practices. NACTA Journal, 57(Supplement 1).