Design-Based Research: A Methodology to Extend and Enrich Biology Education Research

  • Emily E. Scott
  • Mary Pat Wenderoth
  • Jennifer H. Doherty

*Address correspondence to: Emily E. Scott ([email protected]).

Department of Biology, University of Washington, Seattle, WA 98195

Recent calls in biology education research (BER) have recommended that researchers leverage learning theories and methodologies from other disciplines to investigate the mechanisms by which students develop sophisticated ideas. We suggest design-based research from the learning sciences is a compelling methodology for achieving this aim. Design-based research investigates the “learning ecologies” that move student thinking toward mastery. These “learning ecologies” are grounded in theories of learning, produce measurable changes in student learning, generate design principles that guide the development of instructional tools, and are enacted using extended, iterative teaching experiments. In this essay, we introduce readers to the key elements of design-based research, using our own research into student learning in undergraduate physiology as an example of design-based research in BER. Then, we discuss how design-based research can extend work already done in BER and foster interdisciplinary collaborations among cognitive and learning scientists, biology education researchers, and instructors. We also explore some of the challenges associated with this methodological approach.

INTRODUCTION

There have been recent calls for biology education researchers to look toward other fields of educational inquiry for theories and methodologies to advance, and expand, our understanding of what helps students learn to think like biologists ( Coley and Tanner, 2012 ; Dolan, 2015 ; Peffer and Renken, 2016 ; Lo et al. , 2019 ). These calls include the recommendations that biology education researchers ground their work in learning theories from the cognitive and learning sciences ( Coley and Tanner, 2012 ) and begin investigating the underlying mechanisms by which students develop sophisticated biology ideas ( Dolan, 2015 ; Lo et al. , 2019 ). Design-based research from the learning sciences is one methodology that seeks to do both by using theories of learning to investigate how “learning ecologies”—that is, complex systems of interactions among instructors, students, and environmental components—support the process of student learning ( Brown, 1992 ; Cobb et al. , 2003 ; Collins et al. , 2004 ; Peffer and Renken, 2016 ).

The purpose of this essay is twofold. First, we want to introduce readers to the key elements of design-based research, using our research into student learning in undergraduate physiology as an example of design-based research in biology education research (BER). Second, we will discuss how design-based research can extend work already done in BER and explore some of the challenges of its implementation. For a more in-depth review of design-based research, we direct readers to the following references: Brown (1992) , Barab and Squire (2004) , and Collins et al. (2004) , as well as commentaries by Anderson and Shattuck (2012) and McKenney and Reeves (2013) .

WHAT IS DESIGN-BASED RESEARCH?

Design-based research is a methodological approach that aligns with research methods from the fields of engineering or applied physics, where products are designed for specific purposes ( Brown, 1992 ; Joseph, 2004 ; Middleton et al. , 2008 ; Kelly, 2014 ). Consequently, investigators using design-based research approach educational inquiry much as an engineer develops a new product: First, the researchers identify a problem that needs to be addressed (e.g., a particular learning challenge that students face). Next, they design a potential “solution” to the problem in the form of instructional tools (e.g., reasoning strategies, worksheets; e.g., Reiser et al. , 2001 ) that theory and previous research suggest will address the problem. Then, the researchers test the instructional tools in a real-world setting (i.e., the classroom) to see if the tools positively impact student learning. As testing proceeds, researchers evaluate the instructional tools with emerging evidence of their effectiveness (or lack thereof) and progressively revise the tools— in real time —as necessary ( Collins et al. , 2004 ). Finally, the researchers reflect on the outcomes of the experiment, identifying the features of the instructional tools that were successful at addressing the initial learning problem, revising those aspects that were not helpful to learning, and determining how the research informed the theory underlying the experiment. This leads to another research cycle of designing, testing, evaluating, and reflecting to refine the instructional tools in support of student learning. We have characterized this iterative process in Figure 1 after Sandoval (2014) . Though we have portrayed four discrete phases to design-based research, there is often overlap of the phases as the research progresses (e.g., testing and evaluating can occur simultaneously).

FIGURE 1. The four phases of design-based research experienced in an iterative cycle (A). We also highlight the main features of each phase of our design-based research project investigating students’ use of flux in physiology (B).

Design-based research has no specific requirements for the form that instructional tools must take or the manner in which the tools are evaluated ( Bell, 2004 ; Anderson and Shattuck, 2012 ). Instead, design-based research has what Sandoval (2014) calls “epistemic commitments” 1 that inform the major goals of a design-based research project as well as how it is implemented. These epistemic commitments are:

  1. Design-based research should be grounded in theories of learning (e.g., constructivism, knowledge-in-pieces, conceptual change) that both inform the design of the instructional tools and are improved upon by the research ( Cobb et al. , 2003 ; Barab and Squire, 2004 ). This makes design-based research more than a method for testing whether or not an instructional tool works; it also investigates why the design worked and how it can be generalized to other learning environments ( Cobb et al. , 2003 ).

  2. Design-based research should aim to produce measurable changes in student learning in classrooms around a particular learning problem ( Anderson and Shattuck, 2012 ; McKenney and Reeves, 2013 ). This requirement ensures that theoretical research into student learning is directly applicable, and impactful, to students and instructors in classroom settings ( Hoadley, 2004 ).

  3. Design-based research should generate design principles that guide the development and implementation of future instructional tools ( Edelson, 2002 ). This commitment makes the research findings broadly applicable for use in a variety of classroom environments.

  4. Design-based research should be enacted using extended, iterative teaching experiments in classrooms. By observing student learning over an extended period of time (e.g., throughout an entire term or across terms), researchers are more likely to observe the full effects of how the instructional tools impact student learning compared with short-term experiments ( Brown, 1992 ; Barab and Squire, 2004 ; Sandoval and Bell, 2004 ).

HOW IS DESIGN-BASED RESEARCH DIFFERENT FROM AN EXPERIMENTAL APPROACH?

Many BER studies employ experimental approaches that align with traditional scientific methods of experimentation, such as using treatment versus control groups, randomly assigning treatments to different groups, replicating interventions across multiple spatial or temporal periods, and using statistical methods to guide the kinds of inferences that arise from an experiment. While design-based research can similarly employ these strategies for educational inquiry, there are also some notable differences in its approach to experimentation ( Collins et al. , 2004 ; Hoadley, 2004 ). In this section, we contrast the differences between design-based research and what we call “experimental approaches,” although both paradigms represent a form of experimentation.

The first difference between an experimental approach and design-based research regards the role participants play in the experiment. In an experimental approach, the researcher is responsible for making all the decisions about how the experiment will be implemented and analyzed, while the instructor facilitates the experimental treatments. In design-based research, both researchers and instructors are engaged in all stages of the research from conception to reflection ( Collins et al. , 2004 ). In BER, a third condition frequently arises wherein the researcher is also the instructor. In this case, if the research questions being investigated produce generalizable results that have the potential to impact teaching broadly, then this is consistent with a design-based research approach ( Cobb et al. , 2003 ). However, when the research questions are self-reflective about how a researcher/instructor can improve his or her own classroom practices, this aligns more closely with “action research,” which is another methodology used in education research (see Stringer, 2013 ).

A second difference between experimental research and design-based research is the form that hypotheses take and the manner in which they are investigated ( Collins et al. , 2004 ; Sandoval, 2014 ). In experimental approaches, researchers develop a hypothesis about how a specific instructional intervention will impact student learning. The intervention is then tested in the classroom(s) while controlling for other variables that are not part of the study in order to isolate the effects of the intervention. Sometimes, researchers designate a “control” situation that serves as a comparison group that does not experience the intervention. For example, Jackson et al. (2018) were interested in comparing peer- and self-grading of weekly practice exams to determine whether they were equally effective forms of deliberate practice for students in a large-enrollment class. To test this, the authors (including authors of this essay J.H.D. and M.P.W.) designed an experiment in which lab sections of students in a large lecture course were randomly assigned to either a peer-grading or self-grading treatment so they could isolate the effects of each intervention. In design-based research, a hypothesis is conceptualized as the “design solution” rather than a specific intervention; that is, design-based researchers hypothesize that the designed instructional tools, when implemented in the classroom, will create a learning ecology that improves student learning around the identified learning problem ( Edelson, 2002 ; Bell, 2004 ). For example, Zagallo et al. (2016) developed a laboratory curriculum (i.e., the hypothesized “design solution”) for molecular and cellular biology majors to address the learning problem that students often struggle to connect scientific models and empirical data. This curriculum entailed: focusing instruction around a set of target biological models; developing small-group activities in which students interacted with the models by analyzing data from scientific papers; using formative assessment tools for student feedback; and providing students with a set of learning objectives they could use as study tools. They tested their curriculum in a novel, large-enrollment course of upper-division students over several years, making iterative changes to the curriculum as the study progressed.

By framing the research approach as an iterative endeavor of progressive refinement rather than a test of a particular intervention when all other variables are controlled, design-based researchers recognize that: 1) classrooms, and classroom experiences, are unique at any given time, making it difficult to truly “control” the environment in which an intervention occurs or establish a “control group” that differs only in the features of an intervention; and 2) many aspects of a classroom experience may influence the effectiveness of an intervention, often in unanticipated ways, which should be included in the research team’s analysis of an intervention’s success. Consequently, the research team is less concerned with controlling the research conditions—as in an experimental approach—and instead focuses on characterizing the learning environment ( Barab and Squire, 2004 ). This involves collecting data from multiple sources as the research progresses, including how the instructional tools were implemented, aspects of the implementation process that failed to go as planned, and how the instructional tools or implementation process was modified. These characterizations can provide important insights into what specific features of the instructional tools, or the learning environment, were most impactful to learning ( DBR Collective, 2003 ).

A third difference between experimental approaches and design-based research is when the instructional interventions can be modified. In experimental research, the intervention is fixed throughout the experimental period, with any revisions occurring only after the experiment has concluded. This is critical for ensuring that the results of the study provide evidence of the efficacy of a specific intervention. By contrast, design-based research takes a more flexible approach that allows instructional tools to be modified in situ as they are being implemented ( Hoadley, 2004 ; Barab, 2014 ). This flexibility allows the research team to modify instructional tools or strategies that prove inadequate for collecting the evidence necessary to evaluate the underlying theory and ensures a tight connection between interventions and a specific learning problem ( Collins et al. , 2004 ; Hoadley, 2004 ).

Finally, and importantly, experimental approaches and design-based research differ in the kinds of conclusions they draw from their data. Experimental research can “identify that something meaningful happened; but [it is] not able to articulate what about the intervention caused that story to unfold” ( Barab, 2014 , p. 162). In other words, experimental methods are robust for identifying where differences in learning occur, such as between groups of students experiencing peer- or self-grading of practice exams ( Jackson et al. , 2018 ) or receiving different curricula (e.g., Chi et al. , 2012 ). However, these methods are not able to characterize the underlying learning process or mechanism involved in the different learning outcomes. By contrast, design-based research has the potential to uncover mechanisms of learning, because it investigates how the nature of student thinking changes as students experience instructional interventions ( Shavelson et al. , 2003 ; Barab, 2014 ). According to Sandoval (2014) , “Design research, as a means of uncovering causal processes, is oriented not to finding effects but to finding functions , to understanding how desired (and undesired) effects arise through interactions in a designed environment” (p. 30). In Zagallo et al. (2016) , the authors found that their curriculum supported students’ data-interpretation skills, because it stimulated students’ spontaneous use of argumentation during which group members coconstructed evidence-based claims from the data provided. Students also worked collaboratively to decode figures and identify data patterns. These strategies were identified from the researchers’ qualitative data analysis of in-class recordings of small-group discussions, which allowed them to observe what students were doing to support their learning. Because design-based research is focused on characterizing how learning occurs in classrooms, it can begin to answer the kinds of mechanistic questions others have identified as central to advancing BER ( National Research Council [NRC], 2012 ; Dolan, 2015 ; Lo et al. , 2019 ).

DESIGN-BASED RESEARCH IN ACTION: AN EXAMPLE FROM UNDERGRADUATE PHYSIOLOGY

To illustrate how design-based research could be employed in BER, we draw on our own research that investigates how students learn physiology. We will characterize one iteration of our design-based research cycle ( Figure 1 ), emphasizing how our project uses Sandoval’s four epistemic commitments (i.e., theory driven, practically applied, generating design principles, implemented in an iterative manner) to guide our implementation.

Identifying the Learning Problem

Understanding physiological phenomena is challenging for students, given the wide variety of contexts (e.g., cardiovascular, neuromuscular, respiratory; animal vs. plant) and scales involved (e.g., using molecular-level interactions to explain organism functioning; Wang, 2004 ; Michael, 2007 ; Badenhorst et al. , 2016 ). To address these learning challenges, Modell (2000) identified seven “general models” that undergird most physiology phenomena (i.e., control systems, conservation of mass, mass and heat flow, elastic properties of tissues, transport across membranes, cell-to-cell communication, molecular interactions). Instructors can use these models as a “conceptual framework” to help students build intellectual coherence across phenomena and develop a deeper understanding of physiology ( Modell, 2000 ; Michael et al. , 2009 ). This approach aligns with theoretical work in the learning sciences that indicates that providing students with conceptual frameworks improves their ability to integrate and retrieve knowledge ( National Academies of Sciences, Engineering, and Medicine, 2018 ).

Before the start of our design-based project, we had been using Modell’s (2000) general models to guide our instruction. In this essay, we will focus on how we used the general models of mass and heat flow and transport across membranes in our instruction. These two models together describe how materials flow down gradients (e.g., pressure gradients, electrochemical gradients) against sources of resistance (e.g., tube diameter, channel frequency). We call this flux reasoning. We emphasized the fundamental nature and broad utility of flux reasoning in lecture and lab and frequently highlighted when it could be applied to explain a phenomenon. We also developed a conceptual scaffold (the Flux Reasoning Tool) that students could use to reason about physiological processes involving flux.

Although these instructional approaches had improved students’ understanding of flux phenomena, we found that students often demonstrated little commitment to using flux broadly across physiological contexts. Instead, they considered flux to be just another fact to memorize and applied it to narrow circumstances (e.g., they would use flux to reason about ions flowing across membranes—the context where flux was first introduced—but not the bulk flow of blood in a vessel). Students also struggled to integrate the various components of flux (e.g., balancing chemical and electrical gradients, accounting for variable resistance). We saw these issues reflected in students’ lower-than-hoped-for scores on the cumulative final exam of the course. From these experiences, and from conversations with other physiology instructors, we identified a learning problem to address through design-based research: How do students learn to use flux reasoning to explain material flows in multiple physiology contexts?

The process of identifying a learning problem usually emerges from a researcher’s own experiences (in or outside a classroom) or from previous research that has been described in the literature ( Cobb et al. , 2003 ). To remain true to Sandoval’s first epistemic commitment, a learning problem must advance a theory of learning ( Edelson, 2002 ; McKenney and Reeves, 2013 ). In our work, we investigated how conceptual frameworks based on fundamental scientific concepts (i.e., Modell’s general models) could help students reason productively about physiology phenomena (National Academies of Sciences, Engineering, and Medicine, 2018; Modell, 2000 ). Our specific theoretical question was: Can we characterize how students’ conceptual frameworks around flux change as they work toward robust ideas? Sandoval’s second epistemic commitment stated that a learning problem must aim to improve student learning outcomes. The practical significance of our learning problem was: Does using the concept of flux as a foundational idea for instructional tools increase students’ learning of physiological phenomena?

We investigated our learning problem in an introductory biology course at a large R1 institution. The introductory course is the third in a biology sequence that focuses on plant and animal physiology. The course typically serves between 250 and 600 students in their sophomore or junior years each term. Classes have the following average demographics: 68% male, 21% from lower-income situations, 12% from an underrepresented minority, and 26% first-generation college students.

Design-Based Research Cycle 1, Phase 1: Designing Instructional Tools

The first phase of design-based research involves developing instructional tools that address both the theoretical and practical concerns of the learning problem ( Edelson, 2002 ; Wang and Hannafin, 2005 ). These instructional tools can take many forms, such as specific instructional strategies, classroom worksheets and practices, or technological software, as long as they embody the underlying learning theory being investigated. They must also produce classroom experiences or materials that can be evaluated to determine whether learning outcomes were met ( Sandoval, 2014 ). Indeed, this alignment between theory, the nature of the instructional tools, and the ways students are assessed is central to ensuring rigorous design-based research ( Hoadley, 2004 ; Sandoval, 2014 ). Taken together, the instructional tools instantiate a hypothesized learning environment that will advance both the theoretical and practical questions driving the research ( Barab, 2014 ).

In our work, the theoretical claim that instruction based on fundamental scientific concepts would support students’ flux reasoning was embodied in our instructional approach by being the central focus of all instructional materials, which included: a revised version of the Flux Reasoning Tool ( Figure 2 ); case study–based units in lecture that explicitly emphasized flux phenomena in real-world contexts ( Windschitl et al. , 2012 ; Scott et al. , 2018 ; Figure 3 ); classroom activities in which students practiced using flux to address physiological scenarios; links to online videos describing key flux-related concepts; constructed-response assessment items that cued students to use flux reasoning in their thinking; and pretest/posttest formative assessment questions that tracked student learning ( Figure 4 ).

FIGURE 2. The Flux Reasoning Tool given to students at the beginning of the quarter.

FIGURE 3. An example flux case study that is presented to students at the beginning of the neurophysiology unit. Throughout the unit, students learn how the flow of ions into and out of cells, as mediated by chemical and electrical gradients and various ion/molecular channels, sends signals throughout the body. They use this information to better understand why Jaime experiences persistent neuropathy. Images from: uz.wikipedia.org/wiki/Fayl:Blausen_0822_SpinalCord.png and commons.wikimedia.org/wiki/File:Figure_38_01_07.jpg.

FIGURE 4. An example flux assessment question about ion flows given in a pre-unit/post-unit formative assessment in the neurophysiology unit.

Phase 2: Testing the Instructional Tools

In the second phase of design-based research, the instructional tools are tested by implementing them in classrooms. During this phase, the instructional tools are placed “in harm’s way … in order to expose the details of the process to scrutiny” ( Cobb et al. , 2003 , p. 10). In this way, researchers and instructors test how the tools perform in real-world settings, which may differ considerably from the design team’s initial expectations ( Hoadley, 2004 ). During this phase, if necessary, the design team may make adjustments to the tools as they are being used to account for these unanticipated conditions ( Collins et al. , 2004 ).

We implemented the instructional tools during the Autumn and Spring quarters of the 2016–2017 academic year. Students were taught to use the Flux Reasoning Tool at the beginning of the term in the context of the first case study unit focused on neurophysiology. Each physiology unit throughout the term was associated with a new concept-based case study (usually about flux) that framed the context of the teaching. Embedded within the daily lectures were classroom activities in which students could practice using flux. Students were also assigned readings from the textbook and videos related to flux to watch during each unit. Throughout the term, students took five exams that each contained some flux questions as well as some pre- and post-unit formative assessment questions. During Winter quarter, we conducted clinical interviews with students who would take our course in the Spring term (i.e., “pre” data) as well as students who had just completed our course in Autumn (i.e., “post” data).

Phase 3: Evaluating the Instructional Tools

The third phase of a design-based research cycle involves evaluating the effectiveness of instructional tools using evidence of student learning ( Barab and Squire, 2004 ; Anderson and Shattuck, 2012 ). This can be done using products produced by students (e.g., homework, lab reports), attitudinal gains measured with surveys, participation rates in activities, interview testimonials, classroom discourse practices, and formative assessment or exam data (e.g., Reiser et al. , 2001 ; Cobb et al. , 2003 ; Barab and Squire, 2004 ; Mohan et al. , 2009 ). Regardless of the source, evidence must be in a form that supports a systematic analysis that could be scrutinized by other researchers ( Cobb et al. , 2003 ; Barab, 2014 ). Also, because design-based research often involves multiple data streams, researchers may need to use both quantitative and qualitative analytical methods to produce a rich picture of how the instructional tools affected student learning ( Collins et al. , 2004 ; Anderson and Shattuck, 2012 ).

In our work, we used the quality of students’ written responses on exams and formative assessment questions to determine whether students improved their understanding of physiological phenomena involving flux. For each assessment question, we analyzed a subset of students’ pretest answers to identify overarching patterns in students’ reasoning about flux, characterized these overarching patterns, then ordinated the patterns into different levels of sophistication. These became our scoring rubrics, which identified five different levels of student reasoning about flux. We used the rubrics to code the remainder of students’ responses, with a code designating the level of student reasoning associated with a particular reasoning pattern. We used this ordinal rubric format because it would later inform our theoretical understanding of how students build flux conceptual frameworks (see phase 4). This also allowed us to both characterize the ideas students held about flux phenomena and identify the frequency distribution of those ideas in a class.

By analyzing changes in the frequency distributions of students’ ideas across the rubric levels at different time points in the term (e.g., pre-unit vs. post-unit), we could track both the number of students who gained more sophisticated ideas about flux as the term progressed and the quality of those ideas. If the frequency of students reasoning at higher levels increased from pre-unit to post-unit assessments, we could conclude that our instructional tools as a whole were supporting students’ development of sophisticated flux ideas. For example, on one neuromuscular ion flux assessment question in the Spring of 2017, we found that relatively more students were reasoning at the highest levels of our rubric (i.e., levels 4 and 5) on the post-unit test compared with the pre-unit test. This meant that more students were beginning to integrate sophisticated ideas about flux (i.e., they were balancing concentration and electrical gradients) in their reasoning about ion movement.
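
To illustrate the kind of tabulation involved, the short sketch below shows one way pre-unit and post-unit rubric-level frequency distributions could be compared once responses have been hand-coded. It is a minimal, hypothetical example: the sample data and the level_distribution helper are ours for illustration only and are not part of the study’s actual analysis pipeline.

    from collections import Counter

    # Hypothetical rubric levels (1-5) hand-coded from students' written
    # responses to one flux assessment item, before and after the unit.
    pre_unit_levels = [1, 2, 2, 3, 1, 2, 4, 2, 3, 1]
    post_unit_levels = [3, 4, 4, 2, 5, 4, 3, 5, 4, 3]

    def level_distribution(levels, n_levels=5):
        """Return the fraction of responses coded at each rubric level."""
        counts = Counter(levels)
        total = len(levels)
        return {level: counts[level] / total for level in range(1, n_levels + 1)}

    pre = level_distribution(pre_unit_levels)
    post = level_distribution(post_unit_levels)

    # Share of responses at the highest rubric levels (4 or 5); an increase
    # from pre-unit to post-unit is read as evidence that more students are
    # integrating sophisticated flux ideas.
    pre_high = pre[4] + pre[5]
    post_high = post[4] + post[5]
    print(f"Level 4-5 reasoning: pre-unit {pre_high:.0%}, post-unit {post_high:.0%}")

The same tabulation can also be used to compare distributions across terms (e.g., the Autumn 2016 and Spring 2017 pre-unit tests discussed below).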

To help validate this finding, we drew on three additional data streams: 1) from in-class group recordings of students working with flux items, we noted that students increasingly incorporated ideas about gradients and resistance when constructing their explanations as the term progressed; 2) from plant assessment items in the latter part of the term, we began to see students using flux ideas unprompted; and 3) from interviews, we observed that students who had already taken the course used flux ideas in their reasoning.

Through these analyses, we also noticed an interesting pattern in the pre-unit test data for Spring 2017 when compared with the frequency distribution of students’ responses with a previous term (Autumn 2016). In Spring 2017, 42% of students reasoned at level 4 or 5 on the pre-unit test, indicating these students already had sophisticated ideas about ion flux before they took the pre-unit assessment. This was surprising, considering only 2% of students reasoned at these levels for this item on the Autumn 2016 pre-unit test.

Phase 4: Reflecting on the Instructional Tools and Their Implementation

The final phase of a design-based research cycle involves a retrospective analysis that addresses the epistemic commitments of this methodology: How was the theory underpinning the research advanced by the research endeavor (theoretical outcome)? Did the instructional tools support student learning about the learning problem (practical outcome)? What were the critical features of the design solution that supported student learning (design principles)? ( Cobb et al. , 2003 ; Barab and Squire, 2004 ).

Theoretical Outcome (Epistemic Commitment 1).

Reflecting on how a design-based research experiment advances theory is critical to our understanding of how students learn in educational settings ( Barab and Squire, 2004 ; Mohan et al. , 2009 ). In our work, we aimed to characterize how students’ conceptual frameworks around flux change as they work toward robust ideas. To do this, we drew on learning progression research as our theoretical framing ( NRC, 2007 ; Corcoran et al. , 2009 ; Duschl et al. , 2011 ; Scott et al. , 2019 ). Learning progression frameworks describe empirically derived patterns in student thinking that are ordered into levels representing cognitive shifts in the ways students conceive a topic as they work toward mastery ( Gunckel et al. , 2012 ). We used our ion flux scoring rubrics to create a preliminary five-level learning progression framework ( Table 1 ). The framework describes how students’ ideas about flux often start with teleological-driven accounts at the lowest level (i.e., level 1), shift to focusing on driving forces (e.g., concentration gradients, electrical gradients) in the middle levels, and arrive at complex ideas that integrate multiple interacting forces at the higher levels. We further validated these reasoning patterns with our student interviews. However, our flux conceptual framework was largely based on student responses to our ion flux assessment items. Therefore, to further validate our learning progression framework, we needed a greater diversity of flux assessment items that investigated student thinking more broadly (i.e., about bulk flow, water movement) across physiological systems.

Practical Outcome (Epistemic Commitment 2).

In design-based research, learning theories must “do real work” by improving student learning in real-world settings ( DBR Collective, 2003 ). Therefore, design-based researchers must reflect on whether or not the data they collected show evidence that the instructional tools improved student learning ( Cobb et al. , 2003 ; Sharma and McShane, 2008 ). We determined whether our flux-based instructional approach aided student learning by analyzing the kinds of answers students provided to our assessment questions. Specifically, we considered students who reasoned at level 4 or above as demonstrating productive flux reasoning. Because almost half of students were reasoning at level 4 or 5 on the post-unit assessment after experiencing the instructional tools in the neurophysiology unit (in Spring 2017), we concluded that our tools supported student learning in physiology. Additionally, we noticed that students used language in their explanations that directly tied to the Flux Reasoning Tool ( Figure 2 ), which instructed them to use arrows to indicate the magnitude and direction of gradient-driving forces. For example, in a posttest response to our ion flux item ( Figure 4 ), one student wrote:

Ion movement is a function of concentration and electrical gradients . Which arrow is stronger determines the movement of K+. We can make the electrical arrow bigger and pointing in by making the membrane potential more negative than Ek [i.e., potassium’s equilibrium potential]. We can make the concentration arrow bigger and pointing in by making a very strong concentration gradient pointing in.

Given that almost half of students reasoned at level 4 or above, and that students used language from the Flux Reasoning Tool, we concluded that using fundamental concepts was a productive instructional approach for improving student learning in physiology and that our instructional tools aided student learning. However, some students in the 2016–2017 academic year continued to apply flux ideas more narrowly than intended (i.e., for ion and simple diffusion cases, but not water flux or bulk flow). This suggested that students had developed nascent flux conceptual frameworks after experiencing the instructional tools but could use more support to realize the broad applicability of this principle. Also, although our cross-sectional interview approach demonstrated how students’ ideas, overall, could change after experiencing the instructional tools, it did not provide information about how a student developed flux reasoning.

Reflecting on practical outcomes also means interpreting any learning gains in the context of the learning ecology. This reflection allowed us to identify whether there were particular aspects of the instructional tools that were better at supporting learning than others ( DBR Collective, 2003 ). Indeed, this was critical to understanding why 42% of students scored at level 4 or above on the pre-unit ion assessment in the Spring of 2017, while only 2% of students scored at these levels in Autumn of 2016. When we reviewed notes of the Spring 2017 implementation scheme, we saw that the pretest was due at the end of the first day of class after students had been exposed to ion flux ideas in class and in a reading/video assignment about ion flow, which may be one reason for the students’ high performance on the pretest. Consequently, we could not tell whether students’ initial high performance was due to their learning from the activities in the first day of class or for other reasons we did not measure. It also indicated we needed to close pretests before the first day of class for a more accurate measure of students’ incoming ideas and the effectiveness of the instructional tools employed at the beginning of the unit.

Design Principles (Epistemic Commitment 3).

Although design-based research is enacted in local contexts (i.e., a particular classroom), its purpose is to inform learning ecologies that have broad applications to improve learning and teaching ( Edelson, 2002 ; Cobb et al. , 2003 ). Therefore, design-based research should produce design principles that describe characteristics of learning environments that researchers and instructors can use to develop instructional tools specific to their local contexts (e.g., Edelson, 2002 ; Subramaniam et al. , 2015 ). Consequently, the design principles must balance specificity with adaptability so they can be used broadly to inform instruction ( Collins et al. , 2004 ; Barab, 2014 ).

From our first cycle of design-based research, we developed the following design principles:

  1. Key scientific concepts should provide an overarching framework for course organization. This way, the individual components that make up a course, like instructional units, activities, practice problems, and assessments, all reinforce the centrality of the key concept.

  2. Instructional tools should explicitly articulate the principle of interest, with specific guidance on how that principle is applied in context. This stresses the applied nature of the principle and that it is more than a fact to be memorized.

  3. Instructional tools need to show specific instances of how the principle is applied in multiple contexts to combat students’ narrow application of the principle to a limited number of contexts.

Design-Based Research Cycle 2, Phase 1: Redesign and Refine the Experiment

The last “epistemic commitment” Sandoval (2014) articulated was that design-based research be an iterative process with an eye toward continually refining the instructional tools, based on evidence of student learning, to produce more robust learning environments. By viewing educational inquiry as formative research, design-based researchers recognize the difficulty in accounting for all variables that could impact student learning, or the implementation of the instructional tools, a priori ( Collins et al. , 2004 ). Robust instructional designs are the products of trial and error, which are strengthened by a systematic analysis of how they perform in real-world settings.

To continue to advance our work investigating student thinking using the principle of flux, we began a second cycle of design-based research that continued to address the learning problem of helping students reason with fundamental scientific concepts. In this cycle, we largely focused on broadening the number of physiological systems that had accompanying formative assessment questions (i.e., beyond ion flux), collecting student reasoning from a more diverse population of students (e.g., upper division, allied health, community college), and refining and validating the flux learning progression with both written and interview data from the same students through time. We developed a suite of constructed-response flux assessment questions that spanned neuromuscular, cardiovascular, respiratory, renal, and plant physiological contexts and asked students about several kinds of flux: ion movement, diffusion, water movement, and bulk flow (29 total questions; available at beyondmultiplechoice.org). This would provide us with rich qualitative data that we could use to refine the learning progression. We decided to administer written assessments and conduct interviews in a pretest/posttest manner at the beginning and end of each unit both as a way to increase our data about student reasoning and to provide students with additional practice using flux reasoning across contexts.

From this second round of designing instructional tools (i.e., broader range of assessment items), testing them in the classroom (i.e., administering the assessment items to diverse student populations), evaluating the tools (i.e., developing learning progression–aligned rubrics across phenomena from student data, tracking changes in the frequency distribution of students across levels through time), and reflecting on the tools’ success, we would develop a more thorough and robust characterization of how students use flux across systems that could better inform our creation of new instructional tools to support student learning.

HOW CAN DESIGN-BASED RESEARCH EXTEND AND ENRICH BER?

While design-based research has primarily been used in educational inquiry at the K–12 level (see Reiser et al. , 2001 ; Mohan et al. , 2009 ; Jin and Anderson, 2012 ), other science disciplines at undergraduate institutions have begun to employ this methodology to create robust instructional approaches (e.g., Szteinberg et al. , 2014 in chemistry; Hake, 2007 , and Sharma and McShane, 2008 , in physics; Kelly, 2014 , in engineering). Our own work, as well as that by Zagallo et al. (2016) , provides two examples of how design-based research could be implemented in BER. Below, we articulate some of the ways incorporating design-based research into BER could extend and enrich this field of educational inquiry.

Design-Based Research Connects Theory with Practice

One critique of BER is that it does not draw heavily enough on learning theories from other disciplines like cognitive psychology or the learning sciences to inform its research ( Coley and Tanner, 2012 ; Dolan, 2015 ; Peffer and Renken, 2016 ; Davidesco and Milne, 2019 ). For example, there has been considerable work in BER developing concept inventories as formative assessment tools that identify concepts students often struggle to learn (e.g., Marbach-Ad et al. , 2009 ; McFarland et al. , 2017 ; Summers et al. , 2018 ). However, much of this work is detached from a theoretical understanding of why students hold misconceptions in the first place, what the nature of their thinking is, and the learning mechanisms that would move students to a more productive understanding of domain ideas ( Alonzo, 2011 ). Using design-based research to understand the basis of students’ misconceptions would ground these practical learning problems in a theoretical understanding of the nature of student thinking (e.g., see Coley and Tanner, 2012 , 2015 ; Gouvea and Simon, 2018 ) and the kinds of instructional tools that would best support the learning process.

Design-Based Research Fosters Collaborations across Disciplines

Recently, there have been multiple calls across science, technology, engineering, and mathematics education fields to increase collaborations between BER and other disciplines so as to increase the robustness of science education research at the collegiate level ( Coley and Tanner, 2012 ; NRC, 2012 ; Talanquer, 2014 ; Dolan, 2015 ; Peffer and Renken, 2016 ; Mestre et al. , 2018 ; Davidesco and Milne, 2019 ). Engaging in design-based research provides both a mechanism and a motivation for fostering interdisciplinary collaborations, as it requires the design team to have theoretical knowledge of how students learn, domain knowledge of practical learning problems, and instructional knowledge for how to implement instructional tools in the classroom ( Edelson, 2002 ; Hoadley, 2004 ; Wang and Hannafin, 2005 ; Anderson and Shattuck, 2012 ). For example, in our current work, our research team consists of two discipline-based education learning scientists from an R1 institution, two physiology education researchers/instructors (one from an R1 institution, the other from a community college), several physiology disciplinary experts/instructors, and a K–12 science education expert.

Design-based research collaborations have several distinct benefits for BER: first, learning or cognitive scientists could provide theoretical and methodological expertise that may be unfamiliar to biology education researchers with traditional science backgrounds ( Lo et al. , 2019 ). This would both improve the rigor of the research project and provide biology education researchers with the opportunity to explore ideas and methods from other disciplines. Second, collaborations between researchers and instructors could help increase the implementation of evidence-based teaching practices by instructors/faculty who are not education researchers and would benefit from support while shifting their instructional approaches ( Eddy et al. , 2015 ). This may be especially true for community college and primarily undergraduate institution faculty who often do not have access to the same kinds of resources that researchers and instructors at research-intensive institutions do ( Schinske et al. , 2017 ). Third, making instructors an integral part of a design-based research project ensures they are well versed in the theory and learning objectives underlying the instructional tools they are implementing in the classroom. This can improve the fidelity of implementation of the instructional tools, because the instructors understand the tools’ theoretical and practical purposes, which has been cited as one reason there have been mixed results on the impact of active learning across biology classes ( Andrews et al. , 2011 ; Borrego et al. , 2013 ; Lee et al. , 2018 ; Offerdahl et al. , 2018 ). It also gives instructors agency to make informed adjustments to the instructional tools during implementation that improve their practical applications while remaining true to the goals of the research ( Hoadley, 2004 ).

Design-Based Research Invites Using Mixed Methods to Analyze Data

The diverse nature of the data that are often collected in design-based research can require both qualitative and quantitative methodologies to produce a rich picture of how the instructional tools and their implementation influenced student learning ( Anderson and Shattuck, 2012 ). Using mixed methods may be less familiar to biology education researchers who were primarily trained in quantitative methods as biologists ( Lo et al. , 2019 ). However, according to Warfa (2016 , p. 2), “Integration of research findings from quantitative and qualitative inquiries in the same study or across studies maximizes the affordances of each approach and can provide better understanding of biology teaching and learning than either approach alone.” Although the number of BER studies using mixed methods has increased over the past decade ( Lo et al. , 2019 ), engaging in design-based research could further this trend through its collaborative nature of bringing social scientists together with biology education researchers to share research methodologies from different fields. By leveraging qualitative and quantitative methods, design-based researchers unpack “mechanism and process” by characterizing the nature of student thinking rather than “simply reporting that differences did or did not occur” ( Barab, 2014 , p. 158), which is important for continuing to advance our understanding of student learning in BER ( Dolan, 2015 ; Lo et al. , 2019 ).

CHALLENGES TO IMPLEMENTING DESIGN-BASED RESEARCH IN BER

As with any methodological approach, there can be challenges to implementing design-based research. Here, we highlight three that may be relevant to BER.

Collaborations Can Be Difficult to Maintain

While collaborations between researchers and instructors offer many affordances (as discussed earlier), the reality of connecting researchers across departments and institutions can be challenging. For example, Peffer and Renken (2016) noted that different traditions of scholarship can present barriers to collaboration where there is not mutual respect for the methods and ideas that are part and parcel to each discipline. Additionally, Schinske et al. (2017) identified several constraints that community college faculty face for engaging in BER, such as limited time or support (e.g., infrastructural, administrative, and peer support), which could also impact their ability to form the kinds of collaborations inherent in design-based research. Moreover, the iterative nature of design-based research requires these collaborations to persist for an extended period of time. Attending to these challenges is an important part of forming the design team and identifying the different roles researchers and instructors will play in the research.

Design-Based Research Experiments Are Resource Intensive

The focus of design-based research on studying learning ecologies to uncover mechanisms of learning requires that researchers collect multiple data streams through time, which often necessitates significant temporal and financial resources ( Collins et al., 2004 ; O’Donnell, 2004 ). Consequently, researchers must weigh both practical as well as methodological considerations when formulating their experimental design. For example, investigating learning mechanisms requires that researchers collect data at a frequency that will capture changes in student thinking ( Siegler, 2006 ). However, researchers may be constrained in the number of data-collection events they can anticipate depending on: the instructor’s ability to facilitate in-class collection events or solicit student participation in extracurricular activities (e.g., interviews); the cost of technological devices to record student conversations; the time and logistical considerations needed to schedule and conduct student interviews; the financial resources available to compensate student participants; the financial and temporal costs associated with analyzing large amounts of data.

Identifying learning mechanisms also requires in-depth analyses of qualitative data as students experience various instructional tools (e.g., microgenetic methods; Flynn et al. , 2006 ; Siegler, 2006 ). The high intensity of these in-depth analyses often limits the number of students who can be evaluated in this way, which must be balanced with the kinds of generalizations researchers wish to make about the effectiveness of the instructional tools ( O’Donnell, 2004 ). Because of the large variety of data streams that could be collected in a design-based research experiment—and the resources required to collect and analyze them—it is critical that the research team identify a priori how specific data streams, and the methods of their analysis, will provide the evidence necessary to address the theoretical and practical objectives of the research (see the following section on experimental rigor; Sandoval, 2014 ). These are critical management decisions because of the need for a transparent, systematic analysis of the data that others can scrutinize to evaluate the validity of the claims being made ( Cobb et al. , 2003 ).

Concerns with Experimental Rigor

The nature of design-based research, with its use of narrative to characterize versus control experimental environments, has drawn concerns about the rigor of this methodological approach. Some have challenged its ability to produce evidence-based warrants to support its claims of learning that can be replicated and critiqued by others ( Shavelson et al. , 2003 ; Hoadley, 2004 ). This is a valid concern that design-based researchers, and indeed all education researchers, must address to ensure their research meets established standards for education research ( NRC, 2002 ).

One way design-based researchers address this concern is by “specifying theoretically salient features of a learning environment design and mapping out how they are predicted to work together to produce desired outcomes” ( Sandoval, 2014 , p. 19). Through this process, researchers explicitly show before they begin the work how their theory of learning is embodied in the instructional tools to be tested, the specific data the tools will produce for analysis, and what outcomes will be taken as evidence for success. Moreover, by allowing instructional tools to be modified during the testing phase as needed, design-based researchers acknowledge that it is impossible to anticipate all aspects of the classroom environment that might impact the implementation of instructional tools, “as dozens (if not millions) of factors interact to produce the measureable outcomes related to learning” ( Hoadley, 2004 , p. 204; DBR Collective, 2003 ). Consequently, modifying instructional tools midstream to account for these unanticipated factors can ensure they retain their methodological alignment with the underlying theory and predicted learning outcomes so that inferences drawn from the design experiment accurately reflect what was being tested ( Edelson, 2002 ; Hoadley, 2004 ). Indeed, Barab (2014) states, “the messiness of real-world practice must be recognized, understood, and integrated as part of the theoretical claims if the claims are to have real-world explanatory value” (p. 153).

CONCLUSIONS

Design-based research can extend and enrich BER by:

  • providing a methodology that integrates theories of learning with practical experiences in classrooms,

  • using a range of analytical approaches that allow researchers to uncover the underlying mechanisms of student thinking and learning,

  • fostering interdisciplinary collaborations among researchers and instructors, and

  • characterizing learning ecologies that account for the complexity involved in student learning.

By employing this methodology from the learning sciences, biology education researchers can enrich our current understanding of what is required to help biology students achieve their personal and professional aims during their college experience. It can also stimulate new ideas for biology education that can be discussed and debated in our research community as we continue to explore and refine how best to serve the students who pass through our classroom doors.

1 “Epistemic commitment” is defined as engaging in certain practices that generate knowledge in an agreed-upon way.

ACKNOWLEDGMENTS

We thank the UW Biology Education Research Group (BERG) for feedback on drafts of this essay, as well as Dr. L. Jescovich for last-minute analyses. This work was supported by a National Science Foundation award (NSF DUE 1661263/1660643). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the NSF. All procedures were conducted in accordance with approval from the Institutional Review Board at the University of Washington (52146) and the New England Independent Review Board (120160152).

  • Alonzo, A. C. (2011). Learning progressions that support formative assessment practices. Measurement, 9(2/3), 124–129.
  • Anderson, T., & Shattuck, J. (2012). Design-based research: A decade of progress in education research? Educational Researcher, 41(1), 16–25.
  • Andrews, T. M., Leonard, M. J., Colgrove, C. A., & Kalinowski, S. T. (2011). Active learning not associated with student learning in a random sample of college biology courses. CBE—Life Sciences Education, 10(4), 394–405.
  • Badenhorst, E., Hartman, N., & Mamede, S. (2016). How biomedical misconceptions may arise and affect medical students’ learning: A review of theoretical perspectives and empirical evidence. Health Professions Education, 2(1), 10–17.
  • Barab, S. (2014). Design-based research: A methodological toolkit for engineering change. In The Cambridge handbook of the learning sciences (2nd ed., pp. 151–170). Cambridge University Press. https://doi.org/10.1017/CBO9781139519526.011
  • Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. Journal of the Learning Sciences, 13(1), 1–14.
  • Bell, P. (2004). On the theoretical breadth of design-based research in education. Educational Psychologist, 39(4), 243–253.
  • Borrego, M., Cutler, S., Prince, M., Henderson, C., & Froyd, J. E. (2013). Fidelity of implementation of research-based instructional strategies (RBIS) in engineering science courses. Journal of Engineering Education, 102(3), 394–425.
  • Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences, 2(2), 141–178.
  • Chi, M. T. H., Roscoe, R. D., Slotta, J. D., Roy, M., & Chase, C. C. (2012). Misconceived causal explanations for emergent processes. Cognitive Science, 36(1), 1–61.
  • Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9–13.
  • Coley, J. D., & Tanner, K. D. (2012). Common origins of diverse misconceptions: Cognitive principles and the development of biology thinking. CBE—Life Sciences Education, 11(3), 209–215.
  • Coley, J. D., & Tanner, K. (2015). Relations between intuitive biological thinking and biological misconceptions in biology majors and nonmajors. CBE—Life Sciences Education, 14(1). https://doi.org/10.1187/cbe.14-06-0094
  • Collins, A., Joseph, D., & Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. Journal of the Learning Sciences, 13(1), 15–42.
  • Corcoran, T., Mosher, F. A., & Rogat, A. D. (2009). Learning progressions in science: An evidence-based approach to reform (CPRE Research Report No. RR-63). Philadelphia, PA: Consortium for Policy Research in Education.
  • Davidesco, I., & Milne, C. (2019). Implementing cognitive science and discipline-based education research in the undergraduate science classroom. CBE—Life Sciences Education, 18(3), es4.
  • Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5–8.
  • Dolan, E. L. (2015). Biology education research 2.0. CBE—Life Sciences Education, 14(4), ed1.
  • Duschl, R., Maeng, S., & Sezen, A. (2011). Learning progressions and teaching sequences: A review and analysis. Studies in Science Education, 47(2), 123–182.
  • Eddy, S. L., Converse, M., & Wenderoth, M. P. (2015). PORTAAL: A classroom observation tool assessing evidence-based teaching practices for active learning in large science, technology, engineering, and mathematics classes. CBE—Life Sciences Education, 14(2), ar23.
  • Edelson, D. C. (2002). Design research: What we learn when we engage in design. Journal of the Learning Sciences, 11(1), 105–121.
  • Flynn, E., Pine, K., & Lewis, C. (2006). The microgenetic method—Time for change? The Psychologist, 19(3), 152–155.
  • Gouvea, J. S., & Simon, M. R. (2018). Challenging cognitive construals: A dynamic alternative to stable misconceptions. CBE—Life Sciences Education, 17(2), ar34.
  • Gunckel, K. L., Mohan, L., Covitt, B. A., & Anderson, C. W. (2012). Addressing challenges in developing learning progressions for environmental science literacy. In Alonzo, A. C., & Gotwals, A. W. (Eds.), Learning progressions in science: Current challenges and future directions (pp. 39–75). Rotterdam: SensePublishers. https://doi.org/10.1007/978-94-6091-824-7_4
  • Hake, R. R. (2007). Design-based research in physics education research: A review. In Kelly, A. E., Lesh, R. A., & Baek, J. Y. (Eds.), Handbook of design research methods in mathematics, science, and technology education (p. 24). New York: Routledge.
  • Hoadley, C. M. (2004). Methodological alignment in design-based research. Educational Psychologist, 39(4), 203–212.
  • Jackson, M., Tran, A., Wenderoth, M. P., & Doherty, J. H. (2018). Peer vs. self-grading of practice exams: Which is better? CBE—Life Sciences Education, 17(3), es44. https://doi.org/10.1187/cbe.18-04-0052
  • Jin, H., & Anderson, C. W. (2012). A learning progression for energy in socio-ecological systems. Journal of Research in Science Teaching, 49(9), 1149–1180.
  • Joseph, D. (2004). The practice of design-based research: Uncovering the interplay between design, research, and the real-world context. Educational Psychologist, 39(4), 235–242.
  • Kelly, A. E. (2014). Design-based research in engineering education. In Cambridge handbook of engineering education research (pp. 497–518). New York, NY: Cambridge University Press. https://doi.org/10.1017/CBO9781139013451.032
  • Lee, C. J., Toven-Lindsey, B., Shapiro, C., Soh, M., Mazrouee, S., Levis-Fitzgerald, M., & Sanders, E. R. (2018). Error-discovery learning boosts student engagement and performance, while reducing student attrition in a bioinformatics course. CBE—Life Sciences Education, 17(3), ar40. https://doi.org/10.1187/cbe.17-04-0061
  • Lo, S. M., Gardner, G. E., Reid, J., Napoleon-Fanis, V., Carroll, P., Smith, E., & Sato, B. K. (2019). Prevailing questions and methodologies in biology education research: A longitudinal analysis of research in CBE—Life Sciences Education and at the Society for the Advancement of Biology Education Research. CBE—Life Sciences Education, 18(1), ar9.
  • Marbach-Ad, G., Briken, V., El-Sayed, N. M., Frauwirth, K., Fredericksen, B., Hutcheson, S., … & Smith, A. C. (2009). Assessing student understanding of host pathogen interactions using a concept inventory. Journal of Microbiology & Biology Education, 10(1), 43–50.
  • McFarland, J. L., Price, R. M., Wenderoth, M. P., Martinková, P., Cliff, W., Michael, J., … & Wright, A. (2017). Development and validation of the homeostasis concept inventory. CBE—Life Sciences Education, 16(2), ar35.
  • McKenney, S., & Reeves, T. C. (2013). Systematic review of design-based research progress: Is a little knowledge a dangerous thing? Educational Researcher, 42(2), 97–100.
  • Mestre, J. P., Cheville, A., & Herman, G. L. (2018). Promoting DBER-cognitive psychology collaborations in STEM education. Journal of Engineering Education, 107(1), 5–10.
  • Michael, J. A. (2007). What makes physiology hard for students to learn? Results of a faculty survey. Advances in Physiology Education, 31(1), 34–40.
  • Michael, J. A., Modell, H., McFarland, J., & Cliff, W. (2009). The “core principles” of physiology: What should students understand? Advances in Physiology Education, 33(1), 10–16.
  • Middleton, J., Gorard, S., Taylor, C., & Bannan-Ritland, B. (2008). The “compleat” design experiment: From soup to nuts. In Kelly, A. E., Lesh, R. A., & Baek, J. Y. (Eds.), Handbook of design research methods in education: Innovations in science, technology, engineering, and mathematics learning and teaching (pp. 21–46). New York, NY: Routledge.
  • Modell, H. I. (2000). How to help students understand physiology? Emphasize general models. Advances in Physiology Education, 23(1), S101–S107.
  • Mohan, L., Chen, J., & Anderson, C. W. (2009). Developing a multi-year learning progression for carbon cycling in socio-ecological systems. Journal of Research in Science Teaching, 46(6), 675–698.
  • National Academies of Sciences, Engineering, and Medicine. (2018). How people learn II: Learners, contexts, and cultures. Washington, DC: National Academies Press. https://doi.org/10.17226/24783
  • National Research Council (NRC). (2002). Scientific research in education. Washington, DC: National Academies Press. https://doi.org/10.17226/10236
  • NRC. (2007). Taking science to school: Learning and teaching science in grades K–8. Washington, DC: National Academies Press. https://doi.org/10.17226/11625
  • NRC. (2012). Discipline-based education research: Understanding and improving learning in undergraduate science and engineering. Washington, DC: National Academies Press. https://doi.org/10.17226/13362
  • NRC. (2018). How people learn II: Learners, contexts, and cultures. Washington, DC: National Academies Press. https://doi.org/10.17226/24783
  • O’Donnell, A. M. (2004). A commentary on design research. Educational Psychologist, 39(4), 255–260.
  • Offerdahl, E. G., McConnell, M., & Boyer, J. (2018). Can I have your recipe? Using a fidelity of implementation (FOI) framework to identify the key ingredients of formative assessment for learning. CBE—Life Sciences Education, 17(4), es16.
  • Peffer, M., & Renken, M. (2016). Practical strategies for collaboration across discipline-based education research and the learning sciences. CBE—Life Sciences Education, 15(4), es11.
  • Reiser, B. J., Smith, B. K., Tabak, I., Steinmuller, F., Sandoval, W. A., & Leone, A. J. (2001). BGuILE: Strategic and conceptual scaffolds for scientific inquiry in biology classrooms. In Carver, S. M., & Klahr, D. (Eds.), Cognition and instruction: Twenty-five years of progress (pp. 263–305). Mahwah, NJ: Lawrence Erlbaum Associates.
  • Sandoval, W. (2014). Conjecture mapping: An approach to systematic educational design research. Journal of the Learning Sciences, 23(1), 18–36.
  • Sandoval, W. A., & Bell, P. (2004). Design-based research methods for studying learning in context: Introduction. Educational Psychologist, 39(4), 199–201.
  • Schinske, J. N., Balke, V. L., Bangera, M. G., Bonney, K. M., Brownell, S. E., Carter, R. S., … & Corwin, L. A. (2017). Broadening participation in biology education research: Engaging community college students and faculty. CBE—Life Sciences Education, 16(2), mr1.
  • Scott, E., Anderson, C. W., Mashood, K. K., Matz, R. L., Underwood, S. M., & Sawtelle, V. (2018). Developing an analytical framework to characterize student reasoning about complex processes. CBE—Life Sciences Education, 17(3), ar49. https://doi.org/10.1187/cbe.17-10-0225
  • Scott, E., Wenderoth, M. P., & Doherty, J. H. (2019). Learning progressions: An empirically grounded, learner-centered framework to guide biology instruction. CBE—Life Sciences Education, 18(4), es5. https://doi.org/10.1187/cbe.19-03-0059
  • Sharma, M. D., & McShane, K. (2008). A methodological framework for understanding and describing discipline-based scholarship of teaching in higher education through design-based research. Higher Education Research & Development, 27(3), 257–270.
  • Shavelson, R. J., Phillips, D. C., Towne, L., & Feuer, M. J. (2003). On the science of education design studies. Educational Researcher, 32(1), 25–28.
  • Siegler, R. S. (2006). Microgenetic analyses of learning. In Damon, W., & Lerner, R. M. (Eds.), Handbook of child psychology (pp. 464–510). Hoboken, NJ: John Wiley & Sons. https://doi.org/10.1002/9780470147658.chpsy0211
  • Stringer, E. T. (2013). Action research. Thousand Oaks, CA: Sage Publications.
  • Subramaniam, M., Jean, B. S., Taylor, N. G., Kodama, C., Follman, R., & Casciotti, D. (2015). Bit by bit: Using design-based research to improve the health literacy of adolescents. JMIR Research Protocols, 4(2), e62.
  • Summers, M. M., Couch, B. A., Knight, J. K., Brownell, S. E., Crowe, A. J., Semsar, K., … & Batzli, J. (2018). EcoEvo-MAPS: An ecology and evolution assessment for introductory through advanced undergraduates. CBE—Life Sciences Education, 17(2), ar18.
  • Szteinberg, G., Balicki, S., Banks, G., Clinchot, M., Cullipher, S., Huie, R., … & Sevian, H. (2014). Collaborative professional development in chemistry education research: Bridging the gap between research and practice. Journal of Chemical Education, 91(9), 1401–1408.
  • Talanquer, V. (2014). DBER and STEM education reform: Are we up to the challenge? Journal of Research in Science Teaching, 51(6), 809–819.
  • Wang, F., & Hannafin, M. J. (2005). Design-based research and technology-enhanced learning environments. Educational Technology Research and Development, 53(4), 5–23.
  • Wang, J.-R. (2004). Development and validation of a two-tier instrument to examine understanding of internal transport in plants and the human circulatory system. International Journal of Science and Mathematics Education, 2(2), 131–157.
  • Warfa, A.-R. M. (2016). Mixed-methods design in biology education research: Approach and uses. CBE—Life Sciences Education, 15(4), rm5.
  • Windschitl, M., Thompson, J., Braaten, M., & Stroupe, D. (2012). Proposing a core set of instructional practices and tools for teachers of science. Science Education, 96(5), 878–903.
  • Zagallo, P., Meddleton, S., & Bolger, M. S. (2016). Teaching real data interpretation with models (TRIM): Analysis of student dialogue in a large-enrollment cell and developmental biology course. CBE—Life Sciences Education, 15(2), ar17.

Submitted: 18 November 2019 Revised: 3 March 2020 Accepted: 25 March 2020

© 2020 E. E. Scott et al. CBE—Life Sciences Education © 2020 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).


Design-Based Research


In an educational setting, design-based research is a research approach that engages in iterative designs to develop knowledge that improves educational practices. This chapter will provide a brief overview of the origin, paradigms, outcomes, and processes of design-based research (DBR). In these sections we explain that (a) DBR originated because some researchers believed that traditional research methods failed to improve classroom practices, (b) DBR places researchers as agents of change and research subjects as collaborators, (c) DBR produces both new designs and theories, and (d) DBR consists of an iterative process of design and evaluation to develop knowledge.

Origin of DBR

DBR originated as researchers like Allan Collins (1990) and Ann Brown (1992) recognized that educational research often failed to improve classroom practices. They perceived that much of educational research was conducted in controlled, laboratory-like settings. They believed that this laboratory research was not as helpful as possible for practitioners.

Proponents of DBR claim that educational research is often detached from practice (The Design-Based Research Collective, 2003). There are at least two problems that arise from this detachment: (a) practitioners do not benefit from researchers’ work and (b) research results may be inaccurate because they fail to account for context (The Design-Based Research Collective, 2003).

Practitioners do not benefit from researchers’ work if the research is detached from practice. Practitioners are able to benefit from research when they see how the research can inform and improve their designs and practices. Some practitioners believe that educational research is often too abstract or sterilized to be useful in real contexts (The Design-Based Research Collective, 2003).

Not only is lack of relevance a problem, but research results can also be inaccurate by failing to account for context. Findings and theories based on lab results may not accurately reflect what happens in real-world educational settings.

Conversely, a problem that arises from an overemphasis on practice is that while individual practices may improve, the general body of theory and knowledge does not increase. Scholars like Collins (1990) and Brown (1992) believed that the best way to conduct research would be to achieve the right balance between theory-building and practical impact.

Paradigms of DBR

Proponents of DBR believe that conducting research in context, rather than in a controlled laboratory setting, and iteratively designing interventions yields authentic and useful knowledge. Barab and Squire (2004) say that the goal of DBR is to “directly impact practice while advancing theory that will be of use to others” (p. 8). This implies “a pragmatic philosophical underpinning, one in which the value of a theory lies in its ability to produce changes in the world” (p. 6). The aims of DBR and the role of researchers and subjects are informed by this philosophical underpinning.

Aims of DBR

Traditional, experimental research is conducted by theorists focused on isolating variables to test and refine theory. DBR is conducted by designers focused on (a) understanding contexts, (b) designing effective systems, and (c) making meaningful changes for the subjects of their studies (Barab & Squire, 2004; Collins, 1990). Traditional methods of research generate refined understandings of how the world works, which may indirectly affect practice. In DBR there is an intentionality in the research process to both refine theory and practice (Collins et al., 2004).

Role of DBR Researcher

In DBR, researchers assume the roles of “curriculum designers, and implicitly, curriculum theorists” (Barab & Squire, 2004, p. 2). As curriculum designers, DBR researchers come into their contexts as informed experts with the purpose of creating, “test[ing] and refin[ing] educational designs based on principles derived from prior research” (Collins et al., 2004, p. 15). These educational designs may include curricula, practices, software, or tangible objects beneficial to the learning process (Barab & Squire, 2004). As curriculum theorists, DBR researchers also come into their research contexts with the purpose of refining extant theories about learning (Brown, 1992).

This duality of roles for DBR researchers contributes to a greater sense of responsibility and accountability within the field. Traditional, experimental researchers isolate themselves from the subjects of their study (Barab & Squire, 2004). This separation is seen as a virtue, allowing researchers to make dispassionate observations as they test and refine their understandings of the world around them. In comparison, design-based researchers “bring agendas to their work,” see themselves as necessary agents of change and see themselves as accountable for the work they do (Barab & Squire, 2004, p. 2).

Role of DBR Subjects

Within DBR, research subjects are seen as key contributors and collaborators in the research process. Classic experimentalism views the subjects of research as things to be observed or experimented on, suggesting a unidirectional relationship between researcher and research subject. The role of the research subject is to be available and genuine so that the researcher can make meaningful observations and collect accurate data. In contrast, design-based researchers view the subjects of their research (e.g., students, teachers, schools) as “co-participants” (Barab & Squire, 2004, p. 3) and “co-investigators” (Collins, 1990, p. 4). Research subjects are seen as necessary in “helping to formulate the questions,” “making refinements in the designs,” “evaluating the effects of...the experiment,” and “reporting the results of the experiment to other teachers and researchers” (Collins, 1990, pp. 4-5). Research subjects are co-workers with the researcher in iteratively pushing the study forward.

Outcomes of DBR

DBR educational research develops knowledge through this collaborative, iterative research process. The knowledge developed by DBR can be separated into two categories: (a) tangible, practical outcomes and (b) intangible, theoretical outcomes.

Tangible Outcomes

A major goal of design-based research is producing meaningful interventions and practices. Within educational research these interventions may “involve the development of technological tools [and] curricula” (Barab & Squire, 2004, p. 1). But more than just producing meaningful educational products for a specific context, DBR aims to produce meaningful, effective educational products that can be transferred and adapted (Barab & Squire, 2004). As expressed by Brown (1992), “an effective intervention should be able to migrate from our experimental classroom to average classrooms operated by and for average students and teachers” (p. 143).

Intangible Outcomes

It is important to recognize that DBR is not only concerned with improving practice but also aims to advance theory and understanding (Collins et al., 2004). DBR’s emphasis on the importance of context enhances the knowledge claims of the research. “Researchers investigate cognition in context...with the broad goal of developing evidence-based claims derived from both laboratory-based and naturalistic investigations that result in knowledge about how people learn” (Barab & Squire, 2004, p. 1). This new knowledge about learning then drives future research and practice.

Process of DBR

A hallmark of DBR is the iterative nature of its interventions. As each iteration progresses, researchers refine and rework the intervention drawing on a variety of research methods that best fit the context. This flexibility allows the end result to take precedence over the process. While each researcher may use different methods, McKenney and Reeves (2012) outlined three core processes of DBR: (a) analysis and exploration, (b) design and construction, and (c) evaluation and reflection. To put these ideas in context, we will refer to a recent DBR study completed by Siko and Barbour regarding the use of PowerPoint games in the classroom.

DBR Cycle

Analysis and Exploration

Analysis is a critical aspect of DBR and must be used throughout the entire process. At the start of a DBR project, it is critical to understand and define which problem will be addressed. In collaboration with practitioners, researchers seek to understand all aspects of a problem. Additionally, they “seek out and learn from how others have viewed and solved similar problems” (McKenney & Reeves, 2012, p. 85). This analysis helps to provide an understanding of the context within which to execute an intervention.

Since theories cannot account for the variety of variables in a learning situation, exploration is needed to fill the gaps. DBR researchers can draw from a number of disciplines and methodologies as they execute an intervention. The decision of which methodologies to use should be driven by the research context and goals.

Siko and Barbour (2016) used the DBR process to address a gap they found in research regarding the effectiveness of having students create their own PowerPoint games to review for a test. In analyzing existing research, they found studies that stated teaching students to create their own PowerPoint games did not improve content retention. Siko and Barbour wanted to “determine if changes to the implementation protocol would lead to improved performance” (Siko & Barbour, 2016, p. 420). They chose to test their theory in three different phases and adapt the curriculum following each phase.

Design and Construction

Informed by the analysis and exploration, researchers design and construct interventions, which may be a specific technology or “less concrete aspects such as activity structures, institutions, scaffolds, and curricula” (Design-Based Research Collective, 2003, pp. 5–6). This process involves laying out a variety of options for a solution and then creating the idea with the most promise.

Within Siko and Barbour’s design, they planned to observe three phases of a control group and a test group. Each phase would use t-tests to compare two unit tests for each group. They worked with teachers to implement time for playing PowerPoint games as well as a discussion on what makes games successful. The first implementation was a control phase that replicated past research and established a baseline. Once they finished that phase, they began to evaluate.
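To make the phase comparison concrete, here is a minimal sketch of an independent-samples t-test on unit-test scores. The scores, group sizes, and variable names are invented for illustration and are not Siko and Barbour's data; it simply shows the kind of analysis described above.

```python
# Hypothetical comparison of unit-test scores from a control phase and a
# game-creation phase, using an independent-samples t-test (scipy).
from scipy import stats

control_scores = [62, 71, 58, 75, 66, 80, 69, 73]     # invented control-phase scores
treatment_scores = [70, 78, 65, 82, 74, 88, 77, 81]   # invented game-phase scores

t_stat, p_value = stats.ttest_ind(treatment_scores, control_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```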

Evaluation and Reflection

Researchers can evaluate their designs both before and after use. The cyclical process involves careful, constant evaluation for each iteration so that improvements can be made. While tests and quizzes are a standard way of evaluating educational progress, interviews and observations also play a key role, as they allow for better understanding of how teachers and students might see the learning situation.

Reflection allows the researcher to make connections between actions and results. Researchers must take the time to analyze what changes led to success or failure so that theory and practice at large can benefit. Collins (1990) states:

It is important to analyze the reasons for failure and to take steps to fix them. It is critical to document the nature of the failures and the attempted revisions, as well as the overall results of the experiment, because this information informs the path to success. (p. 5)

As researchers reflect on each change they made, they find what is most useful to the field at large, whether it be a failure or a success.

After evaluating results of the first phase, Siko and Barbour revisited the literature of instructional games. Based on that research, they first tried extending the length of time students spent creating the games. They also discovered that the students struggled to design effective test questions, so the researchers tried working with teachers to spend more time explaining how to ask good questions. As they explored these options, researchers were able to see unit test scores improve.

Reflection on how the study was conducted allowed the researchers to properly place their experiences within the context of existing research. They recognized that while they found positive impacts as a result of their intervention, there were a number of limitations with the study. This is an important realization for the research, and it helps readers avoid misinterpreting the scope of the findings.

This chapter has provided a brief overview of the origin, paradigms, outcomes, and processes of Design-Based Research (DBR). We explained that (a) DBR originated because some researchers believed that traditional research methods failed to improve classroom practices, (b) DBR places researchers as agents of change and research subjects as collaborators, (c) DBR produces both new designs and theories, and (d) DBR consists of an iterative process of design and evaluation to develop knowledge.

Barab, S., & Squire, K. (2004). Design-based research: putting a stake in the ground. Journal of the Learning Sciences, 13(1), 1–14.

Brown, A. L. (1992). Design experiments: theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences, 2(2), 141–178.

Collins, A. (1990). Toward a design science of education (Report No. 1). Washington, DC: Center for Technology in Education.

Collins, A., Joseph, D., & Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. Journal of the Learning Sciences, 13(1), 15–42.

McKenney, S., & Reeves, T. C. (2012). Conducting educational design research. New York, NY: Routledge.

Siko, J. P., & Barbour, M. K. (2016). Building a better mousetrap: how design-based research was used to improve homemade PowerPoint games. TechTrends, 60(5), 419–424.

The Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5–8.

This content is provided to you freely by BYU Open Learning Network.

Access it online or download it at https://open.byu.edu/education_research/design_based_research .

Design-based research

What is design-based research (DBR), and how should you develop your first DBR project?

Eleanor C Sayre

Design-based research (DBR) is a key method in the learning sciences, which is used to simultaneously develop both learning theory and the design of instructional interventions. This article is an exceptionally short introduction to design-based research; there’s tons of scholarship on this idea.

What is design-based research (DBR)?

Design-based research (DBR) begins with a design problem in education, such as teaching a difficult concept, improving a dismal undergrad success rate, or similar. DBR turns to learning theories (or other theories in education) to explain why that concept might be difficult to learn or why undergrads might not succeed, and to suggest what the shape of a solution might be. These ideas from theory are used to inform the design of the intervention: new curricula, new learning environments, new experiences.

The intervention is tested on real learners in an ecologically-valid context, like a classroom (though early cycles may occur in more controlled environments). Data are collected on how learners interact with the intervention, preferably process data as well as outcome data. Data are analyzed in light of the theory to understand the ways in which the design and implementation of the intervention are successful and to suggest what elements of the design or implementation could be altered to improve student learning (or whatever the key outcomes of the design problem are). Researchers engage in multiple cycles of design construction, implementation, evaluation, and revision in order to (a) solve the design problem and (b) generate new knowledge & learning theory about student learning and development.

The DBR process from start to end

The outcome of a DBR project is twofold: the solution to the design problem as well as the new knowledge about learning theories. Both aspects of the outcome are necessary for a successful project; you can’t just do stuff and not generate new knowledge about it, and you can’t just generate new knowledge without finding a solution to the design problem.
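One way to summarize the loop described above is as pseudocode. The sketch below is purely illustrative (the functions are placeholders, not a real research workflow), but it highlights that every cycle must feed both the design and the theory.

```python
# Illustrative sketch of the DBR cycle: each pass yields a revised design
# and a note about what was learned for theory. All functions are placeholders.

def implement(design):
    """Enact the design with real learners; return process and outcome data."""
    return {"observations": "..."}, {"assessments": "..."}

def analyze(process_data, outcome_data, theory):
    """Interpret both data streams in light of the guiding theory."""
    return {"design_change": "...", "theory_claim": "..."}

def revise(design, findings):
    """Modify the design based on what the analysis showed."""
    return design + [findings["design_change"]]

design = ["initial intervention"]
theory = "guiding learning theory"
theory_claims = []

for cycle in range(3):                      # two cycles minimum; 3-4 are common
    process, outcomes = implement(design)
    findings = analyze(process, outcomes, theory)
    design = revise(design, findings)
    theory_claims.append(findings["theory_claim"])

# A successful project ends with both: a workable design and new claims about learning.
```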

While there is an aspect of emergence in subsequent cycles for intervention design and learning theories, theory and intervention are necessary for all cycles. Because theory guides which kinds of data are meaningful, the general family of theories usually does not change radically from cycle to cycle. Similarly, while the design problem may grow or change in subsequent cycles, because the intervention is intentionally chosen and modified, the design problem generally does not change radically from cycle to cycle.

A “design problem” stands at the same level as “research question”: it’s a guiding idea that shapes your project. Design problems and research questions are living. They will grow and change with time.

I’m using “students” to mean “participants” because this is often used in classroom settings, but “participants” is also ok.

I’m using “intervention” to mean a wide variety of things: new curriculum, new course policies, new learning environments, new learning experiences, etc. The key parts are: there’s something new about it, and it is enacted in a natural environment (e.g. a classroom, not just a psych lab).

Designing a design-based research project

The first part of a DBR project: articulating a design problem, developing an intervention, and choosing helpful theory

Articulate a design problem

Physics graduates need computational skills, but Maria’s departmental curriculum doesn’t cover computation. She would like to add training in computational skills, but her department is unwilling to require a new course for everyone.

Articulating a good design problem entails figuring out what problem you want to solve, why it is a problem, and what the shape of a feasible solution might look like. As you work on this,

  • Be specific, and ground it in a specific context.
  • Who has this problem? How do you know?
  • What’s the scope of this problem? What can you actually change or improve?
  • What is the current situation, and why is that a problem?
  • Talk to stakeholders: do they think this is a problem too? why?
  • What are other solutions (perhaps partial) that exist?

As you work on articulating your design problem, you may find that you need to conduct a literature review to understand elements of the problem or look for solutions to related problems. As you talk to more people about it, you may find that your ideas shift about what the problem is and why it is a problem. This is great.

A design problem is a problem in your students’ environment that you could, in principle, solve. You can build new curricula or improve programs to solve your design problem and help your students, but you don’t get to design your students.

For example, you might notice that the student failure rate (“DFW rate”) in calculus is 30% (yikes!). The design problem is not “my students are too poorly prepared” (a problem with the students); the design problem might be “our placement test doesn’t sort students well” (a problem with the environment before calculus enrollment) or “calculus class doesn’t support student learning well” (a problem with the environment of the calculus class).

Use Theory to refine your problem and suggest a solution

Maria decides to use two theories to support her work: the Resources Framework helps her identify how students bring ideas together to solve problems, and Computational Thinking helps her identify three particular ideas (abstraction, automation, and analysis) for her intervention to improve.

Design-based research relies on theory to shape why these problems are problems and to suggest the shape of a solution. If you want to fix a problem, you need to understand why it is a problem so that you can tackle the cause. The job of theory is to suggest possible causes for the problem, and to help shape potential solutions.

For example, you might notice that some students struggle to solve mathematical problems in their physics classes. Different theories conceptualize this struggle differently:

A cognitive theory of learning might suggest that students have a hard time seeing how to apply math ideas in physics contexts. We need to support their transfer of ideas by explicitly eliciting math class procedures and drawing direct parallels.

A different cognitive theory of learning might suggest that their problem solving skills need support, especially for complicated problems that coordinate math and physics ideas. We need to support their growing problem solving ability by scaffolding the steps of problem solving.

A socio-cultural theory of learning might suggest that different students struggle at different points. We need to support them working together to connect and rediscover math ideas in this new context.

A critical theory of learning might suggest that students struggle to solve these problems because they seem irrelevant to their lives in the face of gross structural inequities. We should reimagine the topics our physics courses cover to better support a just and educated populace.

There are tons of different theories available to you. Thinking about why your problem is a problem and what kind of solution might solve it can suggest to you what family of theory you should use. Through your conversations with stakeholders and literature, you will discover potential theories. Try them on: how does each one suggest the shape of a solution? What feels satisfying to you? You may discover that you want to use different theories to articulate and refine different aspects of your design problem. That’s normal.

Design an intervention

Through conversations with stakeholders and literature, you will start to see the shape of a possible solution to your design problem. The shape of this solution should include some kind of intervention that changes the environment in which your participants learn. For example, you might decide to develop new curricula, as guided by the theory you chose.

Scope: How “big” is your intervention?

Maria decides to focus on one course, Electromagnetism, that she teaches every year. This course is required of all physics majors, so she will be able to affect everyone without adding another course to their workload. Because the course occurs once every year, Maria will have enough time to revise her materials when she isn’t teaching it.

  • How much time do you expect your participants to interact with it? Very brief interventions are easier, but generally have less potential for impact.
  • How many topics will you cover? Which aspects of the class?
  • How many participants will you have? Are they enrolled together, or are these multiple classes?
  • How many times can you iterate your intervention to make adjustments?

DBR projects rely on iterative cycles of implementation and development. As you think about your intervention, think about how many times you can iterate through it. Two cycles are a minimum; 3-4 are more common. It’s also common to iterate through each piece several times, and to use lessons learned in each piece to suggest developments in other pieces. For example, if you want to develop seven new labs, you might plan to develop two in the first year, then refine them and develop five more in the second year, and refine all seven for coherence in the third year. For each iteration, identify: what’s the minimum viable product for each cycle? What additional features would be nice to have, but not necessary in the core?

Map your project

Maria decides to develop homeworks and in-class activities to improve her students’ computational thinking. She replaces homework problems that rely on manual problem solving with problems that use computational methods, writing about 20% new homework problems spread across the whole semester.

Build a plan for your intervention that explicitly maps between the theory you choose, the elements of your design problem, and the elements of your intervention. For each piece of the map, ask yourself:

  • How does this piece of the intervention help solve my design problem? How does it use theory?
  • How does this theory help articulate my design problem? How does this theory suggest the shape of the intervention?
  • Why does this design problem require this piece of the intervention? Why do I need this theory?
  • How does each iteration build new features or topical coverage in the intervention?

Building this map is a lot like engaging in conjecture mapping, a more robust tool from the Learning Sciences that explicitly draws together your theory and your observations to make meaning and conclusions.
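As a concrete illustration of such a map, the sketch below records the links between design problem, theory, intervention elements, and planned evidence using the Maria example. The structure and wording are invented for illustration, not a prescribed format.

```python
# Illustrative project map for the Maria example: each intervention element is
# tied to the theory that motivates it and the evidence that will evaluate it.
project_map = {
    "design_problem": "physics majors graduate without computational skills",
    "theories": ["Resources Framework", "Computational Thinking"],
    "intervention": [
        {
            "element": "computational homework problems (~20% of the semester's set)",
            "theory_link": "Computational Thinking: abstraction, automation, analysis",
            "evidence": ["homework code artifacts"],
        },
        {
            "element": "in-class activities using computational methods",
            "theory_link": "Resources Framework: how students combine ideas in practice",
            "evidence": ["field notes from select class days"],
        },
    ],
}

for piece in project_map["intervention"]:
    print(f"{piece['element']} -> evidence: {', '.join(piece['evidence'])}")
```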

Gather evidence

To assess how her students are learning the three elements of computational thinking, Maria will look at their homework solutions. For progress in abstraction, she will look at how they represent physical quantities in their code. For progress in automation, she will look at how they structure their code. For progress in analysis, she will look at how they interpret their results for physical reasonableness. Because she thinks her students will be grappling with these ideas during class, Maria also plans to record their work during class.

The last piece of a DBR design is data. How will you know if your intervention is successful? In what ways does it succeed, and how should you make changes for the next iteration?

Your theory and design will help guide you about what kinds of evidence can show success. If you want students to learn more, for example, you will need to show what they knew coming in and what they know coming out; if you want them to have stronger problem solving skills, you need evidence of their problem solving skills; if you want them to feel like they belong in STEM, you need measures of belongingness. All of these are outcome measures of success: what is true at the end of the intervention?

Because DBR relies on iterations to make changes and improve the intervention, it is crucial that you also collect process measures of how your participants interact with your materials. It’s not enough to know that your materials are helpful (or not); you need to know how they use them so that you can improve them. Common process measures include observations or interviews; it’s also possible to look at artifacts like whiteboards or homeworks. Because you’re going to use this information to help you make decisions about what or how to change, it’s important that your data are as rich as possible: they should show processes of interaction among students and how they arrive at their answers, and include information like the problems students are working on as well as their solutions. You can’t just compare answers on the homework to answers on the final exam: that information doesn’t tell you about how they got to each answer.
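The sketch below pairs one common outcome measure with one process measure. The normalized-gain formula is a standard pre/post metric in education research (not specific to this article), and the rubric codes and numbers are invented for illustration.

```python
# Outcome measure: Hake-style normalized gain on a pre/post assessment.
# Process measure: counts of rubric codes assigned to homework code artifacts.
# All values are invented for illustration.
from collections import Counter

def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """(post - pre) / (max_score - pre): the fraction of possible improvement achieved."""
    return (post - pre) / (max_score - pre)

print(round(normalized_gain(pre=45, post=70), 2))   # 0.45

codes = ["abstraction", "automation", "automation", "analysis",
         "abstraction", "analysis", "analysis"]      # codes from one iteration's homework
print(Counter(codes))                                # which elements appear, and how often
```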

As you think about the design of each iteration, plan to collect evidence. Ask yourself:

  • What data will you collect? How will it be analyzed? Make sure this data is well-aligned with your theory.
  • Will some data streams be consistent in each iteration? Will some be unique to specific iterations?
  • How much data do you need to make good development decisions? To show success?

Dangers in data collection plans

For some people, it’s really tempting to collect all possible data because they don’t know what they’ll need, or because they think it might be interesting or useful later. A major danger is that they’ll spend so much time on data collection that they won’t have time for meaningful analysis.

Analyzing all of her classroom video and in-class assignments would take too much time before Maria needs to teach this course again. Maria decides to scale back: she will develop half as many problems at a time so that she can focus her analysis and development efforts each year.

If this is you, ask yourself: How much capacity do you have to collect and analyze data between iterations, so that you can make good choices? Which data streams are necessary, and which are “nice to have”? Judiciously pare out data streams that do not directly help you solve your design problem and speak to your theory.

For some people, their access to these learning environments constrains their data-taking abilities. Perhaps they cannot do classroom observations because of university policies, or they don’t have enough research staff to perform interviews.

If this is you, ask yourself: What information do I need to make evidence-informed choices about how to iterate? Can I get this information in another way? For example, instead of classroom observations you might ask faculty for their reflections on what went well, or instead of 20 interviews in person, you might have 2-3 shorter zoom interviews with a selection of stakeholders.

Classroom video won’t help Maria know what’s on her students’ screens. Maria decides to change her data stream: instead of classroom video, she asks a senior (who already took this class) to take field notes on select days.

Return to your map

Return to your map between design problem, theory, and intervention. For each element, identify which data you will collect and how you will analyze it in order to (a) show success; and (b) suggest iterative changes.

Doing your DBR project

The second part of a DBR project: trying an intervention, conducting research on learning, and redeveloping the intervention.  Include success criteria.

The second part of your DBR project is to iteratively try your intervention, conduct research on its effectiveness, and redevelop it. You’ll engage with these three activities cyclically several times until you’ve solved your design problem and developed new theory.

Hassan developed some simulations to help undergraduate biology students visualize ion transport through cell membranes. Students can control physical characteristics of the cell membrane as well as the environments inside and outside the cell. He would like to investigate how these simulations help students think about biochemistry concepts, and improve the simulations. Eventually, he would like to have other faculty use his simulations in their classes.

Hassan’s project really centers the development – and redevelopment – of his simulations. Even though he already has some simulations completed, design-based research is still a great choice for him as he considers how to make them better or develop new ones.

Rory has been teaching upper-level quantum mechanics for several years and always scans written homework, quizzes, and exams for their records before grading them. They have decided that they would like to use this written work to explore what resources students bring to quantum mechanics in order to design activities to build on these resources. They hope to publish their findings in The Physics Teacher so other instructors can also benefit from their work.

Rory’s project might be a good candidate for design-based research, if they want to do a research project. But it’s also possible that Rory isn’t super interested in doing research, and they just want to develop some great materials, then tell the world about them. If that’s true, they still need to engage in evaluation to make sure their materials are helpful for their students.

Additional topics to consider

IRB regulations

What regulations govern research with human subjects in the US, and what should you do about them?

Interviews: a primer

How to do interviews in education research: a primer

Data and Access

What are the common kinds of data in education research, and what are their affordances and constraints?

This article was first written on May 3, 2023, and last modified on February 8, 2024.


WHAT IS DESIGN-BASED RESEARCH?

Design-based research is a methodological approach that aligns with research methods from the fields of engineering or applied physics, where products are designed for specific purposes (Brown, 1992; Joseph, 2004; Middleton et al., 2008; Kelly, 2014). Consequently, investigators using design-based research approach educational inquiry much as an engineer develops a new product: First, the researchers identify a problem that needs to be addressed (e.g., a particular learning challenge that students face). Next, they design a potential “solution” to the problem in the form of instructional tools (e.g., reasoning strategies, worksheets; e.g., Reiser et al., 2001) that theory and previous research suggest will address the problem. Then, the researchers test the instructional tools in a real-world setting (i.e., the classroom) to see if the tools positively impact student learning. As testing proceeds, researchers evaluate the instructional tools with emerging evidence of their effectiveness (or lack thereof) and progressively revise the tools—in real time—as necessary (Collins et al., 2004). Finally, the researchers reflect on the outcomes of the experiment, identifying the features of the instructional tools that were successful at addressing the initial learning problem, revising those aspects that were not helpful to learning, and determining how the research informed the theory underlying the experiment. This leads to another research cycle of designing, testing, evaluating, and reflecting to refine the instructional tools in support of student learning. We have characterized this iterative process in Figure 1 after Sandoval (2014). Though we have portrayed four discrete phases to design-based research, there is often overlap of the phases as the research progresses (e.g., testing and evaluating can occur simultaneously).

Figure 1. The four phases of design-based research experienced in an iterative cycle (A). We also highlight the main features of each phase of our design-based research project investigating students’ use of flux in physiology (B).

Design-based research has no specific requirements for the form that instructional tools must take or the manner in which the tools are evaluated (Bell, 2004; Anderson and Shattuck, 2012). Instead, design-based research has what Sandoval (2014) calls “epistemic commitments” 1 that inform the major goals of a design-based research project as well as how it is implemented. These epistemic commitments are:

1) Design-based research should be grounded in theories of learning (e.g., constructivism, knowledge-in-pieces, conceptual change) that both inform the design of the instructional tools and are improved upon by the research (Cobb et al., 2003; Barab and Squire, 2004). This makes design-based research more than a method for testing whether or not an instructional tool works; it also investigates why the design worked and how it can be generalized to other learning environments (Cobb et al., 2003).

2) Design-based research should aim to produce measurable changes in student learning in classrooms around a particular learning problem (Anderson and Shattuck, 2012; McKenney and Reeves, 2013). This requirement ensures that theoretical research into student learning is directly applicable, and impactful, to students and instructors in classroom settings (Hoadley, 2004).

3) Design-based research should generate design principles that guide the development and implementation of future instructional tools (Edelson, 2002). This commitment makes the research findings broadly applicable for use in a variety of classroom environments.

4) Design-based research should be enacted using extended, iterative teaching experiments in classrooms. By observing student learning over an extended period of time (e.g., throughout an entire term or across terms), researchers are more likely to observe the full effects of how the instructional tools impact student learning compared with short-term experiments (Brown, 1992; Barab and Squire, 2004; Sandoval and Bell, 2004).

HOW IS DESIGN-BASED RESEARCH DIFFERENT FROM AN EXPERIMENTAL APPROACH?

Many BER studies employ experimental approaches that align with traditional scientific methods of experimentation, such as using treatment versus control groups, randomly assigning treatments to different groups, replicating interventions across multiple spatial or temporal periods, and using statistical methods to guide the kinds of inferences that arise from an experiment. While design-based research can similarly employ these strategies for educational inquiry, there are also some notable differences in its approach to experimentation ( Collins et al. , 2004 ; Hoadley, 2004 ). In this section, we contrast design-based research with what we call “experimental approaches,” although both paradigms represent a form of experimentation.

The first difference between an experimental approach and design-based research regards the role participants play in the experiment. In an experimental approach, the researcher is responsible for making all the decisions about how the experiment will be implemented and analyzed, while the instructor facilitates the experimental treatments. In design-based research, both researchers and instructors are engaged in all stages of the research from conception to reflection ( Collins et al. , 2004 ). In BER, a third condition frequently arises wherein the researcher is also the instructor. In this case, if the research questions being investigated produce generalizable results that have the potential to impact teaching broadly, then this is consistent with a design-based research approach ( Cobb et al. , 2003 ). However, when the research questions are self-reflective about how a researcher/instructor can improve his or her own classroom practices, this aligns more closely with “action research,” which is another methodology used in education research (see Stringer, 2013 ).

A second difference between experimental research and design-based research is the form that hypotheses take and the manner in which they are investigated ( Collins et al. , 2004 ; Sandoval, 2014 ). In experimental approaches, researchers develop a hypothesis about how a specific instructional intervention will impact student learning. The intervention is then tested in the classroom(s) while controlling for other variables that are not part of the study in order to isolate the effects of the intervention. Sometimes, researchers designate a “control” situation that serves as a comparison group that does not experience the intervention. For example, Jackson et al. (2018) were interested in comparing peer- and self-grading of weekly practice exams to determine whether they were equally effective forms of deliberate practice for students in a large-enrollment class. To test this, the authors (including authors of this essay J.H.D., M.P.W.) designed an experiment in which lab sections of students in a large lecture course were randomly assigned to either a peer-grading or self-grading treatment so they could isolate the effects of each intervention. In design-based research, a hypothesis is conceptualized as the “design solution” rather than a specific intervention; that is, design-based researchers hypothesize that the designed instructional tools, when implemented in the classroom, will create a learning ecology that improves student learning around the identified learning problem ( Edelson, 2002 ; Bell, 2004 ). For example, Zagallo et al. (2016) developed a laboratory curriculum (i.e., the hypothesized “design solution”) for molecular and cellular biology majors to address the learning problem that students often struggle to connect scientific models and empirical data. This curriculum entailed: focusing instruction around a set of target biological models; developing small-group activities in which students interacted with the models by analyzing data from scientific papers; using formative assessment tools for student feedback; and providing students with a set of learning objectives they could use as study tools. They tested their curriculum in a novel, large-enrollment course of upper-division students over several years, making iterative changes to the curriculum as the study progressed.

By framing the research approach as an iterative endeavor of progressive refinement rather than a test of a particular intervention when all other variables are controlled, design-based researchers recognize that: 1) classrooms, and classroom experiences, are unique at any given time, making it difficult to truly “control” the environment in which an intervention occurs or establish a “control group” that differs only in the features of an intervention; and 2) many aspects of a classroom experience may influence the effectiveness of an intervention, often in unanticipated ways, which should be included in the research team’s analysis of an intervention’s success. Consequently, the research team is less concerned with controlling the research conditions—as in an experimental approach—and instead focuses on characterizing the learning environment ( Barab and Squire, 2004 ). This involves collecting data from multiple sources as the research progresses, including how the instructional tools were implemented, aspects of the implementation process that failed to go as planned, and how the instructional tools or implementation process was modified. These characterizations can provide important insights into what specific features of the instructional tools, or the learning environment, were most impactful to learning ( DBR Collective, 2003 ).

A third difference between experimental approaches and design-based research is when the instructional interventions can be modified. In experimental research, the intervention is fixed throughout the experimental period, with any revisions occurring only after the experiment has concluded. This is critical for ensuring that the results of the study provide evidence of the efficacy of a specific intervention. By contrast, design-based research takes a more flexible approach that allows instructional tools to be modified in situ as they are being implemented ( Hoadley, 2004 ; Barab, 2014 ). This flexibility allows the research team to modify instructional tools or strategies that prove inadequate for collecting the evidence necessary to evaluate the underlying theory and ensures a tight connection between interventions and a specific learning problem ( Collins et al. , 2004 ; Hoadley, 2004 ).

Finally, and importantly, experimental approaches and design-based research differ in the kinds of conclusions they draw from their data. Experimental research can “identify that something meaningful happened; but [it is] not able to articulate what about the intervention caused that story to unfold” ( Barab, 2014 , p. 162). In other words, experimental methods are robust for identifying where differences in learning occur, such as between groups of students experiencing peer- or self-grading of practice exams ( Jackson et al. , 2018 ) or receiving different curricula (e.g., Chi et al. , 2012 ). However, these methods are not able to characterize the underlying learning process or mechanism involved in the different learning outcomes. By contrast, design-based research has the potential to uncover mechanisms of learning, because it investigates how the nature of student thinking changes as students experience instructional interventions ( Shavelson et al. , 2003 ; Barab, 2014 ). According to Sandoval (2014) , “Design research, as a means of uncovering causal processes, is oriented not to finding effects but to finding functions , to understanding how desired (and undesired) effects arise through interactions in a designed environment” (p. 30). In Zagallo et al. (2016) , the authors found that their curriculum supported students’ data-interpretation skills, because it stimulated students’ spontaneous use of argumentation during which group members coconstructed evidence-based claims from the data provided. Students also worked collaboratively to decode figures and identify data patterns. These strategies were identified from the researchers’ qualitative data analysis of in-class recordings of small-group discussions, which allowed them to observe what students were doing to support their learning. Because design-based research is focused on characterizing how learning occurs in classrooms, it can begin to answer the kinds of mechanistic questions others have identified as central to advancing BER ( National Research Council [NRC], 2012 ; Dolan, 2015 ; Lo et al. , 2019 ).

DESIGN-BASED RESEARCH IN ACTION: AN EXAMPLE FROM UNDERGRADUATE PHYSIOLOGY

To illustrate how design-based research could be employed in BER, we draw on our own research that investigates how students learn physiology. We will characterize one iteration of our design-based research cycle ( Figure 1 ), emphasizing how our project uses Sandoval’s four epistemic commitments (i.e., theory driven, practically applied, generating design principles, implemented in an iterative manner) to guide our implementation.

Identifying the Learning Problem

Understanding physiological phenomena is challenging for students, given the wide variety of contexts (e.g., cardiovascular, neuromuscular, respiratory; animal vs. plant) and scales involved (e.g., using molecular-level interactions to explain organism functioning; Wang, 2004 ; Michael, 2007 ; Badenhorst et al. , 2016 ). To address these learning challenges, Modell (2000) identified seven “general models” that undergird most physiology phenomena (i.e., control systems, conservation of mass, mass and heat flow, elastic properties of tissues, transport across membranes, cell-to-cell communication, molecular interactions). Instructors can use these models as a “conceptual framework” to help students build intellectual coherence across phenomena and develop a deeper understanding of physiology ( Modell, 2000 ; Michael et al. , 2009 ). This approach aligns with theoretical work in the learning sciences that indicates that providing students with conceptual frameworks improves their ability to integrate and retrieve knowledge ( National Academies of Sciences, Engineering, and Medicine, 2018 ).

Before the start of our design-based project, we had been using Modell’s (2000) general models to guide our instruction. In this essay, we will focus on how we used the general models of mass and heat flow and transport across membranes in our instruction. These two models together describe how materials flow down gradients (e.g., pressure gradients, electrochemical gradients) against sources of resistance (e.g., tube diameter, channel frequency). We call this flux reasoning. We emphasized the fundamental nature and broad utility of flux reasoning in lecture and lab and frequently highlighted when it could be applied to explain a phenomenon. We also developed a conceptual scaffold (the Flux Reasoning Tool) that students could use to reason about physiological processes involving flux.

Although these instructional approaches had improved students’ understanding of flux phenomena, we found that students often demonstrated little commitment to using flux broadly across physiological contexts. Instead, they considered flux to be just another fact to memorize and applied it to narrow circumstances (e.g., they would use flux to reason about ions flowing across membranes—the context where flux was first introduced—but not the bulk flow of blood in a vessel). Students also struggled to integrate the various components of flux (e.g., balancing chemical and electrical gradients, accounting for variable resistance). We saw these issues reflected in students’ lower than hoped for exam scores on the cumulative final of the course. From these experiences, and from conversations with other physiology instructors, we identified a learning problem to address through design-based research: How do students learn to use flux reasoning to explain material flows in multiple physiology contexts?

The process of identifying a learning problem usually emerges from a researcher’s own experiences (in or outside a classroom) or from previous research that has been described in the literature ( Cobb et al. , 2003 ). To remain true to Sandoval’s first epistemic commitment, a learning problem must advance a theory of learning ( Edelson, 2002 ; McKenney and Reeves, 2013 ). In our work, we investigated how conceptual frameworks based on fundamental scientific concepts (i.e., Modell’s general models) could help students reason productively about physiology phenomena (National Academies of Sciences, Engineering, and Medicine, 2018; Modell, 2000 ). Our specific theoretical question was: Can we characterize how students’ conceptual frameworks around flux change as they work toward robust ideas? Sandoval’s second epistemic commitment stated that a learning problem must aim to improve student learning outcomes. The practical significance of our learning problem was: Does using the concept of flux as a foundational idea for instructional tools increase students’ learning of physiological phenomena?

We investigated our learning problem in an introductory biology course at a large R1 institution. The introductory course is the third in a biology sequence that focuses on plant and animal physiology. The course typically serves between 250 and 600 students in their sophomore or junior years each term. Classes have the following average demographics: 68% male, 21% from lower-income situations, 12% from an underrepresented minority, and 26% first-generation college students.

Design-Based Research Cycle 1, Phase 1: Designing Instructional Tools

The first phase of design-based research involves developing instructional tools that address both the theoretical and practical concerns of the learning problem ( Edelson, 2002 ; Wang and Hannafin, 2005 ). These instructional tools can take many forms, such as specific instructional strategies, classroom worksheets and practices, or technological software, as long as they embody the underlying learning theory being investigated. They must also produce classroom experiences or materials that can be evaluated to determine whether learning outcomes were met ( Sandoval, 2014 ). Indeed, this alignment between theory, the nature of the instructional tools, and the ways students are assessed is central to ensuring rigorous design-based research ( Hoadley, 2004 ; Sandoval, 2014 ). Taken together, the instructional tools instantiate a hypothesized learning environment that will advance both the theoretical and practical questions driving the research ( Barab, 2014 ).

In our work, the theoretical claim that instruction based on fundamental scientific concepts would support students’ flux reasoning was embodied in our instructional approach by being the central focus of all instructional materials, which included: a revised version of the Flux Reasoning Tool ( Figure 2 ); case study–based units in lecture that explicitly emphasized flux phenomena in real-world contexts ( Windschitl et al. , 2012 ; Scott et al. , 2018 ; Figure 3 ); classroom activities in which students practiced using flux to address physiological scenarios; links to online videos describing key flux-related concepts; constructed-response assessment items that cued students to use flux reasoning in their thinking; and pretest/posttest formative assessment questions that tracked student learning ( Figure 4 ).

FIGURE 2. The Flux Reasoning Tool given to students at the beginning of the quarter.

FIGURE 3. An example flux case study that is presented to students at the beginning of the neurophysiology unit. Throughout the unit, students learn how the flow of ions into and out of cells, as mediated by chemical and electrical gradients and various ion/molecular channels, sends signals throughout the body. They use this information to better understand why Jaime experiences persistent neuropathy. Images from: uz.wikipedia.org/wiki/Fayl:Blausen_0822_SpinalCord.png and commons.wikimedia.org/wiki/File:Figure_38_01_07.jpg.

FIGURE 4. An example flux assessment question about ion flows given in a pre-unit/post-unit formative assessment in the neurophysiology unit.

Phase 2: Testing the Instructional Tools

In the second phase of design-based research, the instructional tools are tested by implementing them in classrooms. During this phase, the instructional tools are placed “in harm’s way … in order to expose the details of the process to scrutiny” ( Cobb et al. , 2003 , p. 10). In this way, researchers and instructors test how the tools perform in real-world settings, which may differ considerably from the design team’s initial expectations ( Hoadley, 2004 ). During this phase, if necessary, the design team may make adjustments to the tools as they are being used to account for these unanticipated conditions ( Collins et al. , 2004 ).

We implemented the instructional tools during the Autumn and Spring quarters of the 2016–2017 academic year. Students were taught to use the Flux Reasoning Tool at the beginning of the term in the context of the first case study unit focused on neurophysiology. Each physiology unit throughout the term was associated with a new concept-based case study (usually about flux) that framed the context of the teaching. Embedded within the daily lectures were classroom activities in which students could practice using flux. Students were also assigned readings from the textbook and videos related to flux to watch during each unit. Throughout the term, students took five exams that each contained some flux questions as well as some pre- and post-unit formative assessment questions. During Winter quarter, we conducted clinical interviews with students who would take our course in the Spring term (i.e., “pre” data) as well as students who had just completed our course in Autumn (i.e., “post” data).

Phase 3: Evaluating the Instructional Tools

The third phase of a design-based research cycle involves evaluating the effectiveness of instructional tools using evidence of student learning ( Barab and Squire, 2004 ; Anderson and Shattuck, 2012 ). This can be done using products produced by students (e.g., homework, lab reports), attitudinal gains measured with surveys, participation rates in activities, interview testimonials, classroom discourse practices, and formative assessment or exam data (e.g., Reiser et al. , 2001 ; Cobb et al. , 2003 ; Barab and Squire, 2004 ; Mohan et al. , 2009 ). Regardless of the source, evidence must be in a form that supports a systematic analysis that could be scrutinized by other researchers ( Cobb et al. , 2003 ; Barab, 2014 ). Also, because design-based research often involves multiple data streams, researchers may need to use both quantitative and qualitative analytical methods to produce a rich picture of how the instructional tools affected student learning ( Collins et al. , 2004 ; Anderson and Shattuck, 2012 ).

In our work, we used the quality of students’ written responses on exams and formative assessment questions to determine whether students improved their understanding of physiological phenomena involving flux. For each assessment question, we analyzed a subset of students’ pretest answers to identify overarching patterns in students’ reasoning about flux, characterized these overarching patterns, then ordinated the patterns into different levels of sophistication. These became our scoring rubrics, which identified five different levels of student reasoning about flux. We used the rubrics to code the remainder of students’ responses, with a code designating the level of student reasoning associated with a particular reasoning pattern. We used this ordinal rubric format because it would later inform our theoretical understanding of how students build flux conceptual frameworks (see phase 4). This also allowed us to both characterize the ideas students held about flux phenomena and identify the frequency distribution of those ideas in a class.

By analyzing changes in the frequency distributions of students’ ideas across the rubric levels at different time points in the term (e.g., pre-unit vs. post-unit), we could track both the number of students who gained more sophisticated ideas about flux as the term progressed and the quality of those ideas. If the frequency of students reasoning at higher levels increased from pre-unit to post-unit assessments, we could conclude that our instructional tools as a whole were supporting students’ development of sophisticated flux ideas. For example, on one neuromuscular ion flux assessment question in the Spring of 2017, we found that relatively more students were reasoning at the highest levels of our rubric (i.e., levels 4 and 5) on the post-unit test compared with the pre-unit test. This meant that more students were beginning to integrate sophisticated ideas about flux (i.e., they were balancing concentration and electrical gradients) in their reasoning about ion movement.
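To make this kind of analysis concrete, the minimal sketch below (in Python) tallies rubric codes assigned to the same item before and after a unit and reports the share of responses at the highest levels. The level codes and response counts here are invented for illustration only; they are not our study data.

```python
from collections import Counter

# Hypothetical rubric levels (1-5) assigned to each student's written response
# on the same ion-flux item before and after the neurophysiology unit.
pre_unit_levels = [1, 2, 2, 3, 1, 2, 3, 4, 2, 3]
post_unit_levels = [2, 3, 4, 4, 5, 3, 4, 5, 3, 4]

def level_distribution(levels, n_levels=5):
    """Return the fraction of responses coded at each rubric level."""
    counts = Counter(levels)
    total = len(levels)
    return {level: counts.get(level, 0) / total for level in range(1, n_levels + 1)}

pre_dist = level_distribution(pre_unit_levels)
post_dist = level_distribution(post_unit_levels)

# Fraction of students reasoning at levels 4 or 5, used here as a rough
# indicator of more sophisticated flux reasoning.
pre_high = pre_dist[4] + pre_dist[5]
post_high = post_dist[4] + post_dist[5]

print("Pre-unit distribution: ", pre_dist)
print("Post-unit distribution:", post_dist)
print(f"Level 4 or 5: {pre_high:.0%} pre-unit vs. {post_high:.0%} post-unit")
```

In practice, the same tallies would be computed from rubric-coded responses for each assessment item and time point, and the pre/post comparison would be interpreted alongside the other data streams described below.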

To help validate this finding, we drew on three additional data streams: 1) from in-class group recordings of students working with flux items, we noted that students increasingly incorporated ideas about gradients and resistance when constructing their explanations as the term progressed; 2) from plant assessment items in the latter part of the term, we began to see students using flux ideas unprompted; and 3) from interviews, we observed that students who had already taken the course used flux ideas in their reasoning.

Through these analyses, we also noticed an interesting pattern in the pre-unit test data for Spring 2017 when compared with the frequency distribution of students’ responses from a previous term (Autumn 2016). In Spring 2017, 42% of students reasoned at level 4 or 5 on the pre-unit test, indicating these students already had sophisticated ideas about ion flux before they took the pre-unit assessment. This was surprising, considering only 2% of students reasoned at these levels for this item on the Autumn 2016 pre-unit test.

Phase 4: Reflecting on the Instructional Tools and Their Implementation

The final phase of a design-based research cycle involves a retrospective analysis that addresses the epistemic commitments of this methodology: How was the theory underpinning the research advanced by the research endeavor (theoretical outcome)? Did the instructional tools support student learning about the learning problem (practical outcome)? What were the critical features of the design solution that supported student learning (design principles)? ( Cobb et al. , 2003 ; Barab and Squire, 2004 ).

Theoretical Outcome (Epistemic Commitment 1).

Reflecting on how a design-based research experiment advances theory is critical to our understanding of how students learn in educational settings ( Barab and Squire, 2004 ; Mohan et al. , 2009 ). In our work, we aimed to characterize how students’ conceptual frameworks around flux change as they work toward robust ideas. To do this, we drew on learning progression research as our theoretical framing ( NRC, 2007 ; Corcoran et al. , 2009 ; Duschl et al. , 2011 ; Scott et al. , 2019 ). Learning progression frameworks describe empirically derived patterns in student thinking that are ordered into levels representing cognitive shifts in the ways students conceive a topic as they work toward mastery ( Gunckel et al. , 2012 ). We used our ion flux scoring rubrics to create a preliminary five-level learning progression framework ( Table 1 ). The framework describes how students’ ideas about flux often start with teleological-driven accounts at the lowest level (i.e., level 1), shift to focusing on driving forces (e.g., concentration gradients, electrical gradients) in the middle levels, and arrive at complex ideas that integrate multiple interacting forces at the higher levels. We further validated these reasoning patterns with our student interviews. However, our flux conceptual framework was largely based on student responses to our ion flux assessment items. Therefore, to further validate our learning progression framework, we needed a greater diversity of flux assessment items that investigated student thinking more broadly (i.e., about bulk flow, water movement) across physiological systems.

TABLE 1. The preliminary flux learning progression framework characterizing the patterns of reasoning students may exhibit as they work toward mastery of flux reasoning. The student exemplars are from the ion flux formative assessment question presented in Figure 4 . The “/” divides a student’s answers to the first and second parts of the question. Level 5 represents the most sophisticated ideas about flux phenomena.
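As a rough illustration of how an ordinal framework like this can be organized for coding purposes, the sketch below paraphrases the general pattern described above as one-line level summaries. The wording, and the distinctions between adjacent levels, are our own assumptions for illustration; the actual framework and student exemplars are in Table 1.

```python
# Illustrative, paraphrased summaries of a five-level flux learning progression:
# level 1 reflects teleological accounts, middle levels focus on driving forces,
# and the highest levels integrate multiple interacting forces and resistance.
FLUX_LEVEL_SUMMARIES = {
    1: "Teleological account: flow happens because the cell/organism needs it to",
    2: "A single driving force is named but applied inconsistently",
    3: "Driving forces (e.g., a concentration gradient) explain the direction of flow",
    4: "Competing gradients (concentration vs. electrical) are weighed against each other",
    5: "Multiple interacting forces and resistance are integrated across contexts",
}

def describe_level(level: int) -> str:
    """Return the working description used when assigning a rubric level."""
    return FLUX_LEVEL_SUMMARIES[level]

if __name__ == "__main__":
    for level in sorted(FLUX_LEVEL_SUMMARIES):
        print(level, "-", describe_level(level))
```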

Practical Outcome (Epistemic Commitment 2).

In design-based research, learning theories must “do real work” by improving student learning in real-world settings ( DBR Collective, 2003 ). Therefore, design-based researchers must reflect on whether or not the data they collected show evidence that the instructional tools improved student learning ( Cobb et al. , 2003 ; Sharma and McShane, 2008 ). We determined whether our flux-based instructional approach aided student learning by analyzing the kinds of answers students provided to our assessment questions. Specifically, we considered students who reasoned at level 4 or above as demonstrating productive flux reasoning. Because almost half of students were reasoning at level 4 or 5 on the post-unit assessment after experiencing the instructional tools in the neurophysiology unit (in Spring 2017), we concluded that our tools supported student learning in physiology. Additionally, we noticed that students used language in their explanations that directly tied to the Flux Reasoning Tool ( Figure 2 ), which instructed them to use arrows to indicate the magnitude and direction of gradient-driving forces. For example, in a posttest response to our ion flux item ( Figure 4 ), one student wrote:

Ion movement is a function of concentration and electrical gradients . Which arrow is stronger determines the movement of K+. We can make the electrical arrow bigger and pointing in by making the membrane potential more negative than Ek [i.e., potassium’s equilibrium potential]. We can make the concentration arrow bigger and pointing in by making a very strong concentration gradient pointing in.

Given that almost half of students reasoned at level 4 or above, and that students used language from the Flux Reasoning Tool, we concluded that using fundamental concepts was a productive instructional approach for improving student learning in physiology and that our instructional tools aided student learning. However, some students in the 2016–2017 academic year continued to apply flux ideas more narrowly than intended (i.e., for ion and simple diffusion cases, but not water flux or bulk flow). This suggested that students had developed nascent flux conceptual frameworks after experiencing the instructional tools but could use more support to realize the broad applicability of this principle. Also, although our cross-sectional interview approach demonstrated how students’ ideas, overall, could change after experiencing the instructional tools, it did not provide information about how a student developed flux reasoning.

Reflecting on practical outcomes also means interpreting any learning gains in the context of the learning ecology. This reflection allowed us to identify whether there were particular aspects of the instructional tools that were better at supporting learning than others ( DBR Collective, 2003 ). Indeed, this was critical for understanding why 42% of students scored at level 4 or above on the pre-unit ion assessment in the Spring of 2017, while only 2% of students scored at these levels in Autumn of 2016. When we reviewed notes of the Spring 2017 implementation scheme, we saw that the pretest was due at the end of the first day of class, after students had been exposed to ion flux ideas in class and in a reading/video assignment about ion flow, which may be one reason for the students’ high performance on the pretest. Consequently, we could not tell whether students’ initial high performance was due to their learning from the activities on the first day of class or to other reasons we did not measure. It also indicated we needed to close pretests before the first day of class for a more accurate measure of students’ incoming ideas and the effectiveness of the instructional tools employed at the beginning of the unit.

Design Principles (Epistemic Commitment 3).

Although design-based research is enacted in local contexts (i.e., a particular classroom), its purpose is to inform learning ecologies that have broad applications to improve learning and teaching ( Edelson, 2002 ; Cobb et al. , 2003 ). Therefore, design-based research should produce design principles that describe characteristics of learning environments that researchers and instructors can use to develop instructional tools specific to their local contexts (e.g., Edelson, 2002 ; Subramaniam et al. , 2015 ). Consequently, the design principles must balance specificity with adaptability so they can be used broadly to inform instruction ( Collins et al. , 2004 ; Barab, 2014 ).

From our first cycle of design-based research, we developed the following design principles: 1) Key scientific concepts should provide an overarching framework for course organization. This way, the individual components that make up a course, like instructional units, activities, practice problems, and assessments, all reinforce the centrality of the key concept. 2) Instructional tools should explicitly articulate the principle of interest, with specific guidance on how that principle is applied in context. This stresses the applied nature of the principle and that it is more than a fact to be memorized. 3) Instructional tools need to show specific instances of how the principle is applied in multiple contexts to combat students’ narrow application of the principle to a limited number of contexts.

Design-Based Research Cycle 2, Phase 1: Redesign and Refine the Experiment

The last “epistemic commitment” Sandoval (2014) articulated was that design-based research be an iterative process with an eye toward continually refining the instructional tools, based on evidence of student learning, to produce more robust learning environments. By viewing educational inquiry as formative research, design-based researchers recognize the difficulty in accounting for all variables that could impact student learning, or the implementation of the instructional tools, a priori ( Collins et al. , 2004 ). Robust instructional designs are the products of trial and error, which are strengthened by a systematic analysis of how they perform in real-world settings.

To continue to advance our work investigating student thinking using the principle of flux, we began a second cycle of design-based research that continued to address the learning problem of helping students reason with fundamental scientific concepts. In this cycle, we largely focused on broadening the number of physiological systems that had accompanying formative assessment questions (i.e., beyond ion flux), collecting student reasoning from a more diverse population of students (e.g., upper division, allied health, community college), and refining and validating the flux learning progression with both written and interview data from individual students through time. We developed a suite of constructed-response flux assessment questions that spanned neuromuscular, cardiovascular, respiratory, renal, and plant physiological contexts and asked students about several kinds of flux: ion movement, diffusion, water movement, and bulk flow (29 total questions; available at beyondmultiplechoice.org). This would provide us with rich qualitative data that we could use to refine the learning progression. We decided to administer written assessments and conduct interviews in a pretest/posttest manner at the beginning and end of each unit, both as a way to increase our data about student reasoning and to provide students with additional practice using flux reasoning across contexts.

From this second round of designing instructional tools (i.e., broader range of assessment items), testing them in the classroom (i.e., administering the assessment items to diverse student populations), evaluating the tools (i.e., developing learning progression–aligned rubrics across phenomena from student data, tracking changes in the frequency distribution of students across levels through time), and reflecting on the tools’ success, we would develop a more thorough and robust characterization of how students use flux across systems that could better inform our creation of new instructional tools to support student learning.

HOW CAN DESIGN-BASED RESEARCH EXTEND AND ENRICH BER?

While design-based research has primarily been used in educational inquiry at the K–12 level (see Reiser et al. , 2001 ; Mohan et al. , 2009 ; Jin and Anderson, 2012 ), other science disciplines at undergraduate institutions have begun to employ this methodology to create robust instructional approaches (e.g., Szteinberg et al. , 2014 in chemistry; Hake, 2007 , and Sharma and McShane, 2008 , in physics; Kelly, 2014 , in engineering). Our own work, as well as that by Zagallo et al. (2016) , provides two examples of how design-based research could be implemented in BER. Below, we articulate some of the ways incorporating design-based research into BER could extend and enrich this field of educational inquiry.

Design-Based Research Connects Theory with Practice

One critique of BER is that it does not draw heavily enough on learning theories from other disciplines like cognitive psychology or the learning sciences to inform its research ( Coley and Tanner, 2012 ; Dolan, 2015 ; Peffer and Renken, 2016 ; Davidesco and Milne, 2019 ). For example, there has been considerable work in BER developing concept inventories as formative assessment tools that identify concepts students often struggle to learn (e.g., Marbach-Ad et al. , 2009 ; McFarland et al. , 2017 ; Summers et al. , 2018 ). However, much of this work is detached from a theoretical understanding of why students hold misconceptions in the first place, what the nature of their thinking is, and the learning mechanisms that would move students to a more productive understanding of domain ideas ( Alonzo, 2011 ). Using design-based research to understand the basis of students’ misconceptions would ground these practical learning problems in a theoretical understanding of the nature of student thinking (e.g., see Coley and Tanner, 2012 , 2015 ; Gouvea and Simon, 2018 ) and the kinds of instructional tools that would best support the learning process.

Design-Based Research Fosters Collaborations across Disciplines

Recently, there have been multiple calls across science, technology, engineering, and mathematics education fields to increase collaborations between BER and other disciplines so as to increase the robustness of science education research at the collegiate level ( Coley and Tanner, 2012 ; NRC, 2012 ; Talanquer, 2014 ; Dolan, 2015 ; Peffer and Renken, 2016 ; Mestre et al. , 2018 ; Davidesco and Milne, 2019 ). Engaging in design-based research provides both a mechanism and a motivation for fostering interdisciplinary collaborations, as it requires the design team to have theoretical knowledge of how students learn, domain knowledge of practical learning problems, and instructional knowledge for how to implement instructional tools in the classroom ( Edelson, 2002 ; Hoadley, 2004 ; Wang and Hannafin, 2005 ; Anderson and Shattuck, 2012 ). For example, in our current work, our research team consists of two discipline-based education learning scientists from an R1 institution, two physiology education researchers/instructors (one from an R1 institution, the other from a community college), several physiology disciplinary experts/instructors, and a K–12 science education expert.

Design-based research collaborations have several distinct benefits for BER. First, learning or cognitive scientists could provide theoretical and methodological expertise that may be unfamiliar to biology education researchers with traditional science backgrounds ( Lo et al. , 2019 ). This would both improve the rigor of the research project and provide biology education researchers with the opportunity to explore ideas and methods from other disciplines. Second, collaborations between researchers and instructors could help increase the implementation of evidence-based teaching practices by instructors/faculty who are not education researchers and would benefit from support while shifting their instructional approaches ( Eddy et al. , 2015 ). This may be especially true for community college and primarily undergraduate institution faculty, who often do not have access to the same kinds of resources that researchers and instructors at research-intensive institutions do ( Schinske et al. , 2017 ). Third, making instructors an integral part of a design-based research project ensures they are well versed in the theory and learning objectives underlying the instructional tools they are implementing in the classroom. This can improve the fidelity of implementation of the instructional tools, because the instructors understand the tools’ theoretical and practical purposes; low fidelity of implementation has been cited as one reason for the mixed results on the impact of active learning across biology classes ( Andrews et al. , 2011 ; Borrego et al. , 2013 ; Lee et al. , 2018 ; Offerdahl et al. , 2018 ). It also gives instructors agency to make informed adjustments to the instructional tools during implementation that improve their practical applications while remaining true to the goals of the research ( Hoadley, 2004 ).

Design-Based Research Invites Using Mixed Methods to Analyze Data

The diverse nature of the data that are often collected in design-based research can require both qualitative and quantitative methodologies to produce a rich picture of how the instructional tools and their implementation influenced student learning ( Anderson and Shattuck, 2012 ). Using mixed methods may be less familiar to biology education researchers who were primarily trained in quantitative methods as biologists ( Lo et al. , 2019 ). However, according to Warfa (2016 , p. 2), “Integration of research findings from quantitative and qualitative inquiries in the same study or across studies maximizes the affordances of each approach and can provide better understanding of biology teaching and learning than either approach alone.” Although the number of BER studies using mixed methods has increased over the past decade ( Lo et al. , 2019 ), engaging in design-based research could further this trend through its collaborative nature of bringing social scientists together with biology education researchers to share research methodologies from different fields. By leveraging qualitative and quantitative methods, design-based researchers unpack “mechanism and process” by characterizing the nature of student thinking rather than “simply reporting that differences did or did not occur” ( Barab, 2014 , p. 158), which is important for continuing to advance our understanding of student learning in BER ( Dolan, 2015 ; Lo et al. , 2019 ).

CHALLENGES TO IMPLEMENTING DESIGN-BASED RESEARCH IN BER

As with any methodological approach, there can be challenges to implementing design-based research. Here, we highlight three that may be relevant to BER.

Collaborations Can Be Difficult to Maintain

While collaborations between researchers and instructors offer many affordances (as discussed earlier), the reality of connecting researchers across departments and institutions can be challenging. For example, Peffer and Renken (2016) noted that different traditions of scholarship can present barriers to collaboration where there is not mutual respect for the methods and ideas that are part and parcel to each discipline. Additionally, Schinske et al. (2017) identified several constraints that community college faculty face for engaging in BER, such as limited time or support (e.g., infrastructural, administrative, and peer support), which could also impact their ability to form the kinds of collaborations inherent in design-based research. Moreover, the iterative nature of design-based research requires these collaborations to persist for an extended period of time. Attending to these challenges is an important part of forming the design team and identifying the different roles researchers and instructors will play in the research.

Design-Based Research Experiments Are Resource Intensive

The focus of design-based research on studying learning ecologies to uncover mechanisms of learning requires that researchers collect multiple data streams through time, which often necessitates significant temporal and financial resources ( Collins et al., 2004 ; O’Donnell, 2004 ). Consequently, researchers must weigh both practical as well as methodological considerations when formulating their experimental design. For example, investigating learning mechanisms requires that researchers collect data at a frequency that will capture changes in student thinking ( Siegler, 2006 ). However, researchers may be constrained in the number of data-collection events they can anticipate depending on: the instructor’s ability to facilitate in-class collection events or solicit student participation in extracurricular activities (e.g., interviews); the cost of technological devices to record student conversations; the time and logistical considerations needed to schedule and conduct student interviews; the financial resources available to compensate student participants; and the financial and temporal costs associated with analyzing large amounts of data.

Identifying learning mechanisms also requires in-depth analyses of qualitative data as students experience various instructional tools (e.g., microgenetic methods; Flynn et al. , 2006 ; Siegler, 2006 ). The high intensity of these in-depth analyses often limits the number of students who can be evaluated in this way, which must be balanced with the kinds of generalizations researchers wish to make about the effectiveness of the instructional tools ( O’Donnell, 2004 ). Because of the large variety of data streams that could be collected in a design-based research experiment—and the resources required to collect and analyze them—it is critical that the research team identify a priori how specific data streams, and the methods of their analysis, will provide the evidence necessary to address the theoretical and practical objectives of the research (see the following section on experimental rigor; Sandoval, 2014 ). These are critical management decisions because of the need for a transparent, systematic analysis of the data that others can scrutinize to evaluate the validity of the claims being made ( Cobb et al. , 2003 ).

Concerns with Experimental Rigor

The nature of design-based research, with its use of narrative to characterize versus control experimental environments, has drawn concerns about the rigor of this methodological approach. Some have challenged its ability to produce evidence-based warrants to support its claims of learning that can be replicated and critiqued by others ( Shavelson et al. , 2003 ; Hoadley, 2004 ). This is a valid concern that design-based researchers, and indeed all education researchers, must address to ensure their research meets established standards for education research ( NRC, 2002 ).

One way design-based researchers address this concern is by “specifying theoretically salient features of a learning environment design and mapping out how they are predicted to work together to produce desired outcomes” ( Sandoval, 2014 , p. 19). Through this process, researchers explicitly show before they begin the work how their theory of learning is embodied in the instructional tools to be tested, the specific data the tools will produce for analysis, and what outcomes will be taken as evidence for success. Moreover, by allowing instructional tools to be modified during the testing phase as needed, design-based researchers acknowledge that it is impossible to anticipate all aspects of the classroom environment that might impact the implementation of instructional tools, “as dozens (if not millions) of factors interact to produce the measurable outcomes related to learning” ( Hoadley, 2004 , p. 204; DBR Collective, 2003 ). Consequently, modifying instructional tools midstream to account for these unanticipated factors can ensure they retain their methodological alignment with the underlying theory and predicted learning outcomes so that inferences drawn from the design experiment accurately reflect what was being tested ( Edelson, 2002 ; Hoadley, 2004 ). Indeed, Barab (2014) states, “the messiness of real-world practice must be recognized, understood, and integrated as part of the theoretical claims if the claims are to have real-world explanatory value” (p. 153).

CONCLUSIONS

In this essay, we have highlighted some of the ways design-based research can advance—and expand upon—research done in biology education. These ways include:

  • providing a methodology that integrates theories of learning with practical experiences in classrooms,
  • using a range of analytical approaches that allow researchers to uncover the underlying mechanisms of student thinking and learning,
  • fostering interdisciplinary collaborations among researchers and instructors, and
  • characterizing learning ecologies that account for the complexity involved in student learning.

By employing this methodology from the learning sciences, biology education researchers can enrich our current understanding of what is required to help biology students achieve their personal and professional aims during their college experience. It can also stimulate new ideas for biology education that can be discussed and debated in our research community as we continue to explore and refine how best to serve the students who pass through our classroom doors.

Acknowledgments

We thank the UW Biology Education Research Group’s (BERG) feedback on drafts of this essay as well as Dr. L. Jescovich for last-minute analyses. This work was supported by a National Science Foundation award (NSF DUE 1661263/1660643). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the NSF. All procedures were conducted in accordance with approval from the Institutional Review Board at the University of Washington (52146) and the New England Independent Review Board (120160152).

1 “Epistemic commitment” is defined as engaging in certain practices that generate knowledge in an agreed-upon way.

  • Alonzo, A. C. (2011). Learning progressions that support formative assessment practices. Measurement, 9(2/3), 124–129.
  • Anderson, T., Shattuck, J. (2012). Design-based research: A decade of progress in education research? Educational Researcher, 41(1), 16–25.
  • Andrews, T. M., Leonard, M. J., Colgrove, C. A., Kalinowski, S. T. (2011). Active learning not associated with student learning in a random sample of college biology courses. CBE—Life Sciences Education, 10(4), 394–405.
  • Badenhorst, E., Hartman, N., Mamede, S. (2016). How biomedical misconceptions may arise and affect medical students’ learning: A review of theoretical perspectives and empirical evidence. Health Professions Education, 2(1), 10–17.
  • Barab, S. (2014). Design-based research: A methodological toolkit for engineering change. In The Cambridge handbook of the learning sciences (2nd ed., pp. 151–170). New York, NY: Cambridge University Press. 10.1017/CBO9781139519526.011
  • Barab, S., Squire, K. (2004). Design-based research: Putting a stake in the ground. Journal of the Learning Sciences, 13(1), 1–14.
  • Bell, P. (2004). On the theoretical breadth of design-based research in education. Educational Psychologist, 39(4), 243–253.
  • Borrego, M., Cutler, S., Prince, M., Henderson, C., Froyd, J. E. (2013). Fidelity of implementation of research-based instructional strategies (RBIS) in engineering science courses. Journal of Engineering Education, 102(3), 394–425.
  • Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences, 2(2), 141–178.
  • Chi, M. T. H., Roscoe, R. D., Slotta, J. D., Roy, M., Chase, C. C. (2012). Misconceived causal explanations for emergent processes. Cognitive Science, 36(1), 1–61.
  • Cobb, P., Confrey, J., diSessa, A., Lehrer, R., Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9–13.
  • Coley, J. D., Tanner, K. D. (2012). Common origins of diverse misconceptions: Cognitive principles and the development of biology thinking. CBE—Life Sciences Education, 11(3), 209–215.
  • Coley, J. D., Tanner, K. (2015). Relations between intuitive biological thinking and biological misconceptions in biology majors and nonmajors. CBE—Life Sciences Education, 14(1). 10.1187/cbe.14-06-0094
  • Collins, A., Joseph, D., Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. Journal of the Learning Sciences, 13(1), 15–42.
  • Corcoran, T., Mosher, F. A., Rogat, A. D. (2009). Learning progressions in science: An evidence-based approach to reform (CPRE Research Report No. RR-63). Philadelphia, PA: Consortium for Policy Research in Education.
  • Davidesco, I., Milne, C. (2019). Implementing cognitive science and discipline-based education research in the undergraduate science classroom. CBE—Life Sciences Education, 18(3), es4.
  • Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5–8.
  • Dolan, E. L. (2015). Biology education research 2.0. CBE—Life Sciences Education, 14(4), ed1.
  • Duschl, R., Maeng, S., Sezen, A. (2011). Learning progressions and teaching sequences: A review and analysis. Studies in Science Education, 47(2), 123–182.
  • Eddy, S. L., Converse, M., Wenderoth, M. P. (2015). PORTAAL: A classroom observation tool assessing evidence-based teaching practices for active learning in large science, technology, engineering, and mathematics classes. CBE—Life Sciences Education, 14(2), ar23.
  • Edelson, D. C. (2002). Design research: What we learn when we engage in design. Journal of the Learning Sciences, 11(1), 105–121.
  • Flynn, E., Pine, K., Lewis, C. (2006). The microgenetic method—Time for change? The Psychologist, 19(3), 152–155.
  • Gouvea, J. S., Simon, M. R. (2018). Challenging cognitive construals: A dynamic alternative to stable misconceptions. CBE—Life Sciences Education, 17(2), ar34.
  • Gunckel, K. L., Mohan, L., Covitt, B. A., Anderson, C. W. (2012). Addressing challenges in developing learning progressions for environmental science literacy. In Alonzo, A. C., Gotwals, A. W. (Eds.), Learning progressions in science: Current challenges and future directions (pp. 39–75). Rotterdam: SensePublishers. 10.1007/978-94-6091-824-7_4
  • Hake, R. R. (2007). Design-based research in physics education research: A review. In Kelly, A. E., Lesh, R. A., Baek, J. Y. (Eds.), Handbook of design research methods in mathematics, science, and technology education (p. 24). New York: Routledge.
  • Hoadley, C. M. (2004). Methodological alignment in design-based research. Educational Psychologist, 39(4), 203–212.
  • Jackson, M., Tran, A., Wenderoth, M. P., Doherty, J. H. (2018). Peer vs. self-grading of practice exams: Which is better? CBE—Life Sciences Education, 17(3), es44. 10.1187/cbe.18-04-0052
  • Jin, H., Anderson, C. W. (2012). A learning progression for energy in socio-ecological systems. Journal of Research in Science Teaching, 49(9), 1149–1180.
  • Joseph, D. (2004). The practice of design-based research: Uncovering the interplay between design, research, and the real-world context. Educational Psychologist, 39(4), 235–242.
  • Kelly, A. E. (2014). Design-based research in engineering education. In Cambridge handbook of engineering education research (pp. 497–518). New York, NY: Cambridge University Press. 10.1017/CBO9781139013451.032
  • Lee, C. J., Toven-Lindsey, B., Shapiro, C., Soh, M., Mazrouee, S., Levis-Fitzgerald, M., Sanders, E. R. (2018). Error-discovery learning boosts student engagement and performance, while reducing student attrition in a bioinformatics course. CBE—Life Sciences Education, 17(3), ar40. 10.1187/cbe.17-04-0061
  • Lo, S. M., Gardner, G. E., Reid, J., Napoleon-Fanis, V., Carroll, P., Smith, E., Sato, B. K. (2019). Prevailing questions and methodologies in biology education research: A longitudinal analysis of research in CBE—Life Sciences Education and at the Society for the Advancement of Biology Education Research. CBE—Life Sciences Education, 18(1), ar9.
  • Marbach-Ad, G., Briken, V., El-Sayed, N. M., Frauwirth, K., Fredericksen, B., Hutcheson, S., Smith, A. C. (2009). Assessing student understanding of host pathogen interactions using a concept inventory. Journal of Microbiology & Biology Education, 10(1), 43–50.
  • McFarland, J. L., Price, R. M., Wenderoth, M. P., Martinková, P., Cliff, W., Michael, J., … & Wright, A. (2017). Development and validation of the homeostasis concept inventory. CBE—Life Sciences Education, 16(2), ar35.
  • McKenney, S., Reeves, T. C. (2013). Systematic review of design-based research progress: Is a little knowledge a dangerous thing? Educational Researcher, 42(2), 97–100.
  • Mestre, J. P., Cheville, A., Herman, G. L. (2018). Promoting DBER-cognitive psychology collaborations in STEM education. Journal of Engineering Education, 107(1), 5–10.
  • Michael, J. A. (2007). What makes physiology hard for students to learn? Results of a faculty survey. Advances in Physiology Education, 31(1), 34–40.
  • Michael, J. A., Modell, H., McFarland, J., Cliff, W. (2009). The “core principles” of physiology: What should students understand? Advances in Physiology Education, 33(1), 10–16.
  • Middleton, J., Gorard, S., Taylor, C., Bannan-Ritland, B. (2008). The “compleat” design experiment: From soup to nuts. In Kelly, A. E., Lesh, R. A., Baek, J. Y. (Eds.), Handbook of design research methods in education: Innovations in science, technology, engineering, and mathematics learning and teaching (pp. 21–46). New York, NY: Routledge.
  • Modell, H. I. (2000). How to help students understand physiology? Emphasize general models. Advances in Physiology Education, 23(1), S101–S107.
  • Mohan, L., Chen, J., Anderson, C. W. (2009). Developing a multi-year learning progression for carbon cycling in socio-ecological systems. Journal of Research in Science Teaching, 46(6), 675–698.
  • National Academies of Sciences, Engineering, and Medicine. (2018). How people learn II: Learners, contexts, and cultures. Washington, DC: National Academies Press. 10.17226/24783
  • National Research Council (NRC). (2002). Scientific research in education. Washington, DC: National Academies Press. 10.17226/10236
  • NRC. (2007). Taking science to school: Learning and teaching science in grades K–8. Washington, DC: National Academies Press. www.nap.edu/catalog/11625/taking-science-to-school-learning-and-teaching-science-in-grades. 10.17226/11625
  • NRC. (2012). Discipline-based education research: Understanding and improving learning in undergraduate science and engineering. Washington, DC: National Academies Press. www.nap.edu/catalog/13362/discipline-based-education-research-understanding-and-improving-learning-in-undergraduate. 10.17226/13362
  • NRC. (2018). How people learn II: Learners, contexts, and cultures. Washington, DC: National Academies Press. www.nap.edu/read/24783/chapter/7. 10.17226/24783
  • O’Donnell, A. M. (2004). A commentary on design research. Educational Psychologist, 39(4), 255–260.
  • Offerdahl, E. G., McConnell, M., Boyer, J. (2018). Can I have your recipe? Using a fidelity of implementation (FOI) framework to identify the key ingredients of formative assessment for learning. CBE—Life Sciences Education, 17(4), es16.
  • Peffer, M., Renken, M. (2016). Practical strategies for collaboration across discipline-based education research and the learning sciences. CBE—Life Sciences Education, 15(4), es11.
  • Reiser, B. J., Smith, B. K., Tabak, I., Steinmuller, F., Sandoval, W. A., Leone, A. J. (2001). BGuILE: Strategic and conceptual scaffolds for scientific inquiry in biology classrooms. In Carver, S. M., Klahr, D. (Eds.), Cognition and instruction: Twenty-five years of progress (pp. 263–305). Mahwah, NJ: Lawrence Erlbaum Associates.
  • Sandoval, W. (2014). Conjecture mapping: An approach to systematic educational design research. Journal of the Learning Sciences, 23(1), 18–36.
  • Sandoval, W. A., Bell, P. (2004). Design-based research methods for studying learning in context: Introduction. Educational Psychologist, 39(4), 199–201.
  • Schinske, J. N., Balke, V. L., Bangera, M. G., Bonney, K. M., Brownell, S. E., Carter, R. S., … & Corwin, L. A. (2017). Broadening participation in biology education research: Engaging community college students and faculty. CBE—Life Sciences Education, 16(2), mr1.
  • Scott, E., Anderson, C. W., Mashood, K. K., Matz, R. L., Underwood, S. M., Sawtelle, V. (2018). Developing an analytical framework to characterize student reasoning about complex processes. CBE—Life Sciences Education, 17(3), ar49. 10.1187/cbe.17-10-0225
  • Scott, E., Wenderoth, M. P., Doherty, J. H. (2019). Learning progressions: An empirically grounded, learner-centered framework to guide biology instruction. CBE—Life Sciences Education, 18(4), es5. 10.1187/cbe.19-03-0059
  • Sharma, M. D., McShane, K. (2008). A methodological framework for understanding and describing discipline-based scholarship of teaching in higher education through design-based research . Higher Education Research & Development , 27 ( 3 ), 257–270. [ Google Scholar ]
  • Shavelson, R. J., Phillips, D. C., Towne, L., Feuer, M. J. (2003). On the science of education design studies . Educational Researcher , 32 ( 1 ), 25–28. [ Google Scholar ]
  • Siegler, R. S. (2006). Microgenetic analyses of learning . In Damon, W., Lerner, R. M. (Eds.), Handbook of child psychology (pp. 464–510). Hoboken, NJ: John Wiley & Sons, Inc. 10.1002/9780470147658.chpsy0211 [ CrossRef ] [ Google Scholar ]
  • Stringer, E. T. (2013). Action research . Thousand Oaks, CA: Sage Publications, Inc. [ Google Scholar ]
  • Subramaniam, M., Jean, B. S., Taylor, N. G., Kodama, C., Follman, R., Casciotti, D. (2015). Bit by bit: Using design-based research to improve the health literacy of adolescents . JMIR Research Protocols , 4 ( 2 ), e62. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Summers, M. M., Couch, B. A., Knight, J. K., Brownell, S. E., Crowe, A. J., Semsar, K., … & Batzli, J. (2018). EcoEvo-MAPS: An ecology and evolution assessment for introductory through advanced undergraduates . CBE—Life Sciences Education , 17 ( 2 ), ar18. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Szteinberg, G., Balicki, S., Banks, G., Clinchot, M., Cullipher, S., Huie, R., … & Sevian, H. (2014). Collaborative professional development in chemistry education research: Bridging the gap between research and practice . Journal of Chemical Education , 91 ( 9 ), 1401–1408. [ Google Scholar ]
  • Talanquer, V. (2014). DBER and STEM education reform: Are we up to the challenge? Journal of Research in Science Teaching , 51 ( 6 ), 809–819. [ Google Scholar ]
  • Wang, F., Hannafin, M. J. (2005). Design-based research and technology-enhanced learning environments . Educational Technology Research and Development , 53 ( 4 ), 5–23. [ Google Scholar ]
  • Wang, J.-R. (2004). Development and validation of a Two-tier instrument to examine understanding of internal transport in plants and the human circulatory system . International Journal of Science and Mathematics Education , 2 ( 2 ), 131–157. [ Google Scholar ]
  • Warfa, A.-R. M. (2016). Mixed-methods design in biology education research: Approach and uses . CBE—Life Sciences Education , 15 ( 4 ), rm5. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Windschitl, M., Thompson, J., Braaten, M., Stroupe, D. (2012). Proposing a core set of instructional practices and tools for teachers of science . Science Education , 96 ( 5 ), 878–903. [ Google Scholar ]
  • Zagallo, P., Meddleton, S., Bolger, M. S. (2016). Teaching real data interpretation with models (TRIM): Analysis of student dialogue in a large-enrollment cell and developmental biology course . CBE—Life Sciences Education , 15 ( 2 ), ar17. [ PMC free article ] [ PubMed ] [ Google Scholar ]


What Is a Research Design | Types, Guide & Examples

Published on June 7, 2021 by Shona McCombes . Revised on November 20, 2023 by Pritha Bhandari.

A research design is a strategy for answering your research question using empirical data. Creating a research design means making decisions about:

  • Your overall research objectives and approach
  • Whether you’ll rely on primary research or secondary research
  • Your sampling methods or criteria for selecting subjects
  • Your data collection methods
  • The procedures you’ll follow to collect data
  • Your data analysis methods

A well-planned research design helps ensure that your methods match your research objectives and that you use the right kind of analysis for your data.

Table of contents

  • Step 1: Consider your aims and approach
  • Step 2: Choose a type of research design
  • Step 3: Identify your population and sampling method
  • Step 4: Choose your data collection methods
  • Step 5: Plan your data collection procedures
  • Step 6: Decide on your data analysis strategies
  • Other interesting articles
  • Frequently asked questions about research design


Before you can start designing your research, you should already have a clear idea of the research question you want to investigate.

There are many different ways you could go about answering this question. Your research design choices should be driven by your aims and priorities—start by thinking carefully about what you want to achieve.

The first choice you need to make is whether you’ll take a qualitative or quantitative approach.

Qualitative research designs tend to be more flexible and inductive , allowing you to adjust your approach based on what you find throughout the research process.

Quantitative research designs tend to be more fixed and deductive , with variables and hypotheses clearly defined in advance of data collection.

It’s also possible to use a mixed-methods design that integrates aspects of both approaches. By combining qualitative and quantitative insights, you can gain a more complete picture of the problem you’re studying and strengthen the credibility of your conclusions.

Practical and ethical considerations when designing research

As well as scientific considerations, you need to think practically when designing your research. If your research involves people or animals, you also need to consider research ethics .

  • How much time do you have to collect data and write up the research?
  • Will you be able to gain access to the data you need (e.g., by travelling to a specific location or contacting specific people)?
  • Do you have the necessary research skills (e.g., statistical analysis or interview techniques)?
  • Will you need ethical approval ?

At each stage of the research design process, make sure that your choices are practically feasible.


Within both qualitative and quantitative approaches, there are several types of research design to choose from. Each type provides a framework for the overall shape of your research.

Types of quantitative research designs

Quantitative designs can be split into four main types.

  • Experimental and quasi-experimental designs allow you to test cause-and-effect relationships.
  • Descriptive and correlational designs allow you to measure variables and describe relationships between them.

With descriptive and correlational designs, you can get a clear picture of characteristics, trends and relationships as they exist in the real world. However, you can’t draw conclusions about cause and effect (because correlation doesn’t imply causation ).

Experiments are the strongest way to test cause-and-effect relationships without the risk of other variables influencing the results. However, their controlled conditions may not always reflect how things work in the real world. They’re often also more difficult and expensive to implement.

Types of qualitative research designs

Qualitative designs are less strictly defined. This approach is about gaining a rich, detailed understanding of a specific context or phenomenon, and you can often be more creative and flexible in designing your research.

Common types of qualitative design include case studies, ethnography, and grounded theory. These designs often take similar approaches to data collection, but focus on different aspects when analyzing the data.

Your research design should clearly define who or what your research will focus on, and how you’ll go about choosing your participants or subjects.

In research, a population is the entire group that you want to draw conclusions about, while a sample is the smaller group of individuals you’ll actually collect data from.

Defining the population

A population can be made up of anything you want to study—plants, animals, organizations, texts, countries, etc. In the social sciences, it most often refers to a group of people.

For example, will you focus on people from a specific demographic, region or background? Are you interested in people with a certain job or medical condition, or users of a particular product?

The more precisely you define your population, the easier it will be to gather a representative sample.

Sampling methods

Even with a narrowly defined population, it’s rarely possible to collect data from every individual. Instead, you’ll collect data from a sample.

To select a sample, there are two main approaches: probability sampling and non-probability sampling . The sampling method you use affects how confidently you can generalize your results to the population as a whole.

Probability sampling is the most statistically valid option, but it’s often difficult to achieve unless you’re dealing with a very small and accessible population.

For practical reasons, many studies use non-probability sampling, but it’s important to be aware of the limitations and carefully consider potential biases. You should always make an effort to gather a sample that’s as representative as possible of the population.
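To make the distinction concrete, here is a minimal Python sketch contrasting a simple random (probability) sample with a convenience (non-probability) sample. The sampling frame, sample size, and seed are purely illustrative.

```python
import random

# Hypothetical sampling frame: ID codes for everyone in the population.
population = [f"student_{i:04d}" for i in range(1, 2001)]

# Probability sampling: every member of the frame has a known, equal
# chance of selection, so results generalize more confidently.
random.seed(42)  # fixed seed only so the example is reproducible
probability_sample = random.sample(population, k=100)

# Non-probability (convenience) sampling: e.g., taking whoever is easiest
# to reach, such as the first 100 people on the list. Cheaper, but the
# sample may not represent the population.
convenience_sample = population[:100]

print(len(probability_sample), len(convenience_sample))
```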

Case selection in qualitative research

In some types of qualitative designs, sampling may not be relevant.

For example, in an ethnography or a case study , your aim is to deeply understand a specific context, not to generalize to a population. Instead of sampling, you may simply aim to collect as much data as possible about the context you are studying.

In these types of design, you still have to carefully consider your choice of case or community. You should have a clear rationale for why this particular case is suitable for answering your research question .

For example, you might choose a case study that reveals an unusual or neglected aspect of your research problem, or you might choose several very similar or very different cases in order to compare them.

Data collection methods are ways of directly measuring variables and gathering information. They allow you to gain first-hand knowledge and original insights into your research problem.

You can choose just one data collection method, or use several methods in the same study.

Survey methods

Surveys allow you to collect data about opinions, behaviors, experiences, and characteristics by asking people directly. There are two main survey methods to choose from: questionnaires and interviews .

Observation methods

Observational studies allow you to collect data unobtrusively, observing characteristics, behaviors or social interactions without relying on self-reporting.

Observations may be conducted in real time, taking notes as you observe, or you might make audiovisual recordings for later analysis. They can be qualitative or quantitative.

Other methods of data collection

There are many other ways you might collect data depending on your field and topic.

If you’re not sure which methods will work best for your research design, try reading some papers in your field to see what kinds of data collection methods they used.

Secondary data

If you don’t have the time or resources to collect data from the population you’re interested in, you can also choose to use secondary data that other researchers already collected—for example, datasets from government surveys or previous studies on your topic.

With this raw data, you can do your own analysis to answer new research questions that weren’t addressed by the original study.

Using secondary data can expand the scope of your research, as you may be able to access much larger and more varied samples than you could collect yourself.

However, it also means you don’t have any control over which variables to measure or how to measure them, so the conclusions you can draw may be limited.

As well as deciding on your methods, you need to plan exactly how you’ll use these methods to collect data that’s consistent, accurate, and unbiased.

Planning systematic procedures is especially important in quantitative research, where you need to precisely define your variables and ensure your measurements are high in reliability and validity.

Operationalization

Some variables, like height or age, are easily measured. But often you’ll be dealing with more abstract concepts, like satisfaction, anxiety, or competence. Operationalization means turning these fuzzy ideas into measurable indicators.

If you’re using observations , which events or actions will you count?

If you’re using surveys , which questions will you ask and what range of responses will be offered?

You may also choose to use or adapt existing materials designed to measure the concept you’re interested in—for example, questionnaires or inventories whose reliability and validity has already been established.

Reliability and validity

Reliability means your results can be consistently reproduced, while validity means that you’re actually measuring the concept you’re interested in.

For valid and reliable results, your measurement materials should be thoroughly researched and carefully designed. Plan your procedures to make sure you carry out the same steps in the same way for each participant.

If you’re developing a new questionnaire or other instrument to measure a specific concept, running a pilot study allows you to check its validity and reliability in advance.

Sampling procedures

As well as choosing an appropriate sampling method , you need a concrete plan for how you’ll actually contact and recruit your selected sample.

That means making decisions about things like:

  • How many participants do you need for an adequate sample size?
  • What inclusion and exclusion criteria will you use to identify eligible participants?
  • How will you contact your sample—by mail, online, by phone, or in person?

If you’re using a probability sampling method , it’s important that everyone who is randomly selected actually participates in the study. How will you ensure a high response rate?

If you’re using a non-probability method , how will you avoid research bias and ensure a representative sample?

Data management

It’s also important to create a data management plan for organizing and storing your data.

Will you need to transcribe interviews or perform data entry for observations? You should anonymize and safeguard any sensitive data, and make sure it’s backed up regularly.

Keeping your data well-organized will save time when it comes to analyzing it. It can also help other researchers validate and add to your findings (high replicability ).

On its own, raw data can’t answer your research question. The last step of designing your research is planning how you’ll analyze the data.

Quantitative data analysis

In quantitative research, you’ll most likely use some form of statistical analysis . With statistics, you can summarize your sample data, make estimates, and test hypotheses.

Using descriptive statistics , you can summarize your sample data in terms of:

  • The distribution of the data (e.g., the frequency of each score on a test)
  • The central tendency of the data (e.g., the mean to describe the average score)
  • The variability of the data (e.g., the standard deviation to describe how spread out the scores are)

The specific calculations you can do depend on the level of measurement of your variables.
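As a concrete illustration, the descriptive summaries above can be computed with a few lines of Python's standard library; the scores below are invented for the example.

```python
import statistics
from collections import Counter

# Hypothetical sample data: test scores for 12 participants.
scores = [55, 62, 62, 70, 71, 74, 74, 74, 80, 85, 90, 95]

frequency = Counter(scores)            # distribution: frequency of each score
mean_score = statistics.mean(scores)   # central tendency: the mean score
sd_score = statistics.stdev(scores)    # variability: sample standard deviation

print(frequency)
print(f"mean = {mean_score:.1f}, sd = {sd_score:.1f}")
```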

Using inferential statistics , you can:

  • Make estimates about the population based on your sample data.
  • Test hypotheses about a relationship between variables.

Regression and correlation tests look for associations between two or more variables, while comparison tests (such as t tests and ANOVAs ) look for differences in the outcomes of different groups.

Your choice of statistical test depends on various aspects of your research design, including the types of variables you’re dealing with and the distribution of your data.
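To make the two families of tests concrete, the sketch below runs one comparison test (an independent-samples t test) and one association test (a Pearson correlation) with SciPy on invented data; which tests are appropriate in a real study depends on the design considerations just described.

```python
from scipy import stats

# Hypothetical data: test scores for two independent groups.
group_a = [68, 72, 75, 80, 66, 71, 74, 79]
group_b = [60, 65, 63, 70, 58, 62, 67, 64]

# Comparison test: independent-samples t test for a difference in means.
t_stat, t_p = stats.ttest_ind(group_a, group_b)

# Association test: Pearson correlation between two paired variables,
# e.g., hours studied and score within group A.
hours = [5, 6, 7, 9, 4, 6, 7, 8]
r, r_p = stats.pearsonr(hours, group_a)

print(f"t = {t_stat:.2f}, p = {t_p:.3f}")
print(f"r = {r:.2f}, p = {r_p:.3f}")
```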

Qualitative data analysis

In qualitative research, your data will usually be very dense with information and ideas. Instead of summing it up in numbers, you’ll need to comb through the data in detail, interpret its meanings, identify patterns, and extract the parts that are most relevant to your research question.

Two of the most common approaches to doing this are thematic analysis and discourse analysis .

There are many other ways of analyzing qualitative data depending on the aims of your research. To get a sense of potential approaches, try reading some qualitative research papers in your field.

If you want to know more about the research process , methodology , research bias , or statistics , make sure to check out some of our other articles with explanations and examples.


A research design is a strategy for answering your   research question . It defines your overall approach and determines how you will collect and analyze data.

A well-planned research design helps ensure that your methods match your research aims, that you collect high-quality data, and that you use the right kind of analysis to answer your questions using credible sources. This allows you to draw valid, trustworthy conclusions.

Quantitative research designs can be divided into two main categories:

  • Correlational and descriptive designs are used to investigate characteristics, averages, trends, and associations between variables.
  • Experimental and quasi-experimental designs are used to test causal relationships .

Qualitative research designs tend to be more flexible. Common types of qualitative design include case study , ethnography , and grounded theory designs.

The priorities of a research design can vary depending on the field, but you usually have to specify:

  • Your research questions and/or hypotheses
  • Your overall approach (e.g., qualitative or quantitative )
  • The type of design you’re using (e.g., a survey , experiment , or case study )
  • Your data collection methods (e.g., questionnaires , observations)
  • Your data collection procedures (e.g., operationalization , timing and data management)
  • Your data analysis methods (e.g., statistical tests  or thematic analysis )

A sample is a subset of individuals from a larger population . Sampling means selecting the group that you will actually collect data from in your research. For example, if you are researching the opinions of students in your university, you could survey a sample of 100 students.

In statistics, sampling allows you to test a hypothesis about the characteristics of a population.

Operationalization means turning abstract conceptual ideas into measurable observations.

For example, the concept of social anxiety isn’t directly observable, but it can be operationally defined in terms of self-rating scores, behavioral avoidance of crowded places, or physical anxiety symptoms in social situations.

Before collecting data , it’s important to consider how you will operationalize the variables that you want to measure.

A research project is an academic, scientific, or professional undertaking to answer a research question . Research projects can take many forms, such as qualitative or quantitative , descriptive , longitudinal , experimental , or correlational . What kind of research approach you choose will depend on your topic.

Cite this Scribbr article


McCombes, S. (2023, November 20). What Is a Research Design | Types, Guide & Examples. Scribbr. Retrieved April 4, 2024, from https://www.scribbr.com/methodology/research-design/



Open Access

Study Protocol

Using a design-based research approach to develop a technology-supported physical education course to increase the physical activity levels of university students: Study protocol paper

Roles Conceptualization, Funding acquisition, Project administration, Writing – original draft

* E-mail: [email protected]

Affiliations Sydney School of Education and Social Works, Faculty of Arts and Social Science, The University of Sydney, Sydney, New South Wales, Australia, Faculty of Sport and Health Education, Universitas Pendidikan Indonesia, Bandung, Jawa Barat, Indonesia


Roles Conceptualization, Supervision, Writing – review & editing

Affiliation Sydney School of Education and Social Works, Faculty of Arts and Social Science, The University of Sydney, Sydney, New South Wales, Australia

  • Kuston Sultoni, 
  • Louisa R. Peralta, 
  • Wayne Cotton


  • Published: December 1, 2022
  • https://doi.org/10.1371/journal.pone.0269759


Promoting physical activity (PA) for university students is essential as PA levels decrease during the transition from secondary to higher education. Providing technology-supported university courses targeting students’ PA levels may be a viable option to combat the problem. However, it is still unclear how and what technologies should be implemented in university courses to promote PA. This study aims to create a series of design principles for technology-supported physical education courses that aim to increase university students’ PA knowledge, motivation and levels.

The proposed methodology underpinning the research program is a seven-phase design-based research (DBR) approach, with the seven phases encompassed in four sequential studies. These four studies are a systematic review, a qualitative focus group study, a pilot study, and a randomised controlled trial (RCT) study. The protocol paper aims to detail the plan for conducting the four studies in a comprehensive and transparent manner, thus contributing to the methodological evidence base in this field.

Design principles generated from this project will contribute to the growing evidence focusing on effective design and implementation features. Future practitioners can also use these to develop physical education courses that aim to promote university students’ physical activity levels, knowledge, and motivation.

Trial registration

The RCT registry number: ACTRN12622000712707 , 18/05/2022.

Citation: Sultoni K, Peralta LR, Cotton W (2022) Using a design-based research approach to develop a technology-supported physical education course to increase the physical activity levels of university students: Study protocol paper. PLoS ONE 17(12): e0269759. https://doi.org/10.1371/journal.pone.0269759

Editor: Walid Kamal Abdelbasset, Prince Sattam Bin Abdulaziz University, College of Applied Medical Sciences, SAUDI ARABIA

Received: June 1, 2022; Accepted: October 11, 2022; Published: December 1, 2022

Copyright: © 2022 Sultoni et al. This is an open access article distributed under the terms of the Creative Commons Attribution License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: No datasets were generated or analysed during the current study. All relevant data from this study will be made available upon study completion.

Funding: The primary author (KS) is supported by the Indonesia Endowment Fund for Education Scholarship/ Lembaga Pengelola Dana Pendidikan Republik Indonesia (LPDP RI) under a doctoral degree scholarship (202001222015860). LPDP RI have no authority in study design; collection, management, analysis, and interpretation of data; writing of the report; and the decision to submit the report for publication.

Competing interests: The authors have declared that no competing interests exist.

List of abbreviations: PA, Physical Activity; DBR, Design-Based Research; PETE, Physical Education Teacher Education; ECTS, European Credit Transfer and Accumulation System; PRISMA, Preferred Reporting Items for Systematic Reviews and Meta-Analyses; PICO, Population, Intervention, Comparator, and Outcome; RCT, Randomized Controlled Trial; LMS, Learning Management System; UX, User Experience; UI, User Interface; IPAQ, International Physical Activity Questionnaire; BREQ-2, Behavioural Regulation in Exercise Questionnaire-2; MET, Metabolic Equivalent of Task

Interventions focusing on increasing physical activity levels among various age groups, from early childhood to the elderly, have been growing over the last five years [ 1 – 4 ]. These studies have suggested that targeting physical activity at different time points across the lifespan is essential, especially around transitions between stages of education [ 5 ] (e.g., the transition from preschool to primary school [ 6 – 8 ], from primary to high school [ 9 – 12 ], and from high school to college or university [ 13 , 14 ]). The transition into post-secondary studies is the last such opportunity to intervene before adulthood, which makes this period especially important for establishing lifelong personal, psychosocial, and movement behaviours in those who have not yet established them [ 15 – 17 ]. Physical activity tends to decrease significantly when students graduate from high school and enrol in university [ 18 ] and during the first year of university [ 15 , 19 ], with 80% of university students not meeting physical activity recommendations during this transition [ 14 ]. These studies show that promoting physical activity among university students is essential.

Providing university courses that improve students' physical activity levels may be a viable option, as universities can provide students with access to a range of sports facilities, highly educated facilitators, and appropriate technologies [ 20 ]. Furthermore, establishing and maintaining quality physical activity university courses has been an ongoing concern. Guidelines developed in the US and China [ 21 – 23 ] suggest that administration/support, assessment, instructional strategies, professionalism, learning environment, program staffing, and curriculum are essential facets that promote quality. Previous studies also suggest that providing strategies for administrators and directors [ 24 , 25 ], modelling the development of and support for course instructors [ 26 ], and utilising technology [ 27 – 29 ] can be viable strategies for increasing the quality of university courses that aim to improve the physical activity levels of university students.

There are various factors that influence physical activity behaviour in young adults [ 30 ]. Knowledge about physical activity is considered one of the principal determinants of physical activity in university-age students [ 31 ]. A cross-sectional study involving 258 adults in Hong Kong found that physical activity knowledge had a positive correlation with physical activity levels, with this correlation strongest among the university student participants [ 32 ]. This finding is also supported by a Chinese cross-sectional study recruiting 9826 university students [ 33 ], which found that knowledge of the physical activity guidelines was correlated with higher physical activity levels. Thus, physical activity knowledge should be considered as one of the learning outcomes of university courses that promote physical activity in a university setting.

Another important factor that is widely known to be associated with physical activity is motivation [ 34 , 35 ]. Exercise motivation plays an important role in long-term physical activity behaviour [ 36 ]. Systematic reviews examining the relationships between motivation and physical activity [ 37 ] and the effects of physical activity interventions underpinned by motivational principles [ 38 ] show that motivation significantly increases physical activity levels [ 37 , 38 ]. Furthermore, a cross-sectional study involving 1079 participants aged 24±9 years showed that motivation for physical activity and exercise is associated with the frequency, intensity, and duration of exercise [ 39 ]. This finding is supported by an observational study using a web-based survey involving 320 wearable activity monitor users, in which motivational regulation was correlated with moderate to vigorous physical activity [ 40 ]. Hence, having motivational content and outcomes as part of a university course that promotes physical activity in university settings should also be considered.

Research focusing on technologies that promote university students' physical activity levels is gaining more attention. The most common strategy utilised in previous studies has been internet websites [ 41 – 49 ]. Seven of nine (78%) studies utilising internet websites were successful in increasing physical activity levels [ 43 – 49 ]. Another common form of technology utilised to enhance university students' physical activity levels is wearable devices, ranging from pedometers [ 50 – 53 ] to activity trackers (Misfit, Jawbone UP, Polar M400, Fitbit, and MyWellness Key) [ 54 – 58 ]. Two of four (50%) studies that utilised a pedometer successfully increased participants' mean steps per day [ 53 , 59 ]. Two of five (40%) studies using activity trackers also increased students' physical activity levels [ 55 , 58 ]. Social media, smartphone applications and mobile apps have also become prominent technologies for enhancing the physical activity levels of university students [ 60 – 66 ]. Three of seven (43%) studies using mobile phone apps successfully increased students' physical activity levels [ 62 , 63 , 65 ]. However, it is important to highlight that most of the studies (19 of 25; 76%) utilising technology (internet websites, wearable devices, mobile phone apps) to increase students' physical activity levels in university settings were non-course-based interventions. That means only six of 25 (24%) studies were course-based interventions, with four of these six studies significantly increasing students' physical activity levels. Nevertheless, it is still unclear which design principles and features of technology are the most effective in supporting the implementation of these course-based interventions to increase students' physical activity knowledge, motivation and levels. There is also far too little attention paid to creating a set of design principles that can direct future improvement endeavours in this space [ 67 ]. Hence, research that focuses on development goals is needed to incorporate technology into university courses that aim to increase students' physical activity knowledge, motivation and levels.

The aim of this design-based research project is to create a series of design principles, develop an intervention based on the design principles and measure the impact of the intervention [ 67 – 69 ]. The impact of the intervention will be measured through an increase in physical activity levels, knowledge, and motivation. The purpose of this protocol paper is to fully detail the plan for conducting a design-based research approach to develop a technology-supported physical activity course for increasing university students’ physical activity knowledge, motivation and levels in a comprehensive, transparent manner, contributing to the methodological evidence base in this field.

Aims and objectives

The primary purpose of this research project is to create a series of design principles that can guide the development of a technology-supported physical activity course to increase university students' physical activity knowledge, motivation and levels.

Study design

The research methodology for the study was guided by the Design-Based Research model [ 67 ], which facilitates collaboration with practitioners by integrating known and hypothetical design principles with technological affordances to render plausible solutions to practical problems. Design-based research has been heralded as a practical research methodology that can effectively bridge the chasm between research and practice in formal education [ 70 ], including physical education [ 71 ]. Design-based research can also be used as an alternative model of enquiry in the field of educational technology to make future progress in improving teaching and learning through technology [ 72 ]. The original Reeves design-based research model contains four steps (see Fig 1 ). However, in this study, the four steps will be represented in seven phases to accommodate the cyclic nature of design-based research (see Fig 2 ). This adaptation has been made because design-based research is flexible in its process while still adhering to some core principles [ 73 ]. McKenney and Reeves [ 73 ] have highlighted that design-based research: 1) uses scientific knowledge to ground design work; 2) produces scientific knowledge; 3) proceeds through three main phases (analysis, design, and evaluation); and 4) develops both interventions in practice and reusable knowledge. Therefore, the seven phases in this study are: 1) needs analysis and creation of initial design principles from a systematic literature review and focus group discussions; 2) development of a prototype technology-supported physical education course based on the initial design principles; 3) evaluation and testing of the prototype technology-supported physical education course (pilot study); 4) revision of the initial design principles; 5) modification of the prototype technology-supported physical education course based on the revised design principles; 6) evaluation and testing of the modified prototype technology-supported physical education course (randomised controlled trial); and 7) the final design principles. These seven phases will be housed in four sequential studies: 1) a systematic review; 2) a qualitative focus group study; 3) a pilot study; and 4) a randomised controlled trial study. This project has obtained ethics approval from the University of Sydney Human Research Ethics Committee (Project No. 2021/071; Project No. 2021/935). Written informed consent was obtained from all participants before the study commenced.

[Fig 1. https://doi.org/10.1371/journal.pone.0269759.g001]

[Fig 2. The four studies are identified in italics. https://doi.org/10.1371/journal.pone.0269759.g002]

Location and setting

This study will be conducted at the Universitas Pendidikan Indonesia (Indonesia University of Education) located in Bandung, West Java, Indonesia ( www.upi.edu ). The physical activity education course of interest is offered to first-year and second-year undergraduate students as required (mandatory) learning or as an elective. To clarify, there are also university courses, called Physical Education Teacher Education (PETE) programs, that aim to educate preservice physical education teachers to prepare them to become beginning teachers in school settings. However, the physical activity education course (hereafter called the physical education course) in this study is a unit of study whose main outcome is to promote active lifestyles or lifelong physical activity for all undergraduate students. In the Indonesian higher education system, the workload of the physical education course is expressed in semester credit units. In this system, one credit is equivalent to 48 hours per semester, delivered across 16 meetings of 3 hours each, consisting of scheduled lecture activities and structured and independent assignments. This credit system differs from the European Credit Transfer and Accumulation System (ECTS), in which one credit corresponds to a workload of 25 hours of study (2.5 hours across 10 meetings per semester). Converted to ECTS, the student workload for this two-credit course is (48/25) × 2 credits = 3.84 ECTS.
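The credit-to-ECTS conversion above can be reproduced in a few lines of Python; this is only a worked restatement of the arithmetic in the preceding paragraph, not part of the protocol itself.

```python
# Workload conversion described above: one Indonesian semester credit unit
# equals 48 hours/semester, one ECTS credit equals 25 hours, and the course
# is worth 2 credits, so 2 x 48 / 25 = 3.84 ECTS.
HOURS_PER_CREDIT = 48
HOURS_PER_ECTS = 25
course_credits = 2

ects = course_credits * HOURS_PER_CREDIT / HOURS_PER_ECTS
print(f"{ects:.2f} ECTS")  # 3.84
```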

Phase 1. Needs analysis and creation of initial design principles

The purpose of Phase 1 is the creation of the initial design principles from the systematic literature review and needs analysis with key stakeholders. In addition, this phase will also identify appropriate technologies that will facilitate the implementation of a physical education course for increasing university students’ physical activity knowledge, motivation and levels.

Systematic review (study I)

Purpose . This systematic review aims to form initial design principles focusing on implementing a physical education course for university settings to improve university students’ physical activity knowledge, motivation and levels.

The first study in this project is a systematic literature review that aims to identify, critically appraise, and summarise the best available evidence regarding the effectiveness of technology-supported university courses for increasing the physical activity knowledge, motivation and levels of university students. This systematic review will follow the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines [ 74 ]. The protocol of the study will be registered with PROSPERO (ID: CRD42020210327). The process will involve planning the review, searching for and selecting studies, data collection, risk of bias assessment, analysis, and interpretation of results. The PICO (Population, Intervention, Comparator, and Outcome) formula is used in this study to focus the review question. The population is university or college students, or students enrolled in a higher education institution. The intervention is technology-based physical education or a course-based intervention using technology, with technology limited to online delivery, learning management systems, websites, wearable devices, mobile applications, activity trackers, and blended learning using technology. The comparator is studies with a control group, including RCTs, non-RCTs, or quasi-experimental studies, describing interventions using technology-supported university courses targeting physical activity among university students. The primary outcome will be physical activity levels. Eight electronic bibliographic databases will be searched, including CINAHL, ERIC, MEDLINE, ProQuest, PsycINFO, Scopus, SPORTDiscus, and Web of Science, covering 1st January 2010 to 31st December 2020. A 10-item quality assessment scale derived from Van Sluijs and colleagues [ 75 ] will be used to assess the quality of the selected studies.

Qualitative focus group study (study 2)

Purpose . The purpose of this study is to confirm and add to the initial series of design principles to inform the design of Phase 2.

Setting access and recruitment . There will be three focus group discussions with key stakeholders, including administrators and directors, course instructors/lecturers, and students. The first discussion is with key stakeholders such as the curriculum development team, the course coordinator and the Dean of the Faculty of Sport and Health Education. This discussion will be guided by a form to confirm that the content of the initial design principles is appropriate, considering the learning outcomes and the current technologies provided and supported by the university. The second focus group discussion is with lecturers who teach the physical education course; in this discussion, lecturers will be asked to answer questions focusing on the system and content of the course. The third focus group discussion will be with students who have previously enrolled in and completed the physical education course, with questions focusing on the best technologies for their learning. The semi-structured focus group discussion guidelines are attached as a supporting information file [see S1 Appendix ].

Participation in these focus groups is voluntary. All participants will be invited using email and will be given the Participant Information Statements. A signed Participant Consent Form will be needed to participate in the focus group discussion.

Sampling . There will be three focus group discussions. In the first focus group discussion, four key stakeholders or policy makers will be invited to participate: 1) a physical education course lecturer; 2) the course coordinator; 3) a member of the university curriculum development team; and 4) the Dean of the Faculty of Sport and Health Education. The second focus group discussion will involve six lecturers randomly selected from the 30 lecturers who currently implement the physical education course. The third focus group discussion will involve 12 undergraduate students who have previously enrolled in the physical education course. The sample sizes of focus group discussions 1, 2 and 3 are intended to enable data saturation.

Data analysis . Qualitative data from the focus group discussions will be analysed using thematic analysis [ 76 ]. The analysis will involve transcribing the data before coding individual comments into categories determined by the research question. Each category will then be sub-coded and investigated in more detail. This method will enable issues and themes to emerge from the data, and conclusions drawn from these themes will be used to reinforce the initial design principles derived from the systematic review.

Phase 2. Development of a prototype technology-supported physical education course based on initial design principles

The purpose of Phase 2 is to develop a prototype based on the initial design principles from Phase 1. The design includes content and technical development. The content design will be reviewed by three physical education experts. Content development will produce a paper-based prototype of the technology that shows how the technology aligns with the initial design principles, while technical development will produce the design of a Learning Management System (LMS) and a mobile application (app). Technical design includes the User Experience (UX), represented by a flowchart of how the LMS and app work, and the User Interface (UI), represented by a series of visual designs showing how the LMS and app will look. The next step is for IT developers to build the LMS and app from these technical designs, creating a prototype for the technology-supported physical education course.

Phase 3. Evaluation and testing of prototype technology-supported physical education course

The purpose of Phase 3 is to test and evaluate the prototype from Phase 2. In this phase, a pilot study will be conducted to implement and test the prototype from the previous phases to examine the feasibility and acceptability of a randomised controlled trial that will be conducted in Phase 6.

Pilot study (study 3).

Purpose . The pilot study aims to test the prototype with a small sample size to examine the feasibility and acceptability of the intervention.

Setting access and recruitment . The pilot study will include two classes of a semester physical education course. One of the classes will be the intervention group, and the other will be the control group. The intervention group will receive a course incorporated with the prototype for the technology-supported physical education course. The control group will receive the usual physical education course. Three outcomes will be measured pre- and post-intervention. This phase also includes developing the research instruments to determine validity and reliability.

There are two recruitment processes in this pilot study: one for the lecturer and one for students. In the lecturer recruitment step, the researcher will invite a lecturer who participated in the focus group discussion in Phase 1. The lecturer will be included in the pilot study if he/she meets the following inclusion criteria: 1) runs two classes of the physical education course; 2) is willing to take part in lecturer training before the physical education course class begins; and 3) is willing to invite their students to take part in the pilot study.

The lecturer will invite their students to participate in the pilot study in the first week of the physical education course. Participation in this pilot study is voluntary, and all participants will be given Participant Information Statements and invited to sign a Participant Consent Form to participate. Participation or non-participation in this study will not affect students’ scores in physical education courses. The number of students in each class varies from 20–50 students. Students will be included in the pilot study if they meet the following inclusion criteria: 1) students enrolled in a physical education course with a lecturer who will participate in the pilot study; 2) voluntarily participate in the pilot study; and 3) owns an android smartphone.

Intervention . The intervention group will receive 16 weeks of the physical education course incorporated with the prototype of technology-supported physical education course implemented by their lecturer. The intervention and control group will have the same content (See Table 1 ); however, the intervention group will have access to the prototype of technology. The prototype will be developed in Phase 2 using the initial design principles generated from the focus group discussions and systematic review in Phase 1.

[Table 1. https://doi.org/10.1371/journal.pone.0269759.t001]

Instruments . Data will be collected on students' physical activity knowledge, motivation and levels, since the outcome of the course is changing students' physical activity behaviour, knowledge and motivation to encourage lifelong physical activity. Students' physical activity levels will be measured using the International Physical Activity Questionnaire (IPAQ) short form. Students' physical activity knowledge will be measured by a questionnaire that will be developed in the pilot study based on the literature [ 32 , 33 , 77 ]. A physical education expert will confirm the content validity of the questionnaire, and a reliability test of the instrument will be conducted in the pilot study. Students' motivation for physical activity and exercise will be measured using the Behavioural Regulation in Exercise Questionnaire-2 (BREQ-2) [ 78 ]. The BREQ-2 is a 19-item questionnaire with five subscales (amotivation, external regulation, introjected regulation, identified regulation, and intrinsic regulation) that assesses motivation to exercise. Construct validity studies have shown that the BREQ-2 is valid for measuring exercise motivation among university students [ 79 ]. The validity and reliability of an Indonesian version of the BREQ-2 will also be examined in this pilot study.
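For readers unfamiliar with how IPAQ short-form responses are turned into a physical activity level, the sketch below illustrates the commonly used MET-minutes/week scoring, assuming the standard weightings of 3.3 (walking), 4.0 (moderate) and 8.0 (vigorous) METs; the study itself will follow the official IPAQ scoring protocol, so treat these values and the example response as illustrative only.

```python
# MET weightings commonly used for IPAQ short-form scoring; confirm against
# the official IPAQ scoring protocol before using in a real analysis.
MET_WEIGHTS = {"walking": 3.3, "moderate": 4.0, "vigorous": 8.0}

def weekly_met_minutes(days: dict, minutes_per_day: dict) -> float:
    """Total MET-minutes/week = sum of MET weight x minutes/day x days/week."""
    return sum(
        MET_WEIGHTS[activity] * minutes_per_day[activity] * days[activity]
        for activity in MET_WEIGHTS
    )

# Hypothetical student response to the IPAQ short form.
days = {"walking": 5, "moderate": 3, "vigorous": 2}
minutes_per_day = {"walking": 30, "moderate": 40, "vigorous": 20}

print(f"{weekly_met_minutes(days, minutes_per_day):.0f} MET-minutes/week")
```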

Statistics and data analysis . Data will also be gathered from class observations and interviews with the lecturers and students. The class observation will use a semi-structured observation protocol based on ISO 9126 evaluation model for e-learning [ 80 ], focusing on functionality, reliability, efficiency, usability, maintainability, and portability of the prototypes that will be analysed for refining design principles.

Descriptive statistics (mean and standard deviation) will be presented for each group separately. Changes in physical activity levels (MET: Metabolic Equivalent of Task), physical activity knowledge, and motivation from pre-test to post-test will be assessed using ANCOVA. As this is a pilot study, it will be underpowered; therefore, quantitative outcomes will be interpreted only as feasibility and acceptability measures.
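A minimal sketch of the kind of ANCOVA described above is shown below, written in Python with statsmodels and entirely hypothetical pre/post MET-minute values; the protocol does not specify the analysis software, so this is an illustration rather than the planned implementation.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical pre/post MET-minutes for a few pilot participants.
data = pd.DataFrame({
    "group": ["intervention"] * 5 + ["control"] * 5,
    "pre":  [900, 1100, 800, 1200, 950, 1000, 870, 1150, 920, 1050],
    "post": [1400, 1500, 1200, 1600, 1350, 1050, 900, 1180, 980, 1100],
})

# ANCOVA: post-test score modelled from group, adjusting for the pre-test
# score as a covariate (equivalent to a one-way ANCOVA for two groups).
model = smf.ols("post ~ C(group) + pre", data=data).fit()
ancova_table = sm.stats.anova_lm(model, typ=2)
print(ancova_table)
```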

Phase 4. Revision of initial design principles

The purpose of Phase 4 is to modify the initial design principles from Phase 1 based on data gathered in Phase 3 (the pilot study, observations, measurements and feedback from the lecturer and students). In this phase, the initial design principles will be revised to produce the revised set of design principles that will guide the next version of the prototype technology-supported physical education course.

Phase 5. Modification of prototype technology-supported physical education course based on revised design principles

The purpose of Phase 5 is to modify the prototype based on Phase 4’s revised design principles. The revised design principles in the previous phase will be a foundation to redesign and build the technology-based physical education course. The next step is an expert review to find problems and recommendations for the technology. The final step in this phase is the refinement and modification of the technology prototype.

Phase 6. Evaluation and testing of modified prototype technology-supported physical education course (randomised controlled trial)

In this phase, the refined and modified prototype will be tested and evaluated. The randomised controlled trial will be conducted to examine the effectiveness of a technology-supported physical education course on increasing university students’ physical activity levels, knowledge, and motivation.

Randomised controlled trial study (study 4).

This is a two-arm parallel randomised controlled trial of a technology-supported physical education course intervention for students enrolled in an elective unit of study. The randomised controlled trial protocol is registered with the Australian New Zealand Clinical Trials Registry (ANZCTR); the registration number is ACTRN12622000712707 (registered 18/05/2022). The RCT adheres to the SPIRIT guidelines for reporting clinical trial study protocols [see S1 Table ]. The SPIRIT schedule can be seen in Fig 3 .

[Fig 3. https://doi.org/10.1371/journal.pone.0269759.g003]

Purpose . The randomised controlled trial aims to test and evaluate the modified prototype with a larger sample size.

Hypothesis . A technology-supported physical education course intervention will increase university students' physical activity levels, knowledge and motivation more effectively than a non-technology-supported physical education course.

Setting access and participants . The randomised controlled trial will include six lecturers and six classes (20 to 50 students per class). The recruitment process and intervention in the randomised controlled trial will be a modified version of those used in the pilot study. The six lecturers who agree to participate will be randomly assigned to the intervention or control condition. Each lecturer will invite their students to participate in the RCT in the first week of the physical education course; students' allocation to the intervention or comparator (control) group is therefore determined by the lecturer whose class they are enrolled in. Participation in this randomised controlled trial is voluntary, and all lecturers and students will be given Participant Information Statements and be invited to sign a Participant Consent Form to participate.

Sample size including power calculation . The sample size calculation is based on the difference in change in the primary outcome (physical activity) from pre- to post-intervention in both groups (intervention and control group). Based on previous studies of technology-based physical activity interventions among university students, mean effect sizes of around d = 0.5 are expected in analyses [ 47 , 81 ]. To detect such intervention effects in two-sided significance testing (α = .05) with a power of 80%, a sample size of 128 participants is required. Considering an expected study drop-out of about 20% and response rates of about 50%, 300 participants will be included in the study.
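The stated sample size can be reproduced with a standard power calculation. The sketch below uses Python's statsmodels purely to illustrate the assumptions reported above (d = 0.5, two-sided α = .05, 80% power, ~20% drop-out); it is not the tool the authors state they used.

```python
import math
from statsmodels.stats.power import TTestIndPower

# Reproduce the stated assumptions: d = 0.5, two-sided alpha = .05, power = 80%.
analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.80,
                                   alternative="two-sided")
total_n = 2 * math.ceil(n_per_group)   # ~64 per group, ~128 in total

# Inflate for the expected ~20% drop-out; further inflation for the ~50%
# response rate yields the larger recruitment target reported above.
recruit_after_dropout = math.ceil(total_n / (1 - 0.20))

print(f"per group: {math.ceil(n_per_group)}, total: {total_n}, "
      f"after 20% drop-out: {recruit_after_dropout}")
```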

Number of participants

N = 6 physical education course lecturers

(a) n = 3 will be assigned to the intervention group,

(b) n = 3 will be assigned to the control group.

n = 300 students will be randomised 1:1 to one of two arms:

(a) n = 150 to the technology-supported physical education course (intervention group)

(b) n = 150 to the non-technology-supported physical education course (control group).
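Because allocation follows the lecturers' classes, a minimal sketch of the 3:3 lecturer-level randomisation might look like the following; the lecturer identifiers and seed are purely illustrative and not part of the protocol.

```python
import random

# Hypothetical lecturer identifiers; the protocol randomises the six
# participating lecturers (and hence their classes) 3:3 to the two arms.
lecturers = ["L1", "L2", "L3", "L4", "L5", "L6"]

random.seed(2022)            # seed only so the example is reproducible
random.shuffle(lecturers)
intervention_lecturers = sorted(lecturers[:3])
control_lecturers = sorted(lecturers[3:])

print("intervention:", intervention_lecturers)
print("control:", control_lecturers)
```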

Teacher training . The lecturers in the intervention group will receive two days of teacher training on how to deliver the technology-supported physical education course prior to the intervention.

Intervention . The intervention group will receive 16 weeks of the physical education course incorporating the modified prototype from the pilot study in Phase 3.

Instrument . Data will be gathered using the valid and reliable instruments developed in the pilot study, together with fidelity checks and interviews, to inform the final design principles.

Phase 7. Final design principles

This is the final phase of the study, drawing on all of the data collected in Phases 1 to 6 to create a final series of design principles to aid future practitioners who will design, implement, and evaluate technology-supported physical education courses for university students.

The purpose of this study is to evaluate a technology-supported physical education course designed to increase students' physical activity levels, knowledge and motivation. The intervention uses a design-based research approach to determine the technologies and features that are most effective in supporting university students in attaining physical activity and motivational outcomes at one university in Indonesia. Determining these technologies and features will help support and facilitate university students' physical activity and motivation, and will foster the scaling-up and sustainability of this course in this setting and perhaps in other university settings, which is an important facet of publishing protocol papers [ 82 , 83 ]. This manuscript reports on the systematic creation of a series of design principles generated from the literature and from focus group discussions with key stakeholders, and modified through cycles of robust research testing. The systematic review study of this project has been completed and published [ 84 ]. The four design principles generated from the review will be confirmed and enhanced in the series of focus group discussions with key stakeholders (Phase 1) before the prototype is built from the initial design principles (Phase 2). The prototype will be tested and evaluated in a small-sample pilot study to ensure its feasibility and validity (Phase 3). Modifications will then be made based on the pilot study findings (Phase 4), with the modified prototype (Phase 5) tested and evaluated in a larger randomised controlled trial (Phase 6). The final design principles will then be created (Phase 7) to aid future researchers and practitioners who will design, implement, and evaluate university physical education courses. The design principles generated from this project will help future practitioners design, implement, and evaluate technology-supported university physical education courses that aim to enhance university students' physical activity knowledge, motivation and levels. However, this project has limitations. The most important limitation is that the project will be conducted during the COVID-19 pandemic, when face-to-face activities in the focus groups, the pilot study and the randomised controlled trial will be restricted. In addition, the pandemic will also limit the use of objective measures of physical activity (i.e., accelerometers), as these instruments require a face-to-face setting and application. Hence, only subjective measures of physical activity levels will be used in this study.

Supporting information

S1 Appendix. Semi-structured focus group discussion guidelines.

https://doi.org/10.1371/journal.pone.0269759.s001

S1 Table. SPIRIT 2013 checklist: Recommended items to address in a clinical trial protocol and related documents*.

https://doi.org/10.1371/journal.pone.0269759.s002

https://doi.org/10.1371/journal.pone.0269759.s003

Design-Based Research


In an educational setting, design-based research is a research approach that engages in iterative designs to develop knowledge that improves educational practices. This chapter will provide a brief overview of the origin, paradigms, outcomes, and processes of design-based research (DBR). In these sections we explain that (a) DBR originated because some researchers believed that traditional research methods failed to improve classroom practices, (b) DBR places researchers as agents of change and research subjects as collaborators, (c) DBR produces both new designs and theories, and (d) DBR consists of an iterative process of design and evaluation to develop knowledge.

Origin of DBR

DBR originated as researchers like Allan Collins (1990) and Ann Brown (1992) recognized that educational research often failed to improve classroom practices. They perceived that much of educational research was conducted in controlled, laboratory-like settings. They believed that this laboratory research was not as helpful as possible for practitioners.

Proponents of DBR claim that educational research is often detached from practice (The Design-Based Research Collective, 2003). There are at least two problems that arise from this detachment: (a) practitioners do not benefit from researchers’ work and (b) research results may be inaccurate, because they fail to account for context (The Design-Based Research Collective, 2003).

Practitioners do not benefit from researchers’ work if the research is detached from practice. Practitioners are able to benefit from research when they see how it can inform and improve their designs and practices. Some practitioners believe that educational research is often too abstract or sterilized to be useful in real contexts (The Design-Based Research Collective, 2003).

Not only is lack of relevance a problem, but research results can also be inaccurate by failing to account for context. Findings and theories based on lab results may not accurately reflect what happens in real-world educational settings.

Conversely, a problem that arises from an overemphasis on practice is that while individual practices may improve, the general body of theory and knowledge does not increase. Scholars like Collins (1990) and Brown (1992) believed that the best way to conduct research would be to achieve the right balance between theory-building and practical impact.

Paradigms of DBR

Proponents of DBR believe that conducting research in context, rather than in a controlled laboratory setting, and iteratively designing interventions yields authentic and useful knowledge. Barab and Squire (2004) state that the goal of DBR is to “directly impact practice while advancing theory that will be of use to others” (p. 8). This implies “a pragmatic philosophical underpinning, one in which the value of a theory lies in its ability to produce changes in the world” (p. 6). The aims of DBR and the roles of researchers and subjects are informed by this philosophical underpinning.

Aims of DBR

Traditional, experimental research is conducted by theorists focused on isolating variables to test and refine theory. DBR is conducted by designers focused on (a) understanding contexts, (b) designing effective systems, and (c) making meaningful changes for the subjects of their studies (Barab & Squire, 2004; Collins, 1990). Traditional methods of research generate refined understandings of how the world works, which may indirectly affect practice. In DBR there is an intentionality in the research process to both refine theory and practice (Collins et al., 2004).

Role of DBR Researcher

In DBR, researchers assume the roles of “curriculum designers, and implicitly, curriculum theorists” (Barab & Squire, 2004, p. 2). As curriculum designers, DBR researchers come into their contexts as informed experts with the purpose of creating, “test[ing] and refin[ing] educational designs based on principles derived from prior research” (Collins et al., 2004, p. 15). These educational designs may include curricula, practices, software, or tangible objects beneficial to the learning process (Barab & Squire, 2004). As curriculum theorists, DBR researchers also come into their research contexts with the purpose of refining extant theories about learning (Brown, 1992).

This duality of roles for DBR researchers contributes to a greater sense of responsibility and accountability within the field. Traditional, experimental researchers isolate themselves from the subjects of their study (Barab & Squire, 2004). This separation is seen as a virtue, allowing researchers to make dispassionate observations as they test and refine their understandings of the world around them. In comparison, design-based researchers “bring agendas to their work,” see themselves as necessary agents of change, and hold themselves accountable for the work they do (Barab & Squire, 2004, p. 2).

Role of DBR Subjects

Within DBR, research subjects are seen as key contributors and collaborators in the research process. Classic experimentalism views the subjects of research as things to be observed or experimented on, suggesting a unidirectional relationship between researcher and research subject. The role of the research subject is to be available and genuine so that the researcher can make meaningful observations and collect accurate data. In contrast, design-based researchers view the subjects of their research (e.g., students, teachers, schools) as “co-participants” (Barab & Squire, 2004, p. 3) and “co-investigators” (Collins, 1990, p. 4). Research subjects are seen as necessary in “helping to formulate the questions,” “making refinements in the designs,” “evaluating the effects of...the experiment,” and “reporting the results of the experiment to other teachers and researchers” (Collins, 1990, pp. 4-5). Research subjects are co-workers with the researcher in iteratively pushing the study forward.

Outcomes of DBR

DBR develops knowledge through this collaborative, iterative research process. The knowledge developed by DBR can be separated into two categories: (a) tangible, practical outcomes and (b) intangible, theoretical outcomes.

Tangible Outcomes

A major goal of design-based research is producing meaningful interventions and practices. Within educational research these interventions may “involve the development of technological tools [and] curricula” (Barab & Squire, 2004, p. 1). But more than just producing meaningful educational products for a specific context, DBR aims to produce meaningful, effective educational products that can be transferred and adapted (Barab & Squire, 2004). As expressed by Brown (1992), “an effective intervention should be able to migrate from our experimental classroom to average classrooms operated by and for average students and teachers” (p. 143).

Intangible Outcomes

It is important to recognize that DBR is not only concerned with improving practice but also aims to advance theory and understanding (Collins et al., 2004). DBR’s emphasis on the importance of context enhances the knowledge claims of the research. “Researchers investigate cognition in context...with the broad goal of developing evidence-based claims derived from both laboratory-based and naturalistic investigations that result in knowledge about how people learn” (Barab & Squire, 2004, p. 1). This new knowledge about learning then drives future research and practice.

Process of DBR

A hallmark of DBR is the iterative nature of its interventions. As each iteration progresses, researchers refine and rework the intervention, drawing on a variety of research methods that best fit the context. This flexibility allows the end result to take precedence over the process. While each researcher may use different methods, McKenney and Reeves (2012) outlined three core processes of DBR: (a) analysis and exploration, (b) design and construction, and (c) evaluation and reflection. To put these ideas in context, we will refer to a recent DBR study completed by Siko and Barbour regarding the use of PowerPoint games in the classroom.
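
To make the cyclical structure concrete before the three processes are discussed in turn, the sketch below models an intervention passing repeatedly through analysis, design, and evaluation until a target outcome is reached. It is a minimal illustration only: the function names, revisions, and outcome values are hypothetical and are not drawn from McKenney and Reeves or from any DBR toolkit.

```python
# Minimal illustrative sketch of the iterative DBR cycle described above.
# The functions are hypothetical placeholders, not an actual methodology API.
from dataclasses import dataclass, field

@dataclass
class Intervention:
    design: str
    iteration: int = 0
    findings: list = field(default_factory=list)

def analyze_and_explore(intervention, context_notes):
    """Understand the problem and context before (re)designing."""
    intervention.findings.append(f"analysis: {context_notes}")

def design_and_construct(intervention, revision):
    """Revise the intervention based on the latest analysis."""
    intervention.design = revision
    intervention.iteration += 1

def evaluate_and_reflect(intervention, observed_outcome, goal):
    """Record the outcome of the enacted design and decide whether another cycle is needed."""
    intervention.findings.append(f"iteration {intervention.iteration}: outcome {observed_outcome}")
    return observed_outcome >= goal

# One hypothetical run of the cycle
intervention = Intervention(design="baseline classroom activity")
revisions = ["lengthen time on the activity", "add explicit instruction on key skills"]
outcomes = [0.55, 0.72]  # e.g., mean assessment scores observed after each iteration

for revision, outcome in zip(revisions, outcomes):
    analyze_and_explore(intervention, "classroom observations and prior literature")
    design_and_construct(intervention, revision)
    if evaluate_and_reflect(intervention, outcome, goal=0.70):
        break

print(intervention.iteration, intervention.design, intervention.findings)
```

The point of the sketch is simply that each pass through the loop both revises the design and records findings, mirroring DBR’s dual commitment to improving practice and building theory.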

The Iterative Process of Design-Based Research

Analysis and Exploration

Analysis is a critical aspect of DBR and must be used throughout the entire process. At the start of a DBR project, it is critical to understand and define which problem will be addressed. In collaboration with practitioners, researchers seek to understand all aspects of a problem. Additionally, they “seek out and learn from how others have viewed and solved similar problems” (McKenney & Reeves, 2012, p. 85). This analysis helps to provide an understanding of the context within which to execute an intervention.

Since theories cannot account for the variety of variables in a learning situation, exploration is needed to fill the gaps. DBR researchers can draw from a number of disciplines and methodologies as they execute an intervention. The decision of which methodologies to use should be driven by the research context and goals.

Siko and Barbour (2016) used the DBR process to address a gap they found in research regarding the effectiveness of having students create their own PowerPoint games to review for a test. In analyzing existing research, they found studies that stated teaching students to create their own PowerPoint games did not improve content retention. Siko and Barbour wanted to “determine if changes to the implementation protocol would lead to improved performance” (Siko & Barbour, 2016, p. 420). They chose to test their theory in three different phases and adapt the curriculum following each phase.

Design and Construction

Informed by the analysis and exploration, researchers design and construct interventions, which may be a specific technology or “less concrete aspects such as activity structures, institutions, scaffolds, and curricula” (Design-Based Research Collective, 2003, pp. 5–6). This process involves laying out a variety of options for a solution and then creating the idea with the most promise.

In their design, Siko and Barbour planned to observe a control group and a treatment group across three phases, using t-tests in each phase to compare the two groups’ scores on two unit tests. They worked with teachers to implement time for playing PowerPoint games as well as a discussion of what makes games successful. The first implementation was a control phase that replicated past research and established a baseline. Once they finished that phase, they began to evaluate.
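
Siko and Barbour (2016) report the details of their analysis in the published study; the snippet below is only a hedged illustration of the kind of comparison described here, using simulated scores and hypothetical group sizes to run an independent-samples t-test on each of the two unit tests for a single phase.

```python
# Illustrative sketch of the per-phase comparison described above.
# Scores are simulated; this is not Siko and Barbour's actual data or code.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Simulated unit-test scores (percent correct) for one phase
scores = {
    "unit_test_1": {"control": rng.normal(74, 10, 30), "treatment": rng.normal(76, 10, 30)},
    "unit_test_2": {"control": rng.normal(71, 10, 30), "treatment": rng.normal(78, 10, 30)},
}

for test_name, groups in scores.items():
    t_stat, p_value = stats.ttest_ind(groups["treatment"], groups["control"])
    print(f"{test_name}: t = {t_stat:.2f}, p = {p_value:.3f}")
```

Whether such a comparison is appropriate depends on the actual study design (for example, how classes were assigned), so the published report should be consulted for the real analysis.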

Evaluation and Reflection

Researchers can evaluate their designs both before and after use. The cyclical process involves careful, constant evaluation for each iteration so that improvements can be made. While tests and quizzes are a standard way of evaluating educational progress, interviews and observations also play a key role, as they allow for better understanding of how teachers and students might see the learning situation.

Reflection allows the researcher to make connections between actions and results. Researchers must take the time to analyze what changes allowed them to have success or failure so that theory and practice at large can benefit. Collins (1990) states:

It is important to analyze the reasons for failure and to take steps to fix them. It is critical to document the nature of the failures and the attempted revisions, as well as the overall results of the experiment, because this information informs the path to success. (p. 5)

As researchers reflect on each change they made, they find what is most useful to the field at large, whether it be a failure or a success.

After evaluating results of the first phase, Siko and Barbour revisited the literature of instructional games. Based on that research, they first tried extending the length of time students spent creating the games. They also discovered that the students struggled to design effective test questions, so the researchers tried working with teachers to spend more time explaining how to ask good questions. As they explored these options, researchers were able to see unit test scores improve.

Reflection on how the study was conducted allowed the researchers to properly place their experiences within the context of existing research. They recognized that while they found positive impacts as a result of their intervention, there were a number of limitations with the study. This is an important realization for the research, as it keeps readers from misinterpreting the scope of the findings.

This chapter has provided a brief overview of the origin, paradigms, outcomes, and processes of Design-Based Research (DBR). We explained that (a) DBR originated because some researchers believed that traditional research methods failed to improve classroom practices, (b) DBR places researchers as agents of change and research subjects as collaborators, (c) DBR produces both new designs and theories, and (d) DBR consists of an iterative process of design and evaluation to develop knowledge.

Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. Journal of the Learning Sciences, 13(1), 1–14.

Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences, 2(2), 141–178.

Collins, A. (1990). Toward a design science of education (Report No. 1). Washington, DC: Center for Technology in Education.

Collins, A., Joseph, D., & Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. Journal of the Learning Sciences, 13(1), 15–42.

McKenney, S., & Reeves, T. C. (2012). Conducting educational design research. New York, NY: Routledge.

Siko, J. P., & Barbour, M. K. (2016). Building a better mousetrap: How design-based research was used to improve homemade PowerPoint games. TechTrends, 60(5), 419–424.

The Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5–8.

This content is provided to you freely by EdTech Books.

Access it online or download it at https://edtechbooks.org/studentguide/design-based_research .


A Design-Based Research Approach to the Teaching and Learning of Multiliteracies

  • Regular Article
  • Published: 15 September 2022
  • Volume 32, pages 641–653 (2023)


  • Fei Victor Lim, ORCID: orcid.org/0000-0003-3046-1011


Against the backdrop of the expansion of the literacy curriculum to include multiliteracies in education systems around the world, we discuss how a design-based research approach can contribute to practical outcomes in building the participating teachers’ confidence and competence in their pedagogical practices, developing scalable lesson resources for other teachers to use and adapt, and distilling design principles that can inform the teaching of multiliteracies across contexts. In this paper, we introduce the research project we have implemented based on a design-based research (DBR) approach and describe the features of the project in relation to the characteristics of DBR. We highlight DBR’s focus on improving practice and discuss how the classroom practices of the participating teachers have improved. We also draw on the teachers’ perceptions, gathered from interviews and their reflections, to suggest that they have grown professionally in their confidence. Through the co-design sessions between the researchers and teachers, we have distilled design principles, that is, the learning processes for the teaching and learning of multiliteracies. These learning processes are applied to guide the design of the lesson packages as resources for teachers. While it is well recognized that teachers are critical in the implementation of curricular reforms, we are interested in understanding how researchers and teachers can work together through a design-based research approach to fully implement the curricular reforms.



Acknowledgements

 The author would like to acknowledge the contributions of Dr Nguyen Thi Thu Ha, Dr Lydia Tan-Chia, and Tan Jia Min in the research project, and thank the schools, teachers, and students for their participation.

This study was funded by Singapore Ministry of Education (MOE) under the Education Research Funding Programme (DEV 01/18 VL) and administered by National Institute of Education (NIE), Nanyang Technological University, Singapore. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the Singapore MOE and NIE. This research has received clearance from the NTU-Institutional Review Board [IRB-2019-2-038].

Author information

Authors and Affiliations

National Institute of Education, Nanyang Technological University, 1 Nanyang Walk, Singapore, 637616, Singapore

Fei Victor Lim


Corresponding author

Correspondence to Fei Victor Lim .


About this article

Lim, F.V. A Design-Based Research Approach to the Teaching and Learning of Multiliteracies. Asia-Pacific Edu Res 32, 641–653 (2023). https://doi.org/10.1007/s40299-022-00683-0


Accepted: 24 August 2022

Published: 15 September 2022

Issue Date: October 2023

DOI: https://doi.org/10.1007/s40299-022-00683-0


  • Multiliteracies
  • Multimodality
  • Metalanguage
  • Professional learning



