The use of frameworks in chemistry education research

Jon-Marc G. Rodriguez *a, Jocelyn Elizabeth Nardo b, Solaire A. Finkenstaedt-Quinn c and Field M. Watts a
aDepartment of Chemistry & Biochemistry, University of Wisconsin – Milwaukee, Milwaukee, WI 53211, USA. E-mail: rodrigjg@uwm.edu; wattsf@uwm.edu
bDepartment of Chemistry & Biochemistry, The Ohio State University, Columbus, OH 43210, USA. E-mail: nardo.11@osu.edu
cDepartment of Chemistry, University of Michigan, Ann Arbor, Michigan 48109-1055, USA. E-mail: quinnsa@umich.edu

Received 17th June 2023 , Accepted 1st September 2023

First published on 1st September 2023


Abstract

Extant literature has emphasized the importance of education research being theory-based. To this end, many research articles have a distinct “theoretical framework” section describing the theoretical underpinnings that inform the research. Nevertheless, there is considerable variation in how explicit articles are regarding their use of frameworks in the research process. This work describes a literature review focusing on the use of frameworks (broadly defined) in chemistry education research. Our sample draws on research articles published in Chemistry Education Research and Practice and the Journal of Chemical Education from 2018 to 2021 (n = 457). The longitudinal analysis revealed general trends about the presence of frameworks in research articles over four years, as well as the types of frameworks commonly used. In addition, we analyzed how frameworks were used within individual research articles published in 2021, focusing on chemistry education research articles as well as research articles published across biology, engineering, mathematics, and physics education research journals (n = 595). By describing how frameworks were used, our goal is to open a dialogue and inform future chemistry education research.


Introduction

Research on teaching and learning within science, technology, engineering, and mathematics (STEM) has increased considerably, giving rise to a collection of fields called discipline-based education research (Singer et al., 2012; Henderson et al., 2017; Trujillo and Long, 2018). Discipline-based education research (DBER) fields each reflect an interdisciplinary agenda, drawing from educational psychology, the learning sciences, and STEM content expertise to seek evidence-based knowledge and practices that improve our understanding of teaching and learning (Slater, 2011; Henderson et al., 2017). To this end, DBER consistently draws upon research methodologies from educational psychology and the learning sciences, including both qualitative (Denzin and Lincoln, 2005; Charmaz, 2006) and quantitative approaches (Petscher et al., 2013). In addition, DBER often involves incorporating a guiding framework, which may be theoretical in nature, as part of the research process. Despite the use of theoretical frameworks being a norm within education research communities, prior work in the context of the mathematics education research community indicates that using theoretical frameworks poses challenges for emerging education researchers (Haas et al., 2022). This is likely a challenge shared across DBER communities, given the common critique across DBER fields that theoretical frameworks are inconsistently used and that studies are often undertheorized (Lohmann and Froyd, 2010; Reinholz and Andrews, 2019; Bussey et al., 2020; Bakker et al., 2021). To provide researchers and practitioners with an understanding of the role of frameworks in chemistry education research (CER), the goal of this review is to investigate how frameworks are used in CER and to contextualise the findings by examining the use of frameworks across DBER more broadly.
As part of this review, we seek to provide a primer to support individuals at the periphery of our community of practice, such as emerging education researchers and instructors interested in knowing more about education research and the use of frameworks.

Frameworks in CER

The CER community commonly emphasizes the importance of theoretical frameworks for guiding research, and frameworks are considered an important component to include when publishing research findings (Towns, 2013; Seery et al., 2019; Stains, 2022). Within the CER community, one useful resource related to theoretical frameworks is Theoretical Frameworks for Research in Chemistry/Science Education by Bodner and Orgill (2007), which provides an overview of the importance of theoretical frameworks in education research, with individual chapters that focus on different theoretical frameworks. They broadly define theoretical frameworks as:

“…a system of ideas, aims, goals, theories and assumptions about knowledge; about how research should be carried out; and about how research should be reported that influences what kind of experiments can be carried out and the type of data that result from these experiments” (p. v, “Prologue”).

One of the key points we would like to draw attention to is that theoretical frameworks should influence interpretive and methodological decisions throughout the research process. This is highlighted in Bodner and Orgill's (2007) book, where each framework is described along with implications for analysis and interpretation processes and techniques. The choice of theoretical framework is consequential because different frameworks may lead to different approaches toward data interpretation, with implications for methodological choices, conclusions reached, and the resulting suggestions made to researchers and practitioners. This has been demonstrated across DBER by studies applying different theoretical frameworks to the same dataset, illustrating how different frameworks lead to different predictions, explanations, and inferences related to students’ reasoning (e.g., Elby, 2000; Southerland et al., 2001; Harrer et al., 2013; Gouvea and Simon, 2018; Lira and Gardner, 2020; Rodriguez and Towns, 2021). In this sense, we can view a theoretical framework as a model that we use to explain and predict phenomena, recognizing that often there are multiple models that describe the same phenomenon, such as different perspectives on the nature and structure of knowledge (Elby, 2000; Rodriguez and Towns, 2021). Thus, criteria related to robust scientific models [e.g., consistency with data, explanatory and predictive power, etc. (Passmore et al., 2016)] can and should be applied to how we think about theoretical frameworks.

With the various ways theoretical frameworks can inform a study, it is important to consider the research goals when selecting a framework to ensure alignment. To provide researchers with guidance in selecting an appropriate theoretical framework, Bodner and Orgill (2007) broadly grouped frameworks thematically into three categories: constructivist frameworks that emphasize the continuous and active development of concepts based on individual and collective observations (e.g., constructivism and social constructivism, symbolic interactionism, models and modelling, pedagogical content knowledge); hermeneutic frameworks that emphasize iterative and cyclic analysis of the relationship between parts of a text (broadly defined) and the whole text to generate meaning, often related to understanding human experiences (e.g., hermeneutics, phenomenology, phenomenography, action research, ethnology and ethnomethodology, situated cognition, communities of practice, narrative analysis); and critical frameworks that emphasize social structures and concerns such as the uneven distribution of power (e.g., critical theory, feminism, Afrocentricity). This organization of theoretical frameworks is not exhaustive, and there may be some overlap of frameworks across categories, as well as potentially productive alternative approaches toward classifying frameworks; nevertheless, the grouping based on Bodner and Orgill (2007) provides a useful starting point for characterizing theoretical frameworks.

The three categories of frameworks presented in Bodner and Orgill (2007) also generally align with three paradigms that often guide education research studies: positivist (and post-positivist), interpretivist, and critical paradigms (Treagust et al., 2014). For instance, studies using constructivist frameworks tend to focus on objectively characterising the nature of students’ content knowledge or conceptual learning, aligning with the (post-)positivist paradigm, which is characterised by seeking scientific objectivity and using experimental designs to develop causal explanations. In contrast, studies using hermeneutic frameworks are more concerned with how students and teachers experience the learning environment, reflecting the values of the interpretivist paradigm and its focus on the situational and contextual nature of human experience. Lastly, studies using critical theory frameworks align with the critical paradigm in that they share a focus on equity and challenging power dynamics in learning contexts. However, the alignment between frameworks and paradigms is complex, as exemplified by constructivism, which can range from personal to critical constructivism (Bodner and Orgill, 2007). Further, as paradigmatic perspectives are often unstated in research articles, it is important to recognize that any alignment suggested is based on our understanding of how different frameworks and paradigms are described in the literature.

Within DBER, researchers also use frameworks that are not necessarily theoretical in nature [e.g., analytical frameworks (Luft et al., 2022; Magana, 2022)]. While these frameworks are distinct from theoretical frameworks, they can still be beneficial for guiding studies, varying in purpose and intended use. From an engineering education perspective, Magana (2022) identifies the purpose of theoretical and conceptual frameworks as “defining, grounding, and explaining the focus of a study”; methodological and analytical frameworks “for planning and executing the methods of a study”; and, lastly, instructional design, pedagogical, and evaluation frameworks “for planning, delivering, and evaluating instruction.” These different types of frameworks are seen in chemistry as well. For example, the Anchoring Concepts Content Map (ACCM) released by the American Chemical Society – Exams Institute describes the content covered in an undergraduate chemistry curriculum (Murphy et al., 2012). This framework, developed by content experts, reflects the goals and organization of content to guide curriculum and assessment and can be viewed as belonging to the category of instructional design, pedagogical, and evaluation frameworks (Magana, 2022). Importantly, the ACCM is not theoretical in nature; that is, it would provide little interpretive, predictive, or explanatory power in a study. Nevertheless, the ACCM can still be used as part of data analysis; for example, deductive coding using the ACCM can be a productive approach to develop themes related to content coverage within studies in a systematic review (Bain and Towns, 2016; Hunter et al., 2022). Similarly, the chemistry triplet has been used extensively within CER to guide data analysis (Johnstone, 1982), and although it can provide insight regarding what makes chemistry challenging, it is not inherently theoretical.
To add to the complexity of framework use in CER, multiple frameworks may be compatible and useful for a single study, and different types of frameworks may be used in tandem to shape an investigation. Furthermore, different types of frameworks (e.g., theoretical, conceptual, analytical, methodological) are often conceptualized in different ways with overlapping meanings, leading to challenges with differentiating between types of frameworks. Therefore, this review focuses on frameworks broadly, and the term frameworks (without any modifier) is used throughout this work to encompass all types of frameworks encountered in studies.

Frameworks for facilitating communication across communities

Each DBER field has its own discipline-based perspectives with shared goals, norms, behaviours, and identity. These specific perspectives are what make the DBER fields individual communities of practice (Wenger, 1998; Rynearson, 2015). Focusing on the CER community of practice and the purpose of this review, one readily identifiable norm is the use of a framework as part of the research process. Although frameworks are not specific to CER, their importance to our community is reflected in the criteria used to evaluate articles submitted to Chemistry Education Research and Practice, which state that “authors should aim to increase the importance of their own study by establishing connections to existing theoretical or conceptual research frameworks” (Seery et al., 2019). A similar sentiment is echoed in the recommendation provided for articles submitted to the Journal of Chemical Education: “CER is theory-based, providing foundational support that situates and shapes research questions, and that guides methodological choices” (Towns, 2013). Given the inherently interdisciplinary nature of CER (and DBER broadly), it is often productive to consider what Wenger (1998) refers to as a “constellation” of interconnected communities of practice (p. 127), reflecting the complex and manifold nature of communities, which often intersect with other communities or fit within the context of a larger community. For example, existing literature describes communities of practice related to inorganic chemistry education (Eppley et al., 2009), organic chemistry education (Leontyev et al., 2020), and chemistry outreach (Santos-Díaz and Towns, 2020, 2021), all of which involve overlapping or nested communities. Furthermore, effective change requires interaction between research and practice communities (Szteinberg et al., 2014), which necessitates making research accessible: available, understandable, and practical in application (Rodriguez and Towns, 2019).

Transformative change also requires interactions across disciplinary communities. Exploring framework use in CER and across DBER fields can facilitate change in STEM higher education; frameworks in particular can connect multiple domains of knowledge, enabling needed connections across the currently siloed STEM higher education landscape (Reinholz and Andrews, 2019). CER often draws frameworks from other communities, such as drawing on mathematics education research for ways to investigate how students use mathematics in the context of chemistry (Bain et al., 2019). Therefore, it is more than shared disciplinary skills, language, and concepts that connect DBER communities: we are connected by theories and frameworks related to concerns such as how students learn and how to promote conceptual change. To this end, we are motivated by the broader goal of connecting communities of practice by using frameworks to facilitate the flow of knowledge across STEM disciplinary domains.

Purpose and guiding questions

This work presents a literature review focused on how the CER and DBER communities use frameworks, broadly defined to include theoretical, conceptual, analytical, and methodological frameworks. The review draws on two samples of data: CER research articles published from 2018 to 2021 and DBER articles published in 2021. The first sample provides insight regarding the presence of frameworks across CER articles and the types of frameworks that have been used; the second sample allows us to contextualize CER within the broader DBER landscape. Our goals are reflected in our guiding research questions:

(1) For CER articles published from 2018 to 2021 (n = 457), what trends emerge related to the presence of frameworks (a) by year and (b) by the methods used?

(2) For DBER articles published in 2021 (n = 595), what trends emerge related to the (a) presence of frameworks and (b) use of frameworks across each article?

(3) What types of frameworks were used in CER articles published from 2018 to 2021 (n = 457)?

Methods

Positionality statement

We would like to acknowledge and share how our positionalities have shaped our approach to this literature review in an effort to engage in reflexivity (Lincoln and Guba, 1985). As emerging scholars and early career researchers in the field of chemistry education, we recognize the importance of theory and frameworks for designing, analysing, and communicating research, and we endeavour to understand how frameworks have recently been utilised in CER and DBER broadly. Specifically, our research team includes two assistant professors, a research scientist, and a postdoctoral researcher working across three institutions in the Midwestern United States. Three of the co-authors’ backgrounds include graduate training in CER, and one co-author's graduate training was in bioanalytical chemistry with a focus on CER as a postdoctoral researcher. Our mutual interest in the study was inspired by our previous experiences working with frameworks and a desire to reflect on the (sometimes implicit) paradigms that shape our research decisions. Collectively, our experience spans constructivist, hermeneutic, and critical theory frameworks, and our studies have drawn upon stances across positivist/post-positivist, interpretive, and critical paradigms, depending on the goals of our respective research projects. Having worked with various types of frameworks across paradigmatic perspectives, we recognize the complexity of frameworks and how they may be used, and we emphasize that our analysis reflects our shared perspectives extending from our varied experiences. We do not intend the findings of our analysis to be prescriptive or definitive, but rather to open dialogue between researchers and practitioners regarding the complexity of frameworks in education research.

In addition, we would like to acknowledge that we are authors on 28 research articles in our sample. We believe our experiences successfully publishing within and outside the field of CER helped us consider what would be meaningful to share with researchers regarding the use of theory and frameworks. To account for potential bias, our thematic analysis consisted of independent and group coding until consensus was reached and all discrepancies were resolved through discussion. Additionally, we sought to draw from supporting texts to determine framework types (e.g., Bodner and Orgill, 2007). We do not believe we are authorities who can dictate how researchers should or should not use frameworks, and the goal of this work is not to evaluate the quality of individual research articles. We echo the sentiment by Ashwin (2012) that a published study can be viewed as the shared product of a community, produced through a social process involving peer review and context-specific group norms. Thus, we believe it would not be productive to critique features of individual studies.

Sampling

Chemistry education research articles. The review encompasses articles published in Chemistry Education Research and Practice (CERP) and the Journal of Chemical Education (JCE), the two established journals for publication of CER (Towns and Kraft, 2012). To select articles to include in this review, we began by retrieving articles published as CER within both journals in 2021: for CERP we retrieved all articles (excluding editorials), and for JCE we retrieved all articles with the “Chemical Education Research” keyword [this classification is reserved for manuscripts written and reviewed with the criteria for research manuscripts in the journal (JCE Author Guidelines, 2023)]. We continued retrieving articles in a reverse chronological manner (i.e., retrieving articles published in 2020, then retrieving articles published in 2019) throughout the analysis process (described below). We stopped the process of retrieving and analyzing articles after completing the analysis of 2018 articles because we noted saturation with the trends identified in the longitudinal analysis (along with no statistically significant differences between years, as reported below). After the initial stages of article selection, we further evaluated each article for inclusion in the review, removing perspective articles (i.e., articles contributing an opinion or perspective not grounded in some form of data collection and analysis). In total, 457 CER articles were included in the review, published between the years 2018 and 2021.
Discipline-based education research articles. We sought to contextualize the results related to our community by exploring how other DBER fields use frameworks. The initial challenge was to narrow the scope sufficiently to make this feasible; thus, we focused on a narrow timeframe to draw connections across DBER fields. We report on discipline-specific education research studies published in 2021, as opposed to journals that focus broadly on STEM education research. This approach affords general claims regarding norms that are specific to a particular DBER community, which would be challenging to elicit from a STEM education research journal. In cases where there were multiple discipline-specific education research journals for a field, the journals for our sample were selected based on journal impact factors, as well as input from the relevant community regarding journal quality (Towns and Kraft, 2012; Williams and Leatham, 2017). It is important to note there are limitations in using journal impact factors to measure research quality (Cameron, 2005; Rodriguez et al., 2017), but their prevalence in academia provides a useful metric for gauging the likelihood that a community would be familiar with and use a journal. Like the CER sample discussed above, our larger DBER sample focused on research articles.

The final sample involved articles collected from two journals for each community (except for physics). Across the fields, the journals selected were: Biochemistry and Molecular Biology Education and CBE-Life Sciences Education (biology); the International Journal of Engineering Education and the Journal of Engineering Education (engineering); Educational Studies in Mathematics and the Journal for Research in Mathematics Education (mathematics); and Physical Review Physics Education Research (physics). Including the CER articles published in 2021 in CERP and JCE (i.e., a subset of the CER sample discussed previously), our final sample of DBER articles was n = 595. Table 1 provides additional information regarding the DBER sample, including the journal author guidelines regarding framework use. For the remainder of the article, we use the journal abbreviations provided in Table 1.

Table 1 Discipline-based education research studies published in 2021 by discipline and journal
Biology (field N = 128)

Biochemistry and Molecular Biology Education (abbreviation: BAMBED; impact factor: 1.160; N = 60)
Journal description: “Biochemistry and Molecular Biology Education is an international journal aimed to enhance teacher preparation and student learning in Biochemistry, Molecular Biology, and related sciences such as Biophysics and Cell Biology, by promoting the world-wide dissemination of educational materials” (BAMBED, 2023)
Author guidelines: No discussion of frameworks or theory in journal author guidelines (BAMBED Author Guidelines, 2023)

CBE-Life Sciences Education (abbreviation: LSE; impact factor: 3.365; N = 68)
Journal description: “LSE is written by and to serve professionals engaged in biology teaching in all environments, including faculty at large research universities who teach but do not view teaching as their primary mission, as well as those whose teaching is the major focus of their careers, in primarily undergraduate institutions, museums and outreach programs, junior and community colleges, and K–12 schools” (LSE, 2023)
Author guidelines: “Authors of this type of article are encouraged to draw from the diverse social science theories, methods, and findings to inform their work, and to clearly define terms and approaches that may be unfamiliar to a biologist audience” (LSE Author Guidelines, 2023)

Chemistry (field N = 116)

Chemistry Education Research and Practice (abbreviation: CERP; impact factor: 1.902; N = 65)
Journal description: “Chemistry Education Research and Practice (CERP) is the journal for teachers, researchers and other practitioners at all levels of chemistry education. … Coverage includes the following: research, and reviews of research, in chemistry education; evaluations of effective innovative practice in the teaching of chemistry; in-depth analyses of issues of direct relevance to chemistry education” (CERP, 2023)
Author guidelines: “the studies reported should have all features of scholarship in chemistry education, that is they must be … theory-based … It is highly desirable that such contributions should be demonstrably based, wherever possible, on established educational theory and results” (CERP Author Guidelines, 2023)

Journal of Chemical Education (abbreviation: JCE; impact factor: 1.385; N = 51)
Journal description: “The Journal of Chemical Education publishes peer-reviewed articles and related information as a resource to those in the field of chemical education and to those institutions that serve them. The journal typically addresses chemical content, laboratory experiments, instructional methods, and pedagogies. JCE serves as a means of communication among people across the world who are interested in the teaching and learning of chemistry” (JCE, 2023)
Author guidelines: “Researchers use particular theoretical and methodological frameworks to inform their work. Discussion of these frameworks and why they are informative is very helpful to readers. Further, these frameworks should be used to guide the analysis of data and interpretation of results or findings” (JCE Author Guidelines, 2023)

Engineering (field N = 173)

International Journal of Engineering Education (abbreviation: IJEE; impact factor: 1.280; N = 131)
Journal description: “The International Journal of Engineering Education (IJEE) is an independent, peer-reviewed journal. It has been serving as an international archival forum of scholarly research related to engineering education for over thirty years” (IJEE, 2023)
Author guidelines: No discussion of frameworks or theory in journal author guidelines (IJEE Author Guidelines, 2023)

Journal of Engineering Education (abbreviation: JEE; impact factor: 1.569; N = 42)
Journal description: “The Journal of Engineering Education is more than a place to publish papers—it is a vital partner in the global community of stakeholders dedicated to advancing research in engineering education from pre-college to post-graduate professional education. The Journal of Engineering Education seeks to help define and shape a body of knowledge derived from scholarly research that leads to timely and significant improvements in engineering education worldwide. The Journal of Engineering Education serves to cultivate, disseminate, and archive scholarly research in engineering education” (JEE, 2023)
Author guidelines: “Aspects of validity, reliability, and trustworthiness are taken into consideration, including but not limited to instrument and protocol development, data collection, handling of the data and interpretation of theory” (JEE Author Guidelines, 2023)

Mathematics (field N = 76)

Educational Studies in Mathematics (abbreviation: ESM; impact factor: 2.402; N = 59)
Journal description: “Educational Studies in Mathematics presents new ideas and developments of major importance to those working in the field of mathematics education. It seeks to reflect both the variety of research concerns within this field and the range of methods used to study them. It deals with methodological, pedagogical/didactical, political and socio-cultural aspects of teaching and learning of mathematics, rather than with specific programmes for teaching mathematics. Within this range, Educational Studies in Mathematics is open to all research approaches” (ESM, 2023)
Author guidelines: No discussion of frameworks or theory in journal author guidelines (ESM Author Guidelines, 2023)

Journal for Research in Mathematics Education (abbreviation: JRME; impact factor: 3.676; N = 17)
Journal description: “JRME is a forum for disciplined inquiry into the teaching and learning of mathematics. The editors encourage submissions including: research reports, addressing important research questions and issues in mathematics education; brief reports of research; research commentaries on issues pertaining to mathematics education research” (JRME, 2023)
Author guidelines: “The study is guided by a theoretical framework that influences the study's design; its instrumentation, data collection, and data analysis; and the interpretation of its findings. The literature review connects to and supports the theoretical framework. Make it clear to the reader how the theoretical framework influenced decisions about the design and conduct of the study.” (JRME Author Guidelines, 2023)

Physics (field N = 102)

Physical Review Physics Education Research (abbreviation: PRPER; impact factor: 2.412; N = 102)
Journal description: “PRPER publishes detailed research articles, review articles, and replication studies. Descriptions of the development and use of new assessment tools, presentation of research techniques, and methodology comparisons or critiques are welcomed” (PRPER, 2023)
Author guidelines: No discussion of frameworks or theory in journal author guidelines (PRPER Author Guidelines, 2023)


Data analysis

Overview of analysis across both samples. As outlined in Fig. 1, the analysis was split into three categories: (1) analysis of the presence of frameworks, (2) tracking the use of a framework throughout each DBER article published in 2021, and (3) analysis of the commonly used frameworks for the 2018–2021 CER sample.
Fig. 1 Overview of analysis across both samples in this review.

As reflected in our research questions, the initial stage of our analysis was relevant to both the longitudinal CER sample and the 2021 DBER sample. Here, we were interested in identifying the presence of a framework and contextualizing emerging trends using the journal author guidelines (see Table 1). The coding scheme for this analysis is presented in Table 2. Coding for distinct framework sections identified whether authors included clear headings distinguished as describing theory or frameworks, including phrases such as “theoretical framework”, “conceptual framework”, “analytical framework”, and other similarly worded heading variations. When coding for discussed framework(s) used in the research, we identified whether the authors described a framework or theory as guiding the study, irrespective of placement within a distinct framework section, and we wrote memos identifying the specific frameworks the authors described. To receive this code, the authors needed to define or describe key constructs for the framework or theory such that they could be used to guide and inform the study. Articles that only provided a literature review of studies that used the framework, as opposed to reviewing the framework itself, did not receive this code. Lastly, we coded for the research methodology (qualitative, quantitative, mixed methods, or reviews).

Table 2 Coding scheme for shared analysis for: (A) longitudinal data analysis of CER articles and (B) 2021 DBER articles
Distinct framework section
Definition: The article includes an explicit section that describes the framework(s) and clearly indicates that the framework(s) were used to guide the study
Examples: “Theoretical Framework(s),” “Guiding Frameworks,” “Meaningful Learning”
Percent agreement: 93% (A)a, 100% (B)b; Cohen's kappa: 0.86 (A)a, 1.00 (B)b

Discussed framework(s) used in the researchc
Definition: The article specifically identifies the framework(s) guiding the study, providing definitions and/or descriptions of key constructs such that the framework/theory can be used to guide/inform the study
Examples: Pedagogical content knowledge, representational competence, social cognitive, 3D learning, meaningful learning
Percent agreement: 90% (A)b, 94% (B)b; Cohen's kappa: 0.78 (A)b, 0.88 (B)b

Research methodology
Definition: The research methodology used in the study
Examples: Qualitative, quantitative, mixed methods (includes studies where qualitative data is quantitatively transformed and studies that utilise both qualitative and quantitative analysis), reviews (both literature and methodological)
Percent agreement: 89% (A)b, 94% (B)b; Cohen's kappa: 0.83 (A)b, 0.88 (B)b

a Reliability measures represent the agreement between two pairs of coders. b Reliability measures represent the agreement between two independent coders. c The specific framework(s) discussed were noted within memos for articles receiving this code.


Longitudinal data analysis of CER articles. For the CER articles in the dataset (published between 2018–2021), the coding results were quantitatively analysed by year and research methodology for the codes distinct framework section and discussed framework(s) used in the research (see codes provided in Table 2). Data analysis was performed using Stata. The chi-squared test of homogeneity was used to identify differences between the frequency of distinct framework section and discussed framework(s) used in the research across the full corpus, as well as the frequency across year and research methodology for both codes. Statistical significance was set at α = 0.05. When chi-squared analyses indicated statistically significant deviations from the expected frequencies, the standardised residuals were calculated to determine which observed frequencies differed significantly from the expected frequencies and thus contributed substantially to the chi-squared value. Standardised residuals with magnitudes above the critical value z = 1.64 were considered significant at the α = 0.05 level.

In addition to the statistical analysis, we used the list of frameworks identified across the dataset to characterise the range of frameworks present in the longitudinal sample. Because frameworks are broadly defined and may be used in various ways, we sought to align the list of frameworks with the categories previously discussed in Bodner and Orgill (2007): constructivist frameworks, hermeneutic frameworks, and critical theory frameworks. Through this analysis, we identified that a fourth category of frameworks was also present, one that can be described using Magana's (2022) description of instructional design, pedagogical, and evaluation frameworks; this category broadly reflected frameworks related to the organization of chemistry knowledge and the articulation of target knowledge and skills. The entire research team discussed and reviewed the categorization of specific frameworks into the four categories.
Data analysis of 2021 DBER articles. Among the 2021 DBER articles that were characterized using the code discussed framework(s) used in the research, we analyzed how each article used the frameworks identified. Starting with the CER articles, this was accomplished by focusing on the different sections of each article (e.g., Methods, Findings, Conclusions) and initially coding as Yes or No based on whether the section referenced the framework and/or its related constructs. For example, we analyzed whether the Methods section described data collection or data analysis that were informed by the framework. It is important to note that there was variation in heading titles; we use the word “Findings” to reflect the portion of the article that describes the outcome of the analysis, which is often accompanied by raw or processed data, quotes, graphs, and other data visualizations. Similarly, we use the word “Conclusions” to reflect the end portion of the article where typically no new data are presented, and the research is contextualized using extant literature to discuss suggested future work and implications for practitioners. This initial approach toward tracking the framework throughout articles did not provide enough granularity to capture variation in how the frameworks were used. For this reason, we made a distinction between whether a framework was used to guide the data collection process and whether the framework was used to guide the analysis of data. Moreover, we focused on whether connections were made to the framework in the Findings or whether the framework played a larger role and was integrated across the section or structured the section (e.g., using the framework to organize themes in the Findings). This approach was then applied to the remainder of the DBER sample. Based on the specific ways articles used the frameworks, we developed inductive categories (Tier 1, Tier 2, Tier 3, and Tier 4) that are described in the Findings.
Reliability. Efforts were made to establish reliability for each stage of the analysis process. For the first round of analysis (analysing the presence of a framework), all researchers analysed the full set of JCE and CERP articles published in 2021. Two pairs of researchers (J-MGR & JEN; SAFQ & FMW) completed this analysis through a separate consensus process, after which the coding from both pairs was compared, yielding 93% agreement with a Cohen's kappa of 0.86. These values represent strong agreement, and all disagreements were discussed to clarify the definition of the distinct framework section code and to reach complete consensus. Regarding the remaining codes applied, reliability was established by two researchers (SAFQ & FMW) analysing a randomised 20% subset of the longitudinal sample of CER articles (stratified by year and journal). The percent agreement and Cohen's kappa for discussed framework(s) used in the research and research methodology are reported in Table 2, indicating moderate and strong agreement for each code, respectively (Watts and Finkenstaedt-Quinn, 2021). The researchers discussed and reached consensus on the disagreements in coding, which primarily arose for articles that were ambiguous about whether they employed a framework in the study. The researchers divided the remaining articles to analyse independently and noted any ambiguous articles for discussion to reach consensus.
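The two reliability measures reported throughout this section can be computed directly from two coders' labels. Below is a minimal pure-Python sketch; the coder labels in the example are hypothetical, not data from this study, and the published analysis may have used any standard statistics package to obtain the same quantities.

```python
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Proportion of items on which the two coders assign the same code."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement: kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(coder_a)
    p_o = percent_agreement(coder_a, coder_b)
    freq_a = Counter(coder_a)
    freq_b = Counter(coder_b)
    # Expected agreement if both coders assigned codes independently
    # at their observed marginal rates.
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n)
              for c in set(coder_a) | set(coder_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical presence/absence codes ("yes" = framework discussed):
a = ["yes", "yes", "no", "yes", "no", "yes", "yes", "no"]
b = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes"]
```

Because kappa corrects for the agreement expected by chance alone, it is lower than raw percent agreement whenever the coders' marginal code frequencies make chance agreement likely, which is why both values are reported in Table 2.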

This analysis was then extended beyond the CER context to confirm its application to the DBER sample. Here, reliability was established by two researchers (J-MGR & JEN) analysing a randomized subset consisting of 5% of the (non-CER) DBER articles (n = 30). This involved: (1) applying the codes provided in Table 2 and (2) assigning the previously discussed inductive categories used to characterize framework use across individual articles (Tiers 1–4). See Table 2 for the percent agreement and kappa values for the previously discussed codes. For the inductive categories related to tracking framework use across the DBER sample, the percent agreement was 88% and Cohen's kappa was 0.76, indicating moderate agreement (Watts and Finkenstaedt-Quinn, 2021). Disagreements were then discussed to reach consensus. Lastly, we sought feedback from members of the biology, engineering, mathematics, and physics education research communities to lend validity to our claims related to each field, analogous to member-checking (Carlson, 2010). We sent an excerpt of the manuscript to disciplinary experts, asking for comments on the claims and their impression of the results to provide insight regarding consistency with their experience in their community and their familiarity with the relevant journals, especially in terms of expectations regarding framework use. This process provided confirmation that our claims adequately captured community norms regarding framework use. Experts commented on preferences related to publication destination: for mathematics, a tendency to publish equally in both high-impact journals, JRME and ESM; for biology, a preference for publishing in LSE over BAMBED; for engineering, a preference for publishing in JEE over IJEE; and for physics, PRPER was confirmed as the primary discipline-specific journal.

Findings

Longitudinal trends of framework presence in CER

Of the CER articles published in JCE and CERP between 2018 and 2021, 282 contained a distinct framework section and 332 discussed a framework (or frameworks) guiding the study. Thus, while 62% of the research articles published between 2018–2021 contained a section describing the framework or frameworks guiding the study, 73% discussed a framework somewhere in the article (χ2 = 277.2676, p < 0.001). Analysis of variation in the presence of a framework across 2018–2021 showed no statistically significant difference by year, either for containing a distinct framework section or for discussing guiding frameworks (Table 3).
Table 3 Number of CER articles by year that (a) included a distinct framework section or (b) discussed framework(s) used in the research
2018 2019 2020 2021 Total (N = 457) χ2 p-Value
Distinct framework section 66 65 71 80 282 5.4288 0.143
Discussed framework(s) used in the research 80 77 84 91 332 5.0248 0.170
N Year 120 110 111 116
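The chi-squared statistic in Table 3 can be reproduced from the yearly counts by rebuilding the 2 × 4 contingency table (articles with vs. without a distinct framework section, by year). A pure-Python sketch of the Pearson chi-squared test of homogeneity, assuming no statistics library:

```python
# Counts from Table 3: articles with a distinct framework section per year,
# and the total number of research articles coded per year (2018-2021).
with_section = [66, 65, 71, 80]
n_per_year = [120, 110, 111, 116]
without_section = [n - w for n, w in zip(n_per_year, with_section)]

def chi_square_homogeneity(rows):
    """Pearson chi-squared statistic for an r x c contingency table."""
    col_totals = [sum(col) for col in zip(*rows)]
    row_totals = [sum(row) for row in rows]
    grand = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(rows):
        for j, observed in enumerate(row):
            # Expected count under homogeneity across years.
            expected = row_totals[i] * col_totals[j] / grand
            chi2 += (observed - expected) ** 2 / expected
    return chi2

chi2 = chi_square_homogeneity([with_section, without_section])
# chi2 comes to approximately 5.43, matching the 5.4288 in Table 3
# (df = 3; not significant at alpha = 0.05).
```

The same routine applied to the discussed framework(s) row (80, 77, 84, 91) yields the second statistic in the table.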


Furthermore, we examined trends regarding the inclusion of frameworks between different research methodologies used in the articles (Table 4). Here, there was a statistically significant difference from the expected distribution across methodologies for both the presence of a distinct framework section (χ2 = 30.6148, p < 0.001) and whether articles contained some discussion of the framework(s) guiding the study (χ2 = 38.2645, p < 0.001). For each set of analyses, the standardised residuals were calculated to determine which methodologies differed significantly from the expected frequencies. For the presence of a distinct framework section, the standardised residuals were significant for articles using only qualitative (z = ±4.663) and only quantitative methodologies (z = ±3.954), compared to z = ±0.302 and 2.049 for mixed methods and reviews, respectively. Similarly, the standardised residuals for the chi-square examining whether framework(s) were discussed were also significant for articles that used only qualitative (z = ±5.042) and only quantitative (z = ±4.887) methodologies, compared to z = ±0.070 and 1.784 for mixed methods and reviews, respectively. Examining the frequencies in conjunction with the significant residuals, it appears that more qualitative studies and fewer quantitative studies contain a distinct framework section or description of the framework(s) than would be expected.

Table 4 Number of CER articles by methodology that (a) included a distinct framework section or (b) discussed frameworks used in the research
Qualitative Quantitative Mixed methods Reviews Total (N = 457) χ2 p-Value
Distinct framework section 114 52 112 4 282 30.6148 <0.001
Discussed framework(s) used in the research 130 62 134 6 332 38.2645 <0.001
N Methodology 148 113 184 12
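The standardised residuals reported for the methodology analysis can be recovered from Table 4. The sketch below computes adjusted standardised residuals, (O − E)/√(E(1 − row total/N)(1 − column total/N)); this formulation reproduces the reported values (±4.663, ±3.954, ±0.302, 2.049) for the distinct framework section code, though the exact formula used in Stata is our inference.

```python
from math import sqrt

# Counts from Table 4: articles with a distinct framework section
# by methodology, and total articles coded per methodology.
methods = ["qualitative", "quantitative", "mixed", "reviews"]
with_section = [114, 52, 112, 4]
n_per_method = [148, 113, 184, 12]

N = sum(n_per_method)           # 457 articles in total
total_with = sum(with_section)  # 282 articles with a distinct section
p_with = total_with / N         # overall proportion with a section

residuals = {}
for m, observed, n in zip(methods, with_section, n_per_method):
    expected = n * p_with
    # Adjusted standardised residual for this cell of the table.
    denom = sqrt(expected * (1 - total_with / N) * (1 - n / N))
    residuals[m] = (observed - expected) / denom

# Cells with |z| > 1.64 deviate significantly at alpha = 0.05:
# qualitative is positive (more sections than expected under homogeneity),
# quantitative is negative (fewer than expected).
```

The signs make the direction of each deviation explicit, which is the basis for the interpretation that qualitative studies over-represent, and quantitative studies under-represent, distinct framework sections.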


Contextualizing framework presence within the broader DBER landscape

As we reflect on how our community uses frameworks in the research process, it is helpful to consider framework use in other DBER communities. In this section, we present trends related to the presence of frameworks in research articles published in 2021 across biology, chemistry, engineering, mathematics, and physics education research journals. Fig. 2 summarizes the trends related to whether articles discussed framework(s) used in the research, with some of these articles including a distinct framework section.
Fig. 2 Percentage of DBER articles by journal from 2021 that (a) discussed framework(s) used in the research and (b) included a distinct framework section. Note that articles with a distinct framework section are a specific subset of the articles that discussed framework(s) used in the research.

As shown in Fig. 2, there was large variation across the disciplinary journals in whether studies included a framework. Focusing on specific disciplines, in chemistry, 82% of the articles in CERP and 75% of the articles in JCE discussed a framework. This aligns with the authorship guidelines for both journals, which emphasize that authors should discuss framework(s) in relation to their work (CERP Author Guidelines, 2023; JCE Author Guidelines, 2023). Framework use is also high in the mathematics education research articles, where 90% of the articles published in ESM and 88% of the articles published in JRME discussed frameworks. The high percentage of framework use across mathematics education research articles is notable considering the journal guidelines. Specifically, the ESM author guidelines do not explicitly mention the use of frameworks in submitted manuscripts (ESM Author Guidelines, 2023), while the JRME author guidelines have a subsection on theoretical frameworks and how they should influence the study design (JRME Author Guidelines, 2023). The high presence of framework use across mathematics education articles despite this difference in authorship guidelines suggests the use of frameworks may be an implicit expectation and norm within the mathematics education research community of practice. As for engineering and biology education research, the discussion of frameworks is mixed across their respective disciplinary journals. In engineering education research, 69% of studies in JEE discussed a framework, but only 14% of studies in IJEE did so. Biology education research exhibits a similar difference across journals: 40% of the research articles in LSE discussed frameworks, compared with only 5% in BAMBED.
In both cases, this reflects the authorship guidelines related to framework use for the respective journals (BAMBED Author Guideline, 2023; IJEE Author Guidelines, 2023; JEE Author Guidelines, 2023; LSE Author Guidelines, 2023). Within PRPER, the primary discipline-specific journal for physics education research, 41% of articles involved a discussion of a framework. This is a similar percentage to that in LSE, even though the incorporation of a framework or theory is not discussed in the journal's author guidelines (PRPER Author Guidelines, 2023). While there is variation across discipline and journal for whether a framework or frameworks are described, a general trend across disciplines was that most studies discussing a framework also included a distinct framework section.

Use of frameworks throughout 2021 DBER articles

For the DBER articles published in 2021, we also characterized the ways frameworks were used throughout individual articles, narrowing our analysis to articles coded as discussed framework(s) used in the research. Table 5 shows the different ways frameworks were used across the articles; based on these uses, we grouped the articles into tiers. Note that each subsequent tier includes the defining features of the previous tier and introduces a new way the framework was used in an article [e.g., Tier 1 encompasses articles that used the framework to “Provide supporting literature to contextualize the study”; Tier 2 encompasses articles that used the framework to (1) “Provide supporting literature to contextualize the study” and (2) “Make connections to data in the Findings and Conclusions”, etc.]. These tiers are the result of inductive analysis and are not intended to reflect a hierarchy, nor are they intended to critique the quality of research conducted.
Table 5 Overview of how articles were grouped into different tiers based on the ways an article used a framework. Note that the tiers build on one another, so Tier 4 includes the elements of previous tiers
Framework was used to … (tiers in which this use appears)
Provide supporting literature to contextualize the study (Tiers 1–4)
Make connections to data in the Findings and Conclusions (Tiers 2–4)
Guide the data analysis (Tiers 3–4)
Organize the Findings (Tiers 3–4)
Guide the data collection (Tier 4)


In the case of Tier 1, articles primarily used a framework as supporting literature to situate and provide context for the work described. In addition to using the framework to contextualize the study, articles in Tier 2 drew connections between the framework and the data in the Findings and Conclusions. This was often observed as making a claim using the data presented and then commenting on how that claim relates to the framework. In the case of Tier 3, articles also used the framework and related constructs to analyze the data. Moreover, for articles in Tier 3, the organization and framing of the Findings were influenced by the framework, with the framework integrated throughout the Findings (as opposed to only being referenced). In the case of qualitative studies, this often stemmed from the framework's more explicit role in data analysis and thus generation of themes; quantitative studies typically emphasized the assumptions from the frameworks used to determine the constructs being measured (e.g., performance, identity, belonging). Lastly, studies in Tier 4 included the components of the previous tiers but also discussed how the framework(s) informed the data collection process; thus, articles in this tier discussed the framework and how it was relevant across the primary sections of the article (Data Collection, Data Analysis, Findings, Conclusions). Based on the patterns in how each group used frameworks, the primary feature that separates Tiers 1 and 2 from Tiers 3 and 4 is that studies in Tiers 3 and 4 incorporated the framework into the data analysis process (which then subsequently informed the organization of the Findings).
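Because the tiers are cumulative, assigning a tier amounts to checking which uses of the framework an article exhibits, from the broadest use to the most embedded. A small illustrative sketch of that logic (the function and flag names are ours, not part of the published coding scheme; "guides_analysis" here also covers the associated organization of the Findings):

```python
def assign_tier(contextualizes, connects_to_findings,
                guides_analysis, guides_collection):
    """Map an article's coded framework uses onto the cumulative tiers.

    Each flag corresponds to a row of Table 5; a higher tier requires
    all the uses of the tiers below it.
    """
    if not contextualizes:
        return None  # no framework use coded for this article
    if not connects_to_findings:
        return 1     # framework only situates the study
    if not guides_analysis:
        return 2     # framework also linked to data in Findings/Conclusions
    if not guides_collection:
        return 3     # framework also shaped analysis and Findings structure
    return 4         # framework additionally informed data collection

# e.g., a study whose framework shaped the analysis and Findings
# but not the data collection falls in Tier 3:
tier = assign_tier(True, True, True, False)
```

The early returns mirror the cumulative definition: an article cannot reach a tier without satisfying every lower tier's defining use.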

As suggested by Fig. 3, the previously discussed findings pertaining to community norms regarding the presence of frameworks generally correspond with how the articles use frameworks. For the biology and engineering journals, LSE and JEE author guidelines include the expectation for a framework, whereas BAMBED and IJEE author guidelines do not, and this appears to be reflected in the prevalence of articles in Tiers 3 and 4 for LSE and JEE, whereas proportionally more articles fall in Tiers 1 and 2 for BAMBED and IJEE. In contrast, the mathematics journals (JRME and ESM) both include a higher proportion of Tier 3 or Tier 4 articles versus Tier 1 or Tier 2 articles. This suggests that not only is the use of frameworks an established community norm (Fig. 2), but also that frameworks may be expected to be used in a particular way (i.e., as part of data analysis), despite just one of the journals (JRME) emphasizing frameworks in the author guidelines (it is also interesting to note that JRME does not have any Tier 1 articles). Similarly, PRPER author guidelines do not discuss framework use, though a larger proportion of articles use frameworks in alignment with Tiers 3 and 4. Notably, for the CER journals, both sets of author guidelines mention framework use, and both journals include a high proportion of articles using frameworks in alignment with Tiers 3 and 4. Thus, in the case of CER, the norms of the community appear to be well-aligned with the guidelines for authoring articles in both journals.


Fig. 3 Overview of results from tracking the use of a framework throughout the 2021 DBER articles. Note that the tiers build on one another, so Tier 4 includes the elements of previous tiers. This analysis includes the articles that were coded as discussed framework(s) used in the research.

Common frameworks used in CER

Our analysis also sought to broadly identify the common frameworks used across all CER articles included in the review. More than 150 distinct frameworks were identified across the four years of articles included in the analysis. Frameworks were commonly identified in articles via the term “theoretical framework” or broader terms such as “framework,” with terms such as “conceptual framework” or “analytical framework” appearing less frequently. Few frameworks were identified specifically as “methodological” frameworks (e.g., phenomenology, action research). A subset of frameworks was referred to across articles using a mixture of the terms “theoretical,” “conceptual,” or “analytical” frameworks (e.g., Johnstone's triangle, representational competence, mechanistic reasoning, pedagogical content knowledge). Generally, the range of frameworks aligned with the three broad categories previously described: constructivist frameworks, hermeneutic frameworks, and critical theory frameworks (Bodner and Orgill, 2007). A subset of frameworks did not align with these categories and instead reflected the organisation of chemistry knowledge or the quality of scientific practices, which are often the learning goals for chemistry courses; as such, these served as frameworks for investigating and interpreting how different instructional approaches support those goals. The most frequently used frameworks are presented in Table 6. Across the categories of frameworks, the majority identified within the dataset were related to constructivism and social constructivism, with many of the remaining frameworks related to hermeneutics or the organisation of chemistry knowledge. The fewest frameworks aligned with critical theories. Apart from the frameworks that reflect the organisation of chemistry knowledge, another trend reflected in the table is that most frameworks used are derived from literature outside of CER (e.g., frameworks are often drawn from domains such as education, psychology, or social science).
Table 6 Overview of common frameworks identified within the reviewed articles, sorted into the categories of constructivist frameworks, hermeneutic frameworks, critical theory frameworks, and frameworks related to the organization of chemistry knowledge. The constructivist frameworks aligned with constructivism, social constructivism, or learning and cognition. The authors recognize that different frameworks can and do span the four categories in the table, and the placement of frameworks within the categories are based on the authors’ understanding of which category the frameworks best align with based on how the frameworks were discussed within the articles included in the review. For each framework indicated, a single article is provided from the sample to exemplify how the framework is used
Constructivist frameworks
Frameworks aligned with constructivism: conceptions-based constructivism (e.g., conceptual change; Park et al., 2020); constructivism (broadly stated; e.g., Nakiboğlu and Nakiboğlu, 2019); fine-grained constructivism (e.g., the resource-based model of cognition; Bain et al., 2018); meaningful learning (Enneking et al., 2019); mental models (Bongers et al., 2019); pedagogical content knowledge (Akinyemi and Mavhunga, 2021); representations and representational competence (Ferreira and Lawrie, 2019); teacher-centred systemic reform model (Rupnow et al., 2020); teacher noticing (Schafer and Yezierski, 2021)
Frameworks aligned with social constructivism: argumentation (Shah et al., 2018); communities of practice (Santos-Díaz and Towns, 2021); situated learning (Reynders et al., 2019); social constructivism (broadly stated; e.g., Bancroft et al., 2020); sociocultural learning theory (Schmidt-McCormack et al., 2019)
Frameworks aligned with learning and cognition: Bloom's taxonomy (Lu et al., 2020); cognitive load theory (Karch et al., 2019); information processing (Galloway et al., 2019); metacognition (Heidbrink and Weinrich, 2021)
Hermeneutic frameworks: action research (Sansom et al., 2021); attitudes toward chemistry (Kousa et al., 2018); design-based research (Wu et al., 2021); phenomenology (Burrows et al., 2021); science and chemistry identity (Hosbein and Barbera, 2020); motivation/self-determination theory (Partanen, 2020); self-efficacy (Willson-Conrad and Grunert Kowalske, 2018)
Critical theory frameworks: belonging and identity (Fink et al., 2020); culturally responsive teaching (Reimer et al., 2021); cultural competence and social justice (Clark et al., 2020); gender performativity (Miller-Friedmann et al., 2018); science capital (Rüschenpöhler and Markic, 2020)
Organization of chemistry knowledge: chemical thinking framework (Caushi et al., 2021); Johnstone's triangle/the chemistry triplet (Irby et al., 2018); mechanistic reasoning (Macrie-Shuck and Talanquer, 2020); systems thinking (Talanquer, 2019); three-dimensional learning/A Framework for K-12 Science Education (Underwood et al., 2021)


Researchers generally operationalized the specific frameworks used in alignment with the goals of the broader categories to which the frameworks belong. For example, studies using conceptions-based or fine-grained constructivist theories sought to understand the nature of students’ content knowledge and conceptual learning (e.g., Bain et al., 2018; Park et al., 2020), and studies using the pedagogical content knowledge framework focused on how instructors enact or develop their knowledge for teaching (e.g., Akinyemi and Mavhunga, 2021). In both cases, the use of frameworks was consistent with the constructivist framework category and its focus on the nature, structure, and development of knowledge. In contrast, studies using phenomenology or other hermeneutic theories such as self-efficacy generally sought to characterise how students and teachers experience features of the learning process or environment (e.g., Willson-Conrad and Grunert Kowalske, 2018; Burrows et al., 2021), reflecting the hermeneutic framework category and its attention to deriving meaning from individual experiences. Lastly, studies leveraging critical theories such as belonging and identity sought to investigate power structures that relate to inequities within teaching and learning in chemistry (e.g., Fink et al., 2020), mirroring the critical theory framework category and its emphasis on social systems and the distribution of power. As discussed by Bodner and Orgill (2007), specific frameworks lend themselves to particular study designs and are relevant for specific research topics, and researchers should select frameworks that align with the scope and research aims.

Conclusions

Contextualized using findings across DBER disciplines, this review provides insight regarding the use of frameworks (broadly defined) in CER. Through our analysis of CER articles published between 2018–2021, we identified that 73% discussed a framework guiding the study, whereas 62% of the articles included a distinct framework section. Across the four years analyzed, the presence of a distinct framework section or the discussion of a guiding framework has not significantly changed. However, we did identify a significant difference across methodologies, where qualitative studies tend to include frameworks more than quantitative studies. For other DBER fields, we identified large variation in whether studies included a framework or a distinct framework section (Fig. 2), suggesting an inherent connection between the presence of frameworks and the norms of each discipline (which are often, but not always, reflected within journal-specific authorship guidelines). For example, incorporating frameworks into a study may be an implicit expectation and norm within the mathematics education research community of practice, as suggested by the high presence of frameworks across articles despite differing authorship guidelines (ESM vs. JRME). However, for the CER community of practice, the use of frameworks is a clear expectation stated by both CER journals, which was reflected in the incorporation of frameworks in the articles in our sample.

In addition to characterizing the presence of frameworks, our analysis also sought to identify how frameworks are used within individual articles published across DBER fields. For this analysis, we found that across all DBER disciplines, many studies presented a framework that was involved in the Data Analysis, informed the Findings, and was revisited in the Conclusions (Tiers 3 and 4), whereas a smaller subset of articles presented a framework that contextualized the study but was not discussed in relation to methodological decisions (Tiers 1 and 2). Most DBER journals had a higher proportion of articles categorized as Tier 4 than as Tier 1, but there was variation (both across disciplines and within disciplines’ respective journals). Lastly, building on the correspondence between author guidelines and whether a framework is present for each disciplinary community (Fig. 2), we observed a correspondence between author guidelines and the varied ways in which frameworks are used (Fig. 3).

Lastly, we examined the common frameworks used within CER studies, which we grouped into four broad categories: constructivist frameworks, hermeneutic frameworks, critical theory frameworks, and frameworks related to the organization of chemistry knowledge. Articles mostly employed frameworks under the umbrella of constructivist frameworks, which included the subcategories of frameworks related to both social constructivism as well as learning and cognition. Our findings suggest that fewer studies use critical theory frameworks and (to a lesser extent) hermeneutic frameworks.

Implications

Altogether, the findings from this study provide implications for the CER community regarding: (1) whether frameworks are necessary for all research articles; (2) how frameworks may be used in different ways; (3) implications for practice; and (4) potential future research directions. The goal of discussing these implications is not to prescriptively answer these questions or to impose a value judgement on the variation in framework use, but rather to promote scholarly discussion regarding the use of frameworks and their importance to the growth of the field.

We argue it is necessary as a community to clarify and revisit expectations regarding the need to incorporate pertinent frameworks into research articles. Although authorship guidelines and recommendations suggested by editorials provide a great starting point for researchers to ensure their research and writing meets the standards of the respective journals, in communities such as CER, researchers continue to document and shape the norms of the field. Thus, it may be necessary to periodically reflect on author guidelines and consider whether they accurately reflect the current state of the field. We advocate for the importance of rethinking framework expectations given 27% of CER articles published in the last four years did not include a discussion of a guiding framework, indicating that it was not deemed necessary as it went through the journal review process. Stated differently, as a shared product of our community (Ashwin, 2012), it was decided that in some cases the project design did not warrant the inclusion of a framework. The variation in alignment between the community norms of disciplines (as reflected by the peer-reviewed articles analyzed) and the expectations described in the author guidelines can be seen across our DBER sample. In fact, as the co-authors discussed our literature review, we reflected on whether we needed to incorporate a framework. As we brainstormed, one potential framework we considered was communities of practice (Wenger, 1998). Literature related to communities of practice certainly informed our project by providing language to discuss this project and support our rationale; however, communities of practice did not explicitly inform the research process (e.g., data analysis). 
Rather than presenting a framework that did not track throughout different sections of the manuscript or forcing a framework (e.g., coding for constructs outlined in communities of practice like boundary objects or brokers, which would not add to our analysis), we chose to discuss the communities of practice literature without including it as an explicit “framework.” Importantly, whether a study needs a framework depends on the scope and goals of the study. For example, some investigations may not involve readily “theorizable” ideas, such as a study focusing on collecting data to inform future iterations of a course (e.g., through surveying student preferences related to in-person, hybrid, or online instruction). In contrast, other study designs, such as investigating students’ motivation to pursue a graduate program in chemistry, would be challenging without a framework. Motivation is a broad concept and there are multiple theories available with assumptions and definitions that could guide an investigation. As part of this, it is necessary to clarify whether the community holds the same expectations with respect to framework use for qualitative versus quantitative studies.

Furthermore, it is important to consider the ways frameworks can be used to support a study. Our analysis demonstrates that frameworks in CER articles often serve either as a picture “frame” (i.e., Tiers 1 and 2; the framework surrounds and contextualizes the study, and while it may influence the “big picture” narrative, it could be removed without changing the specific findings of the research) or a house “frame” (i.e., Tiers 3 and 4; the framework provides a necessary structure and organizing scheme that is critical to the outcome of the research). Clarifying how frameworks inform and shape research is valuable both for increasing transparency about the research process and for improving the ways emerging researchers and practitioners can identify connections between research findings and their own contexts.

We identified constructivist frameworks as the most prevalent in our field, followed by similar levels of use for hermeneutic and organization of chemistry knowledge frameworks, with critical theory frameworks underutilized. This distribution of framework usage reflects the current focus and values of our discipline and carries broader implications for the direction of future research within CER. In particular, recent calls for an increased focus on investigating diversity, equity, inclusion, and justice in CER (Winfield et al., 2020; Ryu et al., 2021) suggest a need for more research using hermeneutic and critical frameworks, which are essential frameworks for social justice research (Metcalf et al., 2018). In general, the common usage of constructivist (e.g., personal and social) and organization of chemistry knowledge frameworks may indicate an unspoken post-positivist (or positivist) paradigm underlying many education research studies within the CER community, and it is necessary for researchers to engage in reflexivity regarding the traditional associations of positivist paradigms with privileging majority perspectives. We posit that research within the positivist paradigm can provide valuable contributions to our understanding of teaching and learning in chemistry, but that it is necessary to reflect on how research under these paradigms may lend itself to perpetuating structures that reinforce inequities. Hence, it is necessary for researchers to understand frameworks drawing from other paradigms and to recognize the value of approaching shared problems through the lenses of multiple paradigms and frameworks (Treagust et al., 2014).

In addition to contextualizing frameworks within their broader paradigms, it is important for researchers to reflect on the specific assumptions of frameworks and evaluate the validity of their application to chemistry. We noted that CER tends to borrow frameworks from other communities, as opposed to developing our own models that have explanatory and predictive power. Borrowing frameworks from other communities can be extremely productive, especially when investigating topics that align well with the goals of other communities (Bain et al., 2019); however, to continue to advance our community of practice, more work is needed in CER that uses data to develop chemistry-situated frameworks and theories, as well as work that critically evaluates the limitations of the frameworks we adapt from other research communities. Projects designed using a grounded theory approach are particularly well-suited to accomplish this goal (Strauss and Corbin, 1990; Charmaz, 2006).

In the context of the larger discussion related to bridging the research/practice divide, we echo the concerns related to accessibility and practicality (Rodriguez and Towns, 2019; Johnson, 2022; Sweeder et al., 2023). In terms of accessibility, we provided a general overview of how frameworks are used within CER and other DBER fields, focusing on how frameworks are used rather than sorting them into the categories of theoretical, conceptual, analytical, or methodological frameworks, terms that are used somewhat interchangeably across the CER community. Additionally, as a community, we must consider the extent to which a “framework requirement” could make our work less accessible by creating barriers to entry into our community of practice. We aim to make our review more practical by providing a resource for instructors regarding frameworks and how they can serve as helpful tools to structure classroom interventions. Specifically, Table 6 summarizes types of frameworks, organized thematically. As part of this, we draw the readers’ attention toward the variety of potentially useful instructional design, pedagogical, and evaluation frameworks. These frameworks can be used to shape assessment and instructional choices because they clearly articulate the organization of chemistry knowledge and outline target competencies related to developing expert reasoning.

Lastly, as discussed previously, our DBER communities are connected not just by shared content and skills, but by the frameworks we use as part of the research process. Although it was beyond the scope of the current review to compare the specific frameworks used across the DBER articles, we note this as a potential area for future inquiry and discussion. For example, examining how disciplines use different frameworks to describe the same construct (e.g., identity), or how disciplines use the same framework in different ways (e.g., knowledge-in-pieces), could open a dialogue across communities. Conversely, multiple communities using the same frameworks may indicate shared assumptions that can serve as bridges between communities and demonstrate the broad utility of a framework. Considering how different communities may operationalize theoretical constructs differently, through open conversations across communities, has the potential to advance our collective knowledge about STEM education more broadly.

Author contributions

We would like to acknowledge the equal contributions of each author to the present work. J-MGR and JEN joined with SAFQ and FMW after discussing shared ideas to conduct a literature review on theoretical frameworks in CER. SAFQ and FMW collected and analysed longitudinal data for CER articles, while J-MGR and JEN collected and analysed data for DBER articles published in 2021. To combine the reviews, we developed a shared coding scheme capturing whether articles discussed the framework(s) used in the research, included a distinct framework section, and described their methodology. The authors met regularly to discuss the revised/refined analysis, readjust the paper's scope, and provide manuscript feedback. J-MGR contributed to the DBER data analysis, reliability, writing, tables, and figures. JEN contributed to the DBER data analysis, writing, and reliability. SAFQ contributed to the CER longitudinal data analysis, reliability, writing, tables, and figures. FMW contributed to the CER longitudinal data analysis, reliability, writing, and tables.

Conflicts of interest

There are no conflicts to declare.

Acknowledgements

We would like to acknowledge representatives from the DBER communities in biology, engineering, mathematics, and physics for their input on this project. We would also like to acknowledge Cerys Rogers for helping with the initial coding of framework use in CER articles.

References

  1. Akinyemi O. S. and Mavhunga E., (2021), Linking pre-service teachers’ enacted topic specific pedagogical content knowledge to learner achievement in organic chemistry, Chem. Educ. Res. Pract., 22(2), 282–302 10.1039/D0RP00285B.
  2. Ashwin P., (2012), How often are theories developed through empirical research into higher education? Stud. High. Educ., 37(8), 941–955 DOI:10.1080/03075079.2011.557426.
  3. Bain K. and Towns M. H., (2016), A review of research on the teaching and learning of chemical kinetics, Chem. Educ. Res. Pract., 17(2), 246–262 10.1039/C5RP00176E.
  4. Bain K., Rodriguez J.-M. G. and Towns M. H., (2018), Zero-order chemical kinetics as a context to investigate student understanding of catalysts and half-life, J. Chem. Educ., 95(5), 716–725 DOI:10.1021/acs.jchemed.7b00974.
  5. Bain K., Rodriguez J.-M. G. and Towns M. H., (2019), Chemistry and mathematics: research and frameworks to explore student reasoning, J. Chem. Educ., 96(10), 2086–2096 DOI:10.1021/acs.jchemed.9b00523.
  6. Bakker A., Cai J. and Zenger L., (2021), Future themes of mathematics education research: an international survey before and during the pandemic, Educ. Stud. Math., 107(1), 1–24 DOI:10.1007/s10649-021-10049-w.
  7. BAMBED, (2023), About the journal. https://iubmb.onlinelibrary.wiley.com/journal/15393429.
  8. BAMBED Author Guidelines, (2023), Author guidelines. https://iubmb.onlinelibrary.wiley.com/hub/journal/15393429/author-guidelines.html.
  9. Bancroft S. F., Fowler S. R., Jalaeian M. and Patterson K., (2020), Leveling the field: flipped instruction as a tool for promoting equity in general chemistry, J. Chem. Educ., 97(1), 36–47 DOI:10.1021/acs.jchemed.9b00381.
  10. Bodner G. M. and Orgill M., (2007), Theoretical frameworks for research in chemistry/science education, Pearson.
  11. Bongers A., Northoff G. and Flynn A. B., (2019), Working with mental models to learn and visualize a new reaction mechanism, Chem. Educ. Res. Pract., 20(3), 554–569 10.1039/C9RP00060G.
  12. Burrows N. L., Ouellet J., Joji J. and Man J., (2021), Alternative assessment to lab reports: a phenomenology study of undergraduate biochemistry students’ perceptions of interview assessment, J. Chem. Educ., 98(5), 1518–1528 DOI:10.1021/acs.jchemed.1c00150.
  13. Bussey T. J., Lo S. M. and Rasmussen C., (2020), Theoretical frameworks for STEM education research, in Handbook of Research on STEM Education, Routledge, pp. 51–62.
  14. Cameron B. D., (2005), Trends in the usage of ISI bibliometric data: uses, abuses, and implications, Portal Libr. Acad., 5(1), 105–125 DOI:10.1353/pla.2005.0003.
  15. Carlson J. A., (2010), Avoiding traps in member checking, Qual. Rep., 15(5), 1102–1113.
  16. Caushi K., Sevian H. and Talanquer V., (2021), Exploring variation in ways of thinking about and acting to control a chemical reaction, J. Chem. Educ., 98(12), 3714–3722 DOI:10.1021/acs.jchemed.1c00902.
  17. CERP, (2023), Scope. https://www.rsc.org/journals-books-databases/about-journals/chemistry-education-research-practice/.
  18. CERP Author Guidelines, (2023), Journal specific guidelines. https://www.rsc.org/journals-books-databases/about-journals/chemistry-education-research-practice/#journal-specific-guidelines.
  19. Charmaz K., (2006), Constructing grounded theory: A practical guide through qualitative analysis, Sage Publications, Inc.
  20. Clark G. A., Humphries M. L., Perez J., Udoetuk S., Bhatt K. and Domingo J. P. et al., (2020), Urinalysis and prenatal health: evaluation of a simple experiment that connects organic functional groups to health equity, J. Chem. Educ., 97(1), 48–55 DOI:10.1021/acs.jchemed.9b00408.
  21. Denzin N. K. and Lincoln Y. S., (2005), Introduction: the discipline and practice of qualitative research, in Denzin N. K. and Lincoln Y. S. (ed.), The SAGE handbook of qualitative research, SAGE Publications, Ltd, pp. 1–32.
  22. Elby A., (2000), What students’ learning of representations tells us about constructivism. J. Math. Behav., 19(4), 481–502 DOI:10.1016/S0732-3123(01)00054-2.
  23. Enneking K. M., Breitenstein G. R., Coleman A. F., Reeves J. H., Wang Y. and Grove N. P., (2019), The evaluation of a hybrid, general chemistry laboratory curriculum: impact on students’ cognitive, affective, and psychomotor learning, J. Chem. Educ., 96(6), 1058–1067 DOI:10.1021/acs.jchemed.8b00637.
  24. Eppley H., Johnson A., Benatan E., Geselbracht M., Stewart J., Reisner B. et al., (2009), IONiC: a cyber-enabled community of practice for improving inorganic chemical education, J. Chem. Educ., 86(1), 123 DOI:10.1021/ed086p123.2.
  25. ESM, (2023), Aims and scope, https://www.springer.com/journal/10649.
  26. ESM Author Guidelines, (2023), Submission guidelines. https://www.springer.com/journal/10649/submission-guidelines.
  27. Ferreira J. E. V. and Lawrie G. A., (2019), Profiling the combinations of multiple representations used in large-class teaching: pathways to inclusive practices, Chem. Educ. Res. Pract., 20(4), 902–923 10.1039/C9RP00001A.
  28. Fink A., Frey R. F. and Solomon E. D., (2020), Belonging in general chemistry predicts first-year undergraduates’ performance and attrition. Chem. Educ. Res. Pract., 21(4), 1042–1062 10.1039/D0RP00053A.
  29. Galloway K. R., Leung M. W. and Flynn A. B., (2019), Patterns of reactions: a card sort task to investigate students’ organization of organic chemistry reactions, Chem. Educ. Res. Pract., 20(1), 30–52 10.1039/C8RP00120K.
  30. Gouvea J. S. and Simon M. R., (2018), Challenging cognitive construals: a dynamic alternative to stable misconceptions, CBE—Life Sci. Educ., 17(2), ar34 DOI:10.1187/cbe.17-10-0214.
  31. Haas C. A. F., El-Adawy S., Hancock E., Sayre E. C. and Savić M., (2022), Emerging mathematics education researchers’ conception of theory in education research, in Proceedings of the 24th Annual Conference on Research in Undergraduate Mathematics Education, Special Interest Group of the Mathematical Association of America (SIGMAA) for Research in Undergraduate Mathematics Education. pp. 1009–1014.
  32. Harrer B. W., Flood V. J. and Wittmann M. C., (2013), Productive resources in students’ ideas about energy: an alternative analysis of Watts’ original interview transcripts, Phys. Rev. Spec. Top. - Phys. Educ. Res., 9(2), 023101 DOI:10.1103/PhysRevSTPER.9.023101.
  33. Heidbrink A. and Weinrich M., (2021), Encouraging biochemistry students’ metacognition: reflecting on how another student might not carefully reflect, J. Chem. Educ., 98(9), 2765–2774 DOI:10.1021/acs.jchemed.1c00311.
  34. Henderson C., Connolly M., Dolan E. L., Finkelstein N., Franklin S. and Malcom S. et al., (2017), Towards the STEM DBER alliance: Why we need a discipline-based STEM education research community, Int. J. Res. Undergrad. Math. Educ., 3(2), 247–254 DOI:10.1007/s40753-017-0056-3.
  35. Hosbein K. N. and Barbera J., (2020), Development and evaluation of novel science and chemistry identity measures, Chem. Educ. Res. Pract., 21(3), 852–877 10.1039/C9RP00223E.
  36. Hunter K. H., Rodriguez J.-M. G. and Becker N. M., (2022), A review of research on the teaching and learning of chemical bonding, J. Chem. Educ., 99(7), 2451–2464 DOI:10.1021/acs.jchemed.2c00034.
  37. IJEE, (2023), Aims and scope. https://www.ijee.ie/Aims_and_scope_2015.html.
  38. IJEE Author Guidelines, (2023), Submission guidelines. https://www.ijee.ie/Submission%20Guidelines_2015.html.
  39. Irby S. M., Borda E. J. and Haupt J., (2018), Effects of implementing a hybrid wet lab and online module lab curriculum into a general chemistry course: impacts on student performance and engagement with the chemistry triplet, J. Chem. Educ., 95(2), 224–232 DOI:10.1021/acs.jchemed.7b00642.
  40. JCE, (2023), About the journal. https://pubs.acs.org/page/jceda8/about.html.
  41. JCE Author Guidelines, (2023), Content requirements for chemical education research manuscripts. http://pubs.acs.org/paragonplus/submission/jceda8/jceda8_cerguide.pdf.
  42. JEE, (2023), Overview. https://onlinelibrary.wiley.com/page/journal/21689830/homepage/productinformation.html.
  43. JEE Author Guidelines, (2023), Journal of Engineering Education (JEE) Author Guidelines. https://onlinelibrary-wiley-com.proxy.lib.uiowa.edu/page/journal/21689830/homepage/forauthors.html.
  44. Johnstone A. H., (1982), Macro- and micro-chemistry, Sch. Sci. Rev., 64, 377–379.
  45. Johnson R., (2022), Bridging the divide between chemistry educators and chemistry education researchers, J. Chem. Educ., 99(11), 3631–3632 DOI:10.1021/acs.jchemed.2c01035.
  46. JRME, (2023), About. https://pubs.nctm.org/view/journals/jrme/jrme-overview.xml?tab_body=about.
  47. JRME Author Guidelines, (2023), Characteristics of a high quality JRME manuscript. https://www.nctm.org/publications/write-review-referee/journals/Characteristics-of-a-High-Quality-JRME-Manuscript/.
  48. Karch J. M., García Valles J. C. and Sevian H., (2019), Looking into the black box: using gaze and pupillometric data to probe how cognitive load changes with mental tasks, J. Chem. Educ., 96(5), 830–840 DOI:10.1021/acs.jchemed.9b00014.
  49. Kousa P., Kavonius R. and Aksela M., (2018), Low-achieving students’ attitudes towards learning chemistry and chemistry teaching methods, Chem. Educ. Res. Pract., 19(2), 431–441 10.1039/C7RP00226B.
  50. Leontyev A., Houseknecht J. B., Maloney V., Muzyka J. L., Rossi R., Welder C. O. and Winfield L., (2020), OrganicERs: building a community of practice for organic chemistry instructors through workshops and web-based resources, J. Chem. Educ., 97(1), 106–111 DOI:10.1021/acs.jchemed.9b00104.
  51. Lincoln Y. S. and Guba E. G., (1985), Naturalistic inquiry, SAGE Publications.
  52. Lira M. and Gardner S. M., (2020), Leveraging multiple analytic frameworks to assess the stability of students’ knowledge in physiology, CBE—Life Sci. Educ., 18(3), 1–19.
  53. Lohmann J. and Froyd F., (2010), Chronological and ontological development of engineering education as a field of scientific inquiry.
  54. LSE, (2023), About CBE-Life Sciences Education. https://www.lifescied.org/about.
  55. LSE Author Guidelines, (2023), Information for authors. https://www.lifescied.org/info-for-authors.
  56. Lu H., Jiang Y. and Bi H., (2020), Development of a measurement instrument to assess students’ proficiency levels regarding galvanic cells, Chem. Educ. Res. Pract., 21(2), 655–667 10.1039/C9RP00230H.
  57. Luft J. A., Jeong S., Idsardi R. and Gardner G., (2022), Literature reviews, theoretical frameworks, and conceptual frameworks: an introduction for new biology education researchers, CBE—Life Sci. Educ., 21(3), rm33 DOI:10.1187/cbe.21-05-0134.
  58. Macrie-Shuck M. and Talanquer V., (2020), Exploring students’ explanations of energy transfer and transformation, J. Chem. Educ., 97(12), 4225–4234 DOI:10.1021/acs.jchemed.0c00984.
  59. Magana A. J., (2022), The role of frameworks in engineering education research, J. Eng. Educ., 111(1), 9–13 DOI:10.1002/jee.20443.
  60. Metcalf H., Russell D. and Hill C., (2018), Broadening the science of broadening participation in STEM through critical mixed methodologies and intersectionality frameworks, Am. Behav. Sci., 62(5), 580–599 DOI:10.1177/0002764218768872.
  61. Miller-Friedmann J., Childs A. and Hillier J., (2018), Approaching gender equity in academic chemistry: lessons learned from successful female chemists in the UK. Chem. Educ. Res. Pract., 19(1), 24–41 10.1039/C6RP00252H.
  62. Murphy K., Holme T. A., Zenisky A., Caruthers H. and Knaus K., (2012), Building the ACS exams anchoring Concept content map for undergraduate chemistry, J. Chem. Educ., 89(6), 715–720 DOI:10.1021/ed300049w.
  63. Nakiboğlu C. and Nakiboğlu N., (2019), Exploring prospective chemistry teachers’ perceptions of precipitation, conception of precipitation reactions and visualization of the sub-microscopic level of precipitation reactions, Chem. Educ. Res. Pract., 20(4), 873–889 10.1039/C9RP00109C.
  64. Park C., Lee C. Y. and Hong H.-G., (2020), Undergraduate students’ understanding of surface tension considering molecular area, J. Chem. Educ., 97(11), 3937–3947 DOI:10.1021/acs.jchemed.0c00447.
  65. Partanen L., (2020), How student-centred teaching in quantum chemistry affects students’ experiences of learning and motivation—A self-determination theory perspective, Chem. Educ. Res. Pract., 21(1), 79–94 10.1039/C9RP00036D.
  66. Passmore C., Schwarz C. V. and Mankowski J., (2016), Developing and using models, in Schwarz C. V., Passmore C. and Reiser B. J. (ed.), Helping students make sense of the world using next generation science and engineering practices, National Science Teachers Association, pp. 109–134 DOI:10.2505/9781938946042.
  67. Petscher Y., Schatschneider C. and Compton D. L., (2013), Applied quantitative analysis in education and the social sciences, Routledge DOI:10.4324/9780203108550.
  68. PRPER, (2023), About. https://journals.aps.org/prper/about.
  69. PRPER Author Guidelines, (2023), Information for authors. https://journals.aps.org/prper/authors.
  70. Reimer L. C., Denaro K., He W. and Link R. D., (2021), Getting students back on track: persistent effects of flipping accelerated organic chemistry on student achievement, study strategies, and perceptions of instruction, J. Chem. Educ., 98(4), 1088–1098 DOI:10.1021/acs.jchemed.0c00092.
  71. Reinholz D. L. and Andrews T. C., (2019), Breaking down silos working meeting: an approach to fostering cross-disciplinary STEM–DBER collaborations through working meetings.
  72. Reynders G., Suh E., Cole R. S. and Sansom R. L., (2019), Developing student process skills in a general chemistry laboratory, J. Chem. Educ., 96(10), 2109–2119 DOI:10.1021/acs.jchemed.9b00441.
  73. Rodriguez J.-M. G. and Towns M. H., (2019), Alternative use for the refined consensus model of pedagogical content knowledge: suggestions for contextualizing chemistry education research, J. Chem. Educ., 96(9), 1797–1803 DOI:10.1021/acs.jchemed.9b00415.
  74. Rodriguez J.-M. G. and Towns M. H., (2021), Analysis of biochemistry students’ graphical reasoning using misconceptions constructivism and fine-grained constructivism: why assumptions about the nature and structure of knowledge matter for research and teaching, Chem. Educ. Res. Pract., 22, 1020–1034 10.1039/D1RP00041A.
  75. Rodriguez J.-M. G., Bain K., Moon A., Mack M. R., DeKorver B. K. and Towns M. H., (2017), The citation index of chemistry education research in the Journal of Chemical Education from 2008 to 2016: a closer look at the impact factor. J. Chem. Educ., 94(5), 558–562 DOI:10.1021/acs.jchemed.7b00062.
  76. Rüschenpöhler L. and Markic S., (2020), Secondary school students’ acquisition of science capital in the field of chemistry, Chem. Educ. Res. Pract., 21(1), 220–236 10.1039/C9RP00127A.
  77. Rupnow R. L., LaDue N. D., James N. M. and Bergan-Roller H. E., (2020), A perturbed system: How tenured faculty responded to the COVID-19 shift to remote instruction, J. Chem. Educ., 97(9), 2397–2407 DOI:10.1021/acs.jchemed.0c00802.
  78. Rynearson A., (2015), Building a community of practice: Discipline-based educational research groups, in 2015 ASEE Annual Conference and Exposition Proceedings, American Society for Engineering Education. p. 26.298.1–26.298.8 DOI:10.18260/p.23637.
  79. Ryu M., Bano R. and Wu Q., (2021), Where does CER stand on diversity, equity, and inclusion? Insights from a literature review, J. Chem. Educ., 98(12), 3621–3632 DOI:10.1021/acs.jchemed.1c00613.
  80. Sansom R. L., Clinton-Lisell V. and Fischer L., (2021), Let students choose: examining the impact of open educational resources on performance in general chemistry, J. Chem. Educ., 98(3), 745–755 DOI:10.1021/acs.jchemed.0c00595.
  81. Santos-Díaz S. and Towns M. H., (2020), Chemistry outreach as a community of practice: investigating the relationship between student-facilitators’ experiences and boundary processes in a student-run organization, Chem. Educ. Res. Pract., 21(4), 1095–1109 10.1039/D0RP00106F.
  82. Santos-Díaz S. and Towns M. H., (2021), An all-female graduate student organization participating in chemistry outreach: a case study characterizing leadership in the community of practice, Chem. Educ. Res. Pract., 22(2), 532–553 10.1039/D0RP00222D.
  83. Schafer A. G. L. and Yezierski E. J., (2021), Investigating how assessment design guides high school chemistry teachers’ interpretation of student responses to a planned, formative assessment, J. Chem. Educ., 98(4), 1099–1111 DOI:10.1021/acs.jchemed.0c01264.
  84. Schmidt-McCormack J. A., Judge J. A., Spahr K., Yang E., Pugh R., Karlin A. et al., (2019), Analysis of the role of a writing-to-learn assignment in student understanding of organic acid–base concepts, Chem. Educ. Res. Pract., 20(2), 383–398 10.1039/C8RP00260F.
  85. Seery M. K., Kahveci A., Lawrie G. A. and Lewis S. E., (2019), Evaluating articles submitted for publication in Chemistry Education Research and Practice, Chem. Educ. Res. Pract., 20(2), 335–339 10.1039/C9RP90003A.
  86. Shah L., Rodriguez C. A., Bartoli M. and Rushton G. T., (2018), Analysing the impact of a discussion-oriented curriculum on first-year general chemistry students’ conceptions of relative acidity, Chem. Educ. Res. Pract., 19(2), 543–557 10.1039/C7RP00154A.
  87. Singer S. R., Nielson N. R. and Schweingruber H. A., (2012), Discipline-based education research: Understanding and improving learning in undergraduate science and engineering, National Academies Press DOI:10.17226/13362.
  88. Slater T. F., (2011), History of physics education research as a model for geoscience education research community progress, pp. ED13C–0833.
  89. Southerland S. A., Abrams E., Cummins C. L. and Anzelmo J., (2001), Understanding students’ explanations of biological phenomena: conceptual frameworks or p-prims? Sci. Educ., 85(4), 328–348 DOI:10.1002/sce.1013.
  90. Stains M., (2022), Keeping up-to-date with chemical education research standards, J. Chem. Educ., 99(6), 2213–2216 DOI:10.1021/acs.jchemed.2c00488.
  91. Strauss A. and Corbin J., (1990), Basics of qualitative research: Grounded theory procedures and techniques, Sage Publications, Inc.
  92. Sweeder R. D., Herrington D. G. and Crandell O. M., (2023), Chemistry education research at a crossroads: Where do we need to go now? J. Chem. Educ., 100(5), 1710–1715 DOI:10.1021/acs.jchemed.3c00091.
  93. Szteinberg G., Balicki S., Banks G., Clinchot M., Cullipher S., Huie R. et al., (2014), Collaborative professional development in chemistry education research: bridging the gap between research and practice, J. Chem. Educ., 91(9), 1401–1408 DOI:10.1021/ed5003042.
  94. Talanquer V., (2019), Some insights into assessing chemical systems thinking, J. Chem. Educ., 96(12), 2918–2925 DOI:10.1021/acs.jchemed.9b00218.
  95. Towns M. H., (2013), New guidelines for chemistry education research manuscripts and future directions of the field, J. Chem. Educ., 90(9), 1107–1108 DOI:10.1021/ed400476f.
  96. Towns M. H. and Kraft A., (2012), The 2010 rankings of chemical education and science education journals by faculty engaged in chemical education research, J. Chem. Educ., 89(1), 16–20 DOI:10.1021/ed100929g.
  97. Treagust D. F., Won M. and Duit R., (2014), Paradigms in science education research, in Lederman N. G. and Abell S. K. (ed.), Handbook of Research on Science Education, Routledge.
  98. Trujillo C. M. and Long T. M., (2018), Document co-citation analysis to enhance transdisciplinary research. Sci. Adv., 4(1), e1701130 DOI:10.1126/sciadv.1701130.
  99. Underwood S. M., Kararo A. T. and Gadia G., (2021), Investigating the impact of three-dimensional learning interventions on student understanding of structure–property relationships. Chem. Educ. Res. Pract., 22(2), 247–262 10.1039/D0RP00216J.
  100. Watts F. M. and Finkenstaedt-Quinn S. A., (2021), The current state of methods for establishing reliability in qualitative chemistry education research articles, Chem. Educ. Res. Pract., 22(3), 565–578 10.1039/D1RP00007A.
  101. Wenger E., (1998), Communities of practice: Learning, meaning, and identity, Cambridge University Press.
  102. Williams S. R. and Leatham K. R., (2017), Journal quality in mathematics education, J. Res. Math. Educ., 48(4), 369–396 DOI:10.5951/jresematheduc.48.4.0369.
  103. Willson-Conrad A. and Grunert Kowalske M., (2018), Using self-efficacy beliefs to understand how students in a general chemistry course approach the exam process, Chem. Educ. Res. Pract., 19(1), 265–275 10.1039/C7RP00073A.
  104. Winfield L. L., Wilson-Kennedy Z. S., Payton-Stewart F., Nielson J., Kimble-Hill A. C. and Arriaga E. A., (2020), Journal of Chemical Education Call for papers: special issue on diversity, equity, inclusion, and respect in chemistry education research and practice, J. Chem. Educ., 97(11), 3915–3918 DOI:10.1021/acs.jchemed.0c01300.

Footnote

Authors contributed equally to this work.

This journal is © The Royal Society of Chemistry 2023