What works? What's missing? An evaluation model for science curricula that analyses learning outcomes through five lenses

Mark A. R. Raycroft and Alison B. Flynn *
Department of Chemistry & Biomolecular Sciences, University of Ottawa, Ottawa, Ontario, Canada. E-mail: alison.flynn@uOttawa.ca

Received 19th July 2019 , Accepted 17th May 2020

First published on 22nd May 2020


Abstract

Science is rapidly changing with vast amounts of new information and technologies available. However, traditional instructional formats do not adequately prepare a diverse population of learners who need to evaluate and use knowledge, not simply memorize facts. Moreover, curricular change has been glacially slow. One starting goal for curricular change can be identifying the features of a current curriculum, including potential areas for improvement, but a model is needed to accomplish that goal. The vast majority of studies related to curricular change have been conducted in K-12 environments, with an increasing number in post-secondary environments. Herein, we describe a model for science curriculum evaluation that we designed by integrating a number of different approaches. That model evaluates the intended, enacted, and achieved components of the curriculum, anchored by analyzing learning outcomes through five lenses: (i) a scientific Framework reported by the US National Research Council, (ii) systems thinking, (iii) equity, diversity, and inclusion, (iv) professional skills, and (v) learning skills. No curriculum evaluation models to date have used the five learning outcomes lenses that we describe herein. As a proof of principle, we applied the evaluation model to one organic chemistry course, which revealed areas of strength and possible deficiencies. This model could be used to evaluate other science courses or programs. Possible deficiencies may be addressed in the course at hand or in other courses, or may not be deemed necessary or important to address; in any case, the evaluation generates areas for discussion and, ultimately, potential improvements to post-secondary science education.


Changes in science education are needed and fast(er)

Science is rapidly changing with vast amounts of new information and technologies available (National Academy of Sciences, National Academy of Engineering, 2010; Coppola, 2015). However, traditional instructional formats do not adequately prepare a diverse population of learners who need to evaluate and use knowledge, not simply memorize facts. Moreover, “often, there is a big difference between the ways students intuitively think (explain, make decisions) and the ways we want them to think in our disciplines” (Talanquer, 2018). Reforms are greatly needed in higher education in the sciences, including pedagogical approaches (Freeman et al., 2014; Gibbons et al., 2018; Stains et al., 2018), expectations for learning outcomes (Cooper et al., 2015), equity, diversity, and inclusion (American Chemical Society Committee on Chemists with Disabilities, 2001; Science and Engineering Leadership Initiative, 2019), professional skills (e.g., ethics, safety, teamwork, communication) (World Economic Forum: Centre for the New Economy and Society, 2018), and learning skills (e.g., metacognition, goal-setting) (World Economic Forum: Centre for the New Economy and Society, 2018). Ideally, a science curriculum will position students for future learning in their science programs, their careers, and as global citizens, whether or not they take more science courses.

Despite the need for and availability of new approaches, curricular change has been glacially slow (Cooper et al., 2015). That slow pace of change can be attributed to a complex series of factors, including institutional barriers and ingrained habits (Herrington et al., 2016; Gibbons et al., 2018). For educational change to occur and (more importantly) endure, many factors are needed, including dissatisfaction with the existing approach, incentives, support, self-efficacy beliefs, ability to change, and knowledge about how learning occurs (Posner et al., 1982; Dole and Sinatra, 1998; Cooper, 2013). A starting point is to create greater awareness of an existing curriculum's goals and effects through an educational evaluation, which may identify potential areas for change.

Our goal in this study was to create a cohesive model for evaluating science curricula that brought together curricular evaluation components in a new way and could be used across courses and through a degree or program. Herein, we describe that model and demonstrate how we have applied it in a proof of principle to the organic chemistry curriculum in one course at one institution, anticipating that it can be used throughout a full degree as part of a curriculum mapping and evaluation process in chemistry or science courses.

Benefits of curriculum evaluation

Curriculum evaluation offers an opportunity for data-supported reflection and discussions on the current curriculum, as a first step toward change. Given that the pace of change in higher education is often glacially slow and that whole-program reforms may not seem achievable for some institutions, a curriculum evaluation can give insight into aspects of a curriculum that could be iteratively improved. Ideally, coherent curricular design will be used to best support student learning (Wiggins and McTighe, 2005; Klymkowsky and Cooper, 2012).

The many parties involved in the curriculum have a stake in identifying the curriculum's features, strengths, and weaknesses. Students’ future studies and careers are deeply affected by prior courses; they dedicate substantial money and time to university courses. Educators (e.g., professors, staff, teaching assistants) both facilitate the development of and depend on the quality of students’ skills, which impact their future courses and research studies. Employers (e.g., from industry, government, other professions) depend on students’ skills. Our ability to address global issues rests on the abilities of our citizens and requires a high level of skill, much of which will be learned in post-secondary settings (United Nations, 2015; World Economic Forum, 2016; Anderson, 2017).

Landscape of curriculum evaluation

Across disciplines, curriculum evaluation can take a number of forms. In particular, curriculum mapping has been used as an instrument for evaluation (Lam and Tsui, 2016). Curriculum mapping is a process that identifies how an existing curriculum helps learners develop competences while identifying gaps and areas for improvement (Sumsion and Goodfellow, 2004; Ervin et al., 2013). The vast majority of these studies have been conducted in K-12 environments, with an increasing number being conducted in post-secondary environments (Robley et al., 2005; Plaza et al., 2007; Uchiyama and Radin, 2009; Spencer et al., 2012; Archambault and Masunaga, 2015; Wang, 2015; Arafeh, 2016; Lam and Tsui, 2016; Muntinga et al., 2016; Reid and Wilkes, 2016). In post-secondary education, medical education and engineering in particular have engaged in curricular mapping, likely attributable to stringent and changing accreditation standards in those fields (Edmondson, 1995; Harden, 2001; Hubball and Burt, 2007; Muntinga et al., 2016; Imansari and Sutadji, 2017).

A variety of frameworks exist for curriculum evaluation, including the intended-enacted-achieved model (also called declared-delivered-learned) (Biggs, 1999), Context-Inputs-Process-Products[-Outcomes] (CIPP and CIPPO) models (Stufflebeam, 1983; Imansari and Sutadji, 2017), the New World Kirkpatrick model of reaction-learning-behaviour-results (Praslova, 2010; Kirkpatrick and Kirkpatrick, 2016), Stake's Countenance model (Stake, 1967), and awareness-analysis-action (Jewett et al., 2018). None of these to date has analyzed a course's learning outcomes through the five lenses that we describe herein. In chemistry courses, a number of curricular modifications and reforms have been described; a few of these innovations have described approaches to evaluate the changes (additional details in Appendix 1) (Fetterman, 2001; Reingold, 2001; Clauss and Nelsen, 2009; Talanquer and Pollard, 2010, 2017; Vázquez et al., 2012; Cooper and Klymkowsky, 2013; Webb et al., 2013; Shultz and Gere, 2015; Williams et al., 2015; Underwood et al., 2016; Finkenstaedt-Quinn et al., 2017; Jewett et al., 2018). Herein, we describe a new curriculum evaluation model that can be used as a potential mechanism to identify areas and opportunities for curricular change in chemistry or other science courses and programs.

Evaluation model: intended, enacted, and achieved curriculum

We designed the evaluation model used in this study to shed light on the intended, enacted, and achieved curricular components (Fig. 1) (Biggs, 1999; National Research Council (NRC), 2004; Van den Akker et al., 2006). The intended curriculum involves the intended learning outcomes (LOs)—the knowledge, skills, and values that students are expected to demonstrate by the end of a module, course, program, or degree. The enacted curriculum includes all the course components that help students achieve the intended learning outcomes and could include lecture notes, videos, homework, class activities, projects, etc. Each LO could also be enacted explicitly or implicitly in the curriculum. The achieved curriculum refers to the observed and measured learning outcomes.
Fig. 1 Overall structure of the curriculum evaluation.

Intended curriculum: five lenses of analysis

Learning outcomes form the basis of this evaluation model as they connect directly to the knowledge, skills, and values students gain from their education experience rather than to the inputs such as concepts that are explained to students or material that is provided. In this evaluation model, we analyzed learning outcomes through five lenses (Fig. 1): (i) a scientific Framework, (ii) systems thinking, (iii) equity, diversity, and inclusion, (iv) professional skills (e.g., ethics, safety, teamwork, communications), and (v) learning skills. These lenses and the reasons for their selection are described in more detail below.
(i) The scientific Framework. The first cognitive lens is a Framework described by the United States’ National Research Council (National Research Council (NRC), 2012), which has been incorporated into the US’ Next Generation Science Standards (National Research Council (NRC), 2015). This lens was selected because of its direct link to learning specific chemistry concepts. The Framework contains three parts, or dimensions: science and engineering practices, cross-cutting concepts, and disciplinary core ideas (Table 1). The science practices (SPs) are ones that professional scientists engage in. Cross-cutting concepts (CCs) are those found across scientific disciplines, and disciplinary core ideas (CIs) are those identified as being central to a given discipline. Originally developed for K-12 science education, the Framework is also highly relevant to postsecondary science education contexts. Efforts are being made in K-12 settings to guide teachers in implementing and monitoring students’ engagement with the dimensions of the Framework (Trygstad et al., 2016; Kellamis and Yezierski, 2019). To date, the Framework has been used in postsecondary contexts to design new curricula and assessments (Laverty et al., 2016; Underwood et al., 2018; McGill et al., 2019), but not to evaluate existing curricula.
Table 1 Categories in each dimension of the Framework (National Research Council (NRC), 2012)
Science practices (SPs):
1. Asking questions & defining problems
2. Developing & using models
3. Planning & carrying out investigations
4. Analyzing & interpreting data
5. Using mathematics & computational thinking
6. Constructing explanations & designing solutions
7. Engaging in argument from evidence
8. Obtaining, evaluating, & communicating information

Cross-cutting concepts (CCs):
1. Patterns, similarity, & diversity
2. Cause & effect: mechanism & explanation
3. Scale, proportion, & quantity
4. Systems & system models
5. Energy & matter
6. Structure & function
7. Stability & change

Disciplinary core ideas (CIs):
1. Electrostatic and Bonding Interactions
2. Atomic/Molecular Structure and Properties
3. Energy
4. Change and Stability in Chemical Systems


There are a number of approaches to classifying learning outcomes, including Bloom's taxonomy, the Structure of Observed Learning Outcomes (SOLO) taxonomy, and frameworks from provincial/state or national bodies. We elected to analyze the learning outcomes herein using the Framework because of its specificity to science but we recognize that other researchers and educators may prefer another approach.

(ii) Systems thinking. A second cognitive approach to learning outcomes involves systems thinking (Matlin et al., 2015, 2016), which “emphasizes the interdependence of components of dynamic systems and their interactions with other systems, including societal and environmental systems” (Mahaffy et al., 2018). Calls for a systems thinking approach have been made in chemistry (Mahaffy et al., 2018) and can be readily extended to other sciences. This lens was selected because chemistry-specific approaches can involve teaching chemistry in a silo. Socio-scientific inquiry is a related framework (Sadler and Zeidler, 2005; Zeidler et al., 2009) that generally has three main stages: (a) raising an authentic research-based question about a socio-scientific issue, (b) carrying out inquiry on the question to enact change, and (c) finding a solution, which can include taking personal action (Amos and Levinson, 2019). Socio-scientific inquiry and systems thinking contain many related ideas, with systems thinking including additional ideas such as dynamic relationships and emergent properties, and socio-scientific inquiry including an action. Although “Systems & system models” is part of the cross-cutting concepts dimension of the Framework and other aspects of systems thinking appear in it, the Framework connects only implicitly (if at all) to systems outside of chemistry. Systems thinking is designed to help students build connections between chemistry and other concepts, disciplines, and global issues, ultimately to help students become citizens prepared to solve complex challenges (Mahaffy et al., 2018; Orgill et al., 2019; Pazicni and Flynn, 2019).

A method to analyse LOs with respect to systems thinking has yet to be proposed in chemistry, although assessment rubrics have been developed in other areas (Yurtseven, 2016; Grohs et al., 2018). We identified the key systems thinking skills that we intend students to gain (Appendix 2), by adapting and integrating existing proposals for systems thinking LOs (Richmond, 1993, 1997; Assaraf and Orion, 2005; Calhoun et al., 2008; Orgill et al., 2019).

(iii) Equity, diversity, and inclusion (EDI). In addition to cognitive aspects of education, purposeful design is needed to embed principles of equity, diversity, and inclusion in the curriculum. For example, women and persons with disabilities are persistently underrepresented in the science, technology, engineering, and mathematics (STEM) workforce (Duerstock and Shingledecker, 2014; Etkin, 2016; Inclusion and Diversity Committee of the Royal Society of Chemistry, 2018a, 2018b). Educational equity is a democratic ideal that requires substantial attention and results in better outcomes in educational, corporate, and research environments (National Research Council (NRC), 2012).

There has been substantial work in education focussing on how instructors and other experts should structure the learning environment, such as Universal Design for Learning (Center for Applied Special Technology (CAST), 2018). For example, UDL principles include designing for learners in the margins (which improves learning for many more learners) and providing multiple options for assessment. These initiatives provide stronger opportunities and access for learners; however, there is a gap in equipping students themselves with EDI-related knowledge, skills, and values (i.e., EDI-related learning outcomes).

Training programs and expectations are being developed for faculty and staff at institutions around the world to address issues in EDI (Etkin, 2016; Saini, 2017; Inclusion and Diversity Committee of the Royal Society of Chemistry, 2018a, 2018b; Natural Sciences and Engineering Research Council of Canada, 2020), including the UK's Advance HE Equality Charters such as Athena SWAN (Advance HE, 2020) and Canada's Dimensions EDI (Natural Sciences and Engineering Research Council of Canada, 2020).

Despite the recent focus on training programs for faculty and staff (see above), analogous training has yet to be explicitly identified and developed for students. Students in our courses are moving on to diverse professions and should be equipped with EDI-related skills upon arrival in the workforce. Accordingly, herein we developed learning outcomes related to EDI, which will need to be rigorously evaluated (and undoubtedly revised, adapted, and improved), listed below and in Appendix 3. Specifically, by the end of a unit, course, or program, students should be able to:

• Define equity, diversity, and inclusion.

• Identify situations in which EDI issues have impacted scientific research.

• Propose methods to address EDI issues in current and upcoming research.

• Differentiate between conscious and unconscious bias.

• Demonstrate inclusive language and behaviour in courses and assignments.

An EDI framework for the teaching and learning environments in undergraduate science education has also been developed based on recommendations from the Canadian Society for Chemistry's Working Group on Inclusion, Diversity, and Equity (Chemical Institute of Canada, 2018a), which brought together considerations from multiple sources (American Chemical Society Committee on Chemists with Disabilities, 2001; Science and Engineering Leadership Initiative (SELI), 2019), including the report of Canada's Truth and Reconciliation Commission (TRC, 2012). This EDI framework is described in detail in Appendix 3 and includes: course and laboratory design, learning environment, connections, selection processes, and Indigenous students.

(iv) Professional skills. Professional skills (e.g., ethics, safety, teamwork, communication) are sought by employers and needed for degrees beyond the undergraduate level. We addressed professional skills using the Ontario Council of Academic Vice-Presidents’ Undergraduate (and Graduate) Degree Level Expectations (Ontario Universities Council on Quality Assurance, 2019) as a framework, which includes three main areas that relate to professional skills: Communication skills (UDLE 4), Awareness of limits of knowledge [in a discipline] (UDLE 5), and Autonomy and professional capacity skills (UDLE 6). Other researchers and educators may use frameworks aligned with their own context (Schultz et al., 2019), such as the Australian Qualifications Framework, the national policy for regulated qualifications in Australian education and training (Australian Government Department of Education, 2013; Tertiary Education Quality and Standards Agency, 2015; Royal Australian Chemical Institute, 2020), or the framework used at the University of Leeds (Pugh, 2019).
(v) Learning skills. Learning skills (e.g., metacognition, goal-setting, growth mindset) constitute essential learning outcomes for 21st century learners (World Economic Forum: Centre for the New Economy and Society, 2018) and form the fifth lens through which a course or program's learning outcomes can be analysed. Equipping students for success as citizens requires expanding discipline-specific learning outcomes to include the skills they will need as lifelong learners. Once the intended learning outcomes have been identified, the course (or program, etc.) components can be designed, including the environment, instruction, activities, and assessments—these components constitute the enacted curriculum.

As a lens to investigate learning-to-learn skills, we applied the learning outcomes from the Growth & Goals module developed in our group (O’Connor et al., 2018, 2020). This module has learning outcomes focused on self-regulated learning, mindset, goal-setting, and metacognition (Flynn, 2019), listed in Appendix 4.

Enacted curriculum

The enacted curriculum—what actually occurs in the classes, laboratories, and other learning opportunities—should be aligned with the intended curriculum. However, this alignment may not actually occur in practice. Numerous artefacts can be used to establish the enacted curriculum, including aspects that are Taught, Practiced, and Assessed. The Taught components can include videos, pre-class notes, textbook, supplemental resources, etc. The Practiced components of a course include class notes, activities (which can be gathered through reporting or classroom observations (Smith et al., 2013)), problem sets, prior exams, pre-tests, etc. The Assessed component can include assignments, reports, and exams.

Achieved curriculum

We used the Four Level Kirkpatrick model to evaluate the achieved curriculum (Kirkpatrick, 1996; Kirkpatrick and Kirkpatrick, 2016); academic adaptations of this model have been proposed previously (Praslova, 2010). The levels address students’ reaction, learning, behaviour, and results (Appendix 5).

Methods

With the curriculum evaluation model and its Intended, Enacted, and Achieved components in hand, we conducted a proof of principle study to understand the type of findings this evaluation model could reveal.

For a proof of principle of the new evaluation model, we analysed the Intended and Enacted components of one organic chemistry course that uses a redesigned curriculum (Flynn and Ogilvie, 2015; Webber and Flynn, 2018); an overview can be found in Fig. 2. The achieved components of the curriculum are the focus of other studies (Flynn and Featherstone, 2017; Galloway et al., 2017, 2018; Webber and Flynn, 2018; Galloway et al., 2019; Lapierre and Flynn, 2020). Specifically, we asked the following research question (RQ): To what extent can the curriculum evaluation model provide a basis for evaluating an existing course's curriculum, identifying areas of strength and potential improvements? Below, we describe the course context, methods, and findings from the evaluation study.


Fig. 2 Overview of the uOttawa curriculum. Organic Chemistry II was the focus of this study.

Terms: curriculum, evaluation, assessment, and learning outcomes

Curriculum refers to the “interrelated set of plans and experiences which a student completes under the guidance of the school [or professor]” (Marsh and Willis, 1994, p. 5). In this work, evaluation—also called educational evaluation—refers to a higher-level gathering of information that measures a curriculum's effectiveness (Mertens, 2015). As a whole, this work may be described as evaluation research, which has been described as “a systematic process for (a) determining the strengths and weaknesses of programs, policies, organizations, technologies, persons, needs, or activities; (b) identifying ways to improve them; and (c) determining whether desired outcomes are achieved” (Bickman, 2005). Assessment refers to activities in a course by the students, professors, and/or teaching assistants and measures student performance, with or without feedback (Brookhart, 2001). Assessment can be formative (for the purpose of advancing learning and not for grades) or summative (for the purpose of assigning a grade and ideally to advance learning). Learning outcomes (LOs) refer to the knowledge, skills, and habits of mind (or values) that the learner demonstrates following a unit of learning (e.g., degree, program, course, or section/module) (Association of American Colleges, 2007; Kolomitro and Gee, 2015). Learning outcomes should be meaningfully aligned through a degree, program, course, and section/module.

Course context: Flipped pedagogy, curriculum based on mechanistic patterns and principles

This study focused on one course section in the second of four organic chemistry courses that students can take at the University of Ottawa (Fig. 2). The course is taught with a new curriculum (Flynn and Ogilvie, 2015) and the particular course section has a flipped format (Flynn, 2015; 2017). Greater detail is available in Appendix 6. The learning outcomes for that course can be found in Appendix 7, with the program level learning outcomes in Appendix 8.

Intended curriculum: Five lenses of learning outcomes

The learning outcomes used as the basis for this study are the ones used in the courses addressed (i.e., organic chemistry); they were identified during the curriculum redesign process, drawing on professors’ experiences in the field and desired improvements to the existing curriculum. See Appendix 7 for the full list of LOs.
(i) The scientific framework. For the first lens in this evaluation, we used the Three Dimensional Learning Assessment Protocol (3D-LAP). The 3D-LAP is an instrument that was developed to determine the extent to which assessment items are aligned with the Framework (Laverty et al., 2016). The 3D-LAP has been used to identify assessment items that contain or lack all three dimensions of the Framework, and to design 3D assessment items (Underwood et al., 2018). Although originally developed for assessment items, in this work we applied the 3D-LAP to curriculum design by analysing learning outcomes. Learning outcomes were analyzed with the 3D-LAP because they are ultimately the knowledge, skills, and values that learners gain following a learning period (e.g., course) and each component of a course should be aligned with those intended learning outcomes, including teaching, practice opportunities, and assessment.

We used the 3D-LAP to analyse each course-level LO for its capacity to elicit Science Practices (SPs), Cross-Cutting Concepts (CCs), and chemistry's disciplinary Core Ideas (CIs). The 3D-LAP describes criteria that can be used to decide if an SP, CC, or CI is present in a given item. For example, an LO that involves the Science Practice Developing and Using Models would satisfy the following criteria: (1) gives an event, observation, or phenomenon for the student to explain or make a prediction about; (2) gives a representation or asks the student to construct a representation; (3) asks the student to explain or make a prediction about the event, observation, or phenomenon; and (4) asks the student to provide the reasoning that links the representation to their explanation or prediction. The first of the two LOs shown in Table 2 satisfies three of the four criteria related to Developing and Using Models; the second satisfies all four. We repeated this analysis for the CCs and CIs. Once this analysis was complete, the data were summarized and are discussed in the Results and discussion section, below. One might decide that an SP/CC/CI is not present unless all of its constituent criteria are satisfied (e.g., that the first LO below did not possess an SP at all). In our analysis, however, we retained the criterion level of analysis for each SP, CC, or CI instead of deciding whether an SP/CC/CI was present or absent in an absolute sense, to better identify nuances in the curriculum. Moreover, the analysis was one-dimensional in that LOs were not analysed for the simultaneous presence of all three dimensions, to keep the analysis fine-grained. The analysis could be done in different ways if departments so chose; for example, one could analyze the integration of SPs with each other or between SPs and CCs (Carleton College, 2020).

Table 2 Examples of coding according to criteria for science practices
Science practices (SPs) and their associated criteria:

• Asking questions: (1) involves an observation, phenomenon, data, etc.; (2) student generates an empirically testable question.

• Developing & using models: (1) involves an observation or phenomenon; (2) has or requires a representation; (3) student explains or makes a prediction; (4) student provides reasoning.

• Analyzing & interpreting data: (1) involves a scientific question, claim, or hypothesis to be investigated; (2) has a representation of data to answer a question or test a claim or hypothesis; (3) involves analyzed data or asks the student to analyze data; (4) student interprets results or assesses the validity of a conclusion.

Example LOs (the first satisfies three of the four Developing & using models criteria; the second satisfies all four):

• Use the EPF to propose a reaction mechanism between a Nu and an aldehyde or ketone

• Explain how to differentiate between multiple structures using spectral data
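A minimal computational sketch of this criterion-level tallying is shown below. The dictionary structure, function names, and the specific criterion assignments are illustrative assumptions rather than the instrument used in this study, and only one science practice is shown.

```python
# Minimal sketch of criterion-level 3D-LAP coding for learning outcomes (LOs)
# and a summary of the percentage of LOs satisfying each criterion (cf. Fig. 3).
# Structures, names, and criterion assignments are illustrative assumptions.
from collections import defaultdict

# Abbreviated criteria for one science practice (SP); see Table 2 for the wording.
SP_CRITERIA = {
    "Developing & using models": [
        "involves an observation or phenomenon",
        "has or requires a representation",
        "student explains or makes a prediction",
        "student provides reasoning",
    ],
    # ...remaining SPs, cross-cutting concepts, and core ideas would be added here
}

# Coding of two example LOs: indices of the criteria judged to be satisfied.
lo_codes = {
    "Use the EPF to propose a mechanism (Nu + aldehyde or ketone)":
        {"Developing & using models": {0, 1, 2}},     # three of the four criteria
    "Explain how to differentiate structures using spectral data":
        {"Developing & using models": {0, 1, 2, 3}},  # all four criteria
}

def summarize(codes, criteria):
    """Return, per practice, the percentage of LOs satisfying each criterion."""
    counts = defaultdict(lambda: defaultdict(int))
    for practices in codes.values():
        for practice, satisfied in practices.items():
            for idx in satisfied:
                counts[practice][idx] += 1
    n_los = len(codes)
    return {
        practice: {criteria[practice][i]: round(100 * c / n_los)
                   for i, c in idx_counts.items()}
        for practice, idx_counts in counts.items()
    }

print(summarize(lo_codes, SP_CRITERIA))
```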


(ii) Systems thinking. The second lens addresses systems thinking (Matlin et al., 2016; Mahaffy et al., 2018). The systems thinking definition and proposed systems thinking skills (Assaraf and Orion, 2005) were used to decide if any of the LOs possessed aspects of systems thinking; none was identified.
(iii) EDI learning outcomes. The curriculum's LOs were analyzed to identify alignment with the EDI LOs that we developed (Appendix 3). Although no course level LOs addressed aspects of the EDI framework, aspects of the chemistry program-level learning outcomes did and will be described in the Results and discussion section.
(iv) Professional skills. We investigated the LOs for the presence of alignment or overlap with professional skills in the curriculum using the Undergraduate (and Graduate) Degree Level Expectations Framework (Ontario Universities Council on Quality Assurance, 2019).
(v) Learning skills. We looked for the presence of LOs related to learning skills using the Growth & Goals module's learning outcomes (Appendix 4).

Enacted curriculum: taught, practiced, and assessed

To determine the nature of the enacted curriculum, we categorized each LO as Taught, Practiced, and/or Assessed according to its representation throughout the course. The Taught-Practiced-Assessed scaffold was subdivided further based on the type of activity in the course. Because the course was taught in a flipped format (Flynn, 2015), the Taught section included videos, pre-class notes, and supplemental resources (such as textbook readings or website activities). The Practiced section included class notes and activities, problem sets, prior exams, and pre-tests. The Assessed section included assignments, pre-tests, midterm 1, midterm 2, and the final exam. We recorded each instance that an intended LO was Taught, Practiced, and/or Assessed (Table 3).
Table 3 Examples of coding for instances when LOs were taught, practiced, and assessed, explicitly or implicitly
Course components: Taught (videos, notes, supplemental resources); Practiced (class, problem sets, prior exams, pre-tests); Assessed (assignments, midterm 1, midterm 2, final).

Coding scheme: ✓ = LO is addressed explicitly; [✓] = LO is addressed implicitly.

Example learning outcomes for Module 4, with total explicit and implicit instances across all components:

• Use the EPF to propose a reaction mechanism between a Nu and an aldehyde or ketone: 5 explicit, 0 implicit

• Apply the principle of microscopic reversibility to predict the mechanism of a reverse reaction: 2 explicit, 1 implicit


The presence of a learning outcome was distinguished further as being either implicit or explicit. For example, one LO states: “Gauge leaving group ability using the pKa value of the leaving group's conjugate acid.” In an exam question that implicitly required that LO, the student was asked to decide which leaving group was best. In an analogous question that explicitly required the LO, the student was directly instructed to use pKa values to answer the question. We also explored the Intended and Enacted data to determine whether alignment was present, by comparing the relative prevalence of the Framework dimensions in the LOs with their prevalence in the exam questions. Once this analysis was complete, the findings were summarized and are discussed in the Results and discussion section, below.
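The following minimal sketch shows one way such instances could be recorded and totalled per LO, mirroring the structure of Table 3; the data structures, names, and example records are illustrative assumptions, not the coding workflow used in this study.

```python
# Illustrative tally of where each LO appears in the enacted curriculum;
# "explicit"/"implicit" mirror the checkmark / bracketed-checkmark coding of Table 3.
from collections import Counter

COMPONENTS = {
    "Taught": ["videos", "notes", "supplemental resources"],
    "Practiced": ["class", "problem sets", "prior exams", "pre-tests"],
    "Assessed": ["assignments", "midterm 1", "midterm 2", "final exam"],
}
ALL_LOCATIONS = {loc for locs in COMPONENTS.values() for loc in locs}

# One record per instance found in the course artefacts (entries invented here).
observations = [
    ("Use the EPF to propose a mechanism (Nu + aldehyde or ketone)", "videos", "explicit"),
    ("Use the EPF to propose a mechanism (Nu + aldehyde or ketone)", "final exam", "explicit"),
    ("Apply microscopic reversibility to predict a reverse mechanism", "class", "explicit"),
    ("Apply microscopic reversibility to predict a reverse mechanism", "final exam", "implicit"),
]

def tally(records):
    """Count explicit and implicit instances of each LO, per location and in total."""
    per_lo = {}
    for lo, location, mode in records:
        assert location in ALL_LOCATIONS, f"unknown course component: {location}"
        entry = per_lo.setdefault(lo, {"by_location": Counter(), "totals": Counter()})
        entry["by_location"][(location, mode)] += 1
        entry["totals"][mode] += 1
    return per_lo

for lo, entry in tally(observations).items():
    print(lo, dict(entry["totals"]))
```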

Achieved curriculum

In this study, we did not specifically investigate the achieved components of the curriculum as these have been the focus of other studies (Flynn and Featherstone, 2017; Galloway et al., 2017, 2018; Webber and Flynn, 2018; Galloway et al., 2019; Lapierre and Flynn, 2020). Some of those findings are summarized in the Results and discussion section.

Reliability

To address reliability in the implementation of this model, a second rater conducted a three-stage analysis, which resulted in high Krippendorff alpha values and percent agreement (Krippendorff, 2011); additional details can be found in Appendix 9.
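For readers who wish to run a similar reliability check, the sketch below shows one plausible calculation of percent agreement and Krippendorff's alpha on binary criterion codes from two raters; it assumes the third-party krippendorff Python package and uses invented ratings rather than the study's data.

```python
# Sketch of an inter-rater reliability check on binary criterion codes,
# assuming the third-party `krippendorff` package (pip install krippendorff).
# The ratings below are invented for illustration.
import numpy as np
import krippendorff

# 1 = criterion judged satisfied, 0 = not satisfied, for the same decisions by two raters.
rater_1 = np.array([1, 0, 1, 1, 0, 1, 1, 0, 0, 1])
rater_2 = np.array([1, 0, 1, 1, 0, 1, 0, 0, 0, 1])

percent_agreement = np.mean(rater_1 == rater_2) * 100
alpha = krippendorff.alpha(
    reliability_data=np.vstack([rater_1, rater_2]),  # shape: (raters, decisions)
    level_of_measurement="nominal",
)

print(f"Percent agreement: {percent_agreement:.1f}%")
print(f"Krippendorff's alpha: {alpha:.2f}")
```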

Results and discussion

Intended curriculum: five lenses of learning outcomes

(i) The scientific framework. We analysed the learning outcomes (LOs) to determine which aspects of the Framework were present: many aspects were dominant; others were not identified at all (Fig. 3). Among the science practices, Developing & using models and Constructing explanations & engaging in arguments from evidence were the most highly represented in Modules 1, 2, and 4. The fourth criterion of Developing & using models, in which the student must provide reasoning, was much less well represented. Asking questions, Planning investigations, Analyzing & interpreting data, Using mathematical & computational thinking, and Evaluating information had little to no representation in these modules. Few of the SPs had all subcriteria satisfied.
Fig. 3 Percentage of learning outcomes for Organic Chemistry II's modules associated with each Science Practice criterion as defined by the 3D-LAP.

Module 3—spectroscopic analysis—strongly elicited many SPs: Developing & using models, Analyzing & interpreting data, Using mathematical & computational thinking, and Constructing explanations & engaging in arguments from evidence. The SPs Asking questions, Planning investigations, and Evaluating information were not elicited in this module. The module did not have any learning outcomes explicitly related to the Evaluating information SP. Although Module 3 provided frequent opportunities to interpret data, the Evaluating information SP, by definition, involves having the student make a conclusion about the validity of an excerpt from a conversation, article, etc. and provide reasoning for their conclusion.

Among the cross-cutting concepts, Mechanism & explanation (cause & effect) and Structure & function were the most highly represented (Fig. 4). Proportion & quantity is a CC elicited in Module 3 (spectroscopy) but not to a significant degree in the other three modules. Patterns, Scale, Systems & system models, Energy & matter, and Stability & change had little to no representation in the learning outcomes in these modules. Although the curriculum is focused on mechanistic patterns, the course's learning outcomes at the time of analysis did not explicitly include LOs that had students find and identify patterns in mechanisms. Since then, related LOs have been piloted and we are studying how students make connections across various reaction types (Galloway et al., 2018, 2019; Lapierre and Flynn, 2020). Similarly, concepts of stability and change appear in classroom activities and assessments but not explicitly in the learning outcomes in the analysed version of the curriculum. These concepts arose through questions about equilibrium, the relative stability of a series of compounds, etc.


Fig. 4 Percentage of learning outcomes for Organic Chemistry II's modules explicitly associated with each Cross-Cutting Concept criterion (no sub-criteria) as defined by the 3D-LAP.

All of the core ideas were represented in the modules (Fig. 5); Electrostatic & bonding interactions and Structure & properties were the most highly represented. Not surprisingly, Module 3 (Spectroscopy) drew heavily on structure and property ideas, with fewer of the other core ideas. Energy and Change & stability were the least represented, pointing to a potential to thread these ideas to a greater degree throughout the modules.


Fig. 5 Percentage of learning outcomes for Organic Chemistry II's modules associated with each Disciplinary Core Idea criterion (no sub-criteria) as defined by the 3D-LAP. N = 112.

Given that some SPs, CCs, and CIs are prevalent in the learning outcomes while others are not, a question arises: What is the appropriate proportion of each part of the Framework? An appropriate or “optimal” proportion (if such a thing could exist) would depend on the students’ next courses, their chosen career, etc., which are all highly variable. The degree of representation of the SPs, CCs, and CIs in a given course or other unit of the curriculum will depend on the particular subject matter as well as the curricular goals decided upon by the department as a whole. The preceding analyses illuminated a variety of areas for potential curricular modifications, although we do not recommend seeking an optimal or fixed proportion. We can reflect on the decidedly underrepresented SPs, CCs, and CIs and consider how the LOs may be modified to elicit them. In some cases, a specific criterion associated with a practice, concept, or idea stands out as important but lacking. In other cases, some aspects may be over-represented. The activity of thoroughly analysing the existing LOs has provided a clear direction toward improving the outcomes so that they are more complete and meaningful to the students.

The LOs that we would like to add or modify tend to have connections to more than one missing component. For example, we could better represent the concepts and ideas associated with Energy and Change & stability. These concepts could be more consistently threaded throughout the course, so we are piloting ways to integrate free energy diagrams in the course (throughout the taught, practiced, and assessed components). Asking questions is a practice we could incorporate to a greater degree. LOs and activities to elicit this practice could involve students developing mechanistic questions surrounding how reactants may transform into products. Such questions could connect directly to Planning investigations; for example, students could be prompted to decide on key experiments that would test a given hypothesis. Additionally, thermodynamic and kinetic data for reactions, their interpretation, and their comparison with other reactions could provide students with on-going practice connecting energetic parameters with the ideas of kinetic and thermodynamic stability in reactions.

(ii) Systems thinking. The curriculum did not have any learning outcomes directly associated with systems thinking, although there were consistent links with relevant contexts (e.g., biochemistry, materials, pharmaceuticals) and program-level learning outcomes related to contemporary societal and global issues. While these links do not constitute systems thinking, they provide an anchor upon which systems thinking learning outcomes could be developed. To be considered systems thinking, explicit learning outcomes and associated assessments would need to be designed to incorporate the interrelated and dynamic nature of various systems, as well as asking students to make connections between systems and to predict and evaluate the effects of perturbing systems. Learning outcomes for systems thinking are proposed in Appendix 2, having been adapted and integrated from a variety of literature sources (Richmond, 1993, 1997; Assaraf and Orion, 2005; Calhoun et al., 2008; Orgill et al., 2019; Pazicni and Flynn, 2019).
(iii) Equity, diversity, and inclusion. LOs related to equity, diversity, and inclusion were not represented at the course or program levels; however, the tools used in the course (e.g., OrgChem101.com and course videos) were created in adherence to accessibility standards (AODA, 2014). Given the importance for citizens to develop EDI-related skills, we recommend adopting or adapting the ones provided in Appendix 3.
(iv) Professional skills. Professional skills were not represented in the course's learning outcomes at the time of the data collection and analysis. Program-level learning outcomes were therefore analysed for the presence of these skills, specifically those described by Ontario's Degree Level Learning Outcomes that require communication skills (UDLE 4), awareness of limits of knowledge [in a discipline] (UDLE 5), and autonomy and professional capacity skills (UDLE 6). Professional skills were represented at the program level (Appendix 8) and addressed in other courses. For example, program-level learning outcomes include: present information in a clear and organized manner by a variety of means (e.g., oral presentation, poster, video) (aligned with UDLE 4); self-assess and self-direct one's own learning (i.e., possess metacognitive abilities) (aligned with UDLE 5); conduct oneself responsibly in a laboratory and course (aligned with UDLE 6); and plan and organize effectively, manage time and assets, respect deadlines (aligned with UDLE 6). The decision whether to leave an organic chemistry course as is or to include a greater number of LOs directly related to professional skills should be made in context with other courses—if these professional skills are being well-developed (i.e., demonstrated by students) through other courses, then there is no need to continue repeating the teaching in the organic chemistry courses. On the other hand, if students are not demonstrating sufficient proficiency in the desired professional skills and they do not have the opportunity to do so through other courses or media, then the organic chemistry courses provide a possible means through which students can gain the desired skills.
(v) Learning skills. LOs related to learning skills were represented at the program level (e.g., self-assess and self-direct learning, plan and organize effectively) and have since been piloted through a Growth & Goals module that was integrated in one section of the course itself (Flynn, 2018; O’Connor et al., 2020), although they were not present in the course during the year of analysis. The Growth & Goals module includes aspects of metacognition (aligned with the course's intended learning outcomes), growth mindset (including resilience in the face of failure) (Dweck, 2006; 2015), goal-setting (Conzemius and O’Neill, 2006; Dallas, 2015), and self-regulated learning (Zimmerman, 1990, 2002).

Having identified strengths and gaps in the curriculum with respect to the five lenses of learning outcomes, we turned to an analysis of the Enacted component of the curriculum.

Enacted curriculum: taught, practiced, and assessed

A comparison between taught, practiced, and assessed course components immediately identifies whether a course is balanced in terms of the inputs (taught), the opportunities for students to develop knowledge and skills with feedback (practiced), and the summative assessments (assessed). For example, a course could have extensive lectures on a topic but few opportunities for students to practice and receive feedback.

From a broad perspective, the curriculum addressed the intended LOs in all three components and in a balanced way. Within the taught portion of the course, the LOs were well represented in the videos and notes, and students did not need to consult supplemental resources for a critical portion of the LOs (Fig. 6). Analysis of the practiced component revealed that students had a variety of opportunities to engage with the LOs. Lastly, analysis of the assessed component showed that each subsequent assessment item covered a greater percentage of the LOs and that the implicit integration of LOs into the assessment items increased proportionally as students gained more experience with and exposure to the LOs.


Fig. 6 Explicit and implicit presence of learning outcomes in the taught, practiced, and assessed components of the course for Module 4. N = 112.

Ideally, a course's goals and priorities are aligned with classroom activities, including assessment. We explored the Intended and Enacted data to determine whether that alignment was present (Fig. 7). Overall, the Intended and Enacted curricula were consistent, indicating that the course was enacted in line with its intended learning outcomes.


Fig. 7 Comparing the course's intended LOs with assessment aligned with LOs (enacted curriculum) on the final exam. To what extent are the intended and enacted curricula connected?

A coarse-grained analysis can extract the key information

For this study, we conducted an analysis using very fine-grained LOs, hence the large number of LOs for each module. However, such a large set of LOs quickly becomes unwieldy from the learning, teaching, and evaluation perspectives. We used fine-grained LOs to ensure that we could capture detailed information from the course, knowing it would be easier to collapse the analysis into broad categories if we later chose to do so, but that the opposite would be impossible without a new analysis.

To determine whether a coarse-grained analysis could be reliably used in subsequent analyses (i.e., in other courses or settings), we collapsed each module's fine-grained LOs into coarse-grained ones and repeated the 3D-LAP analysis. Indeed, the coarse-grained analysis successfully captured the key findings found in the fine-grained analysis, providing reassurance that a coarse-grained analysis could be meaningful for curriculum analysis (Fig. 8).
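One plausible way to automate such a collapse is sketched below: each fine-grained LO is mapped to a coarse-grained parent, and the parent inherits the union of the criteria its fine-grained LOs satisfy before the summary is recomputed. The mapping and the aggregation rule are illustrative assumptions, not the procedure used in this study.

```python
# Illustrative sketch of collapsing fine-grained LO codes into coarse-grained ones
# before re-running the 3D-LAP summary; the mapping and aggregation rule are assumed.
from collections import defaultdict

# Assumed grouping of fine-grained LOs under coarse-grained LOs.
fine_to_coarse = {
    "Use the EPF to propose a mechanism (Nu + aldehyde or ketone)": "Propose reaction mechanisms",
    "Apply microscopic reversibility to predict a reverse mechanism": "Propose reaction mechanisms",
    "Explain how to differentiate structures using spectral data": "Interpret spectral data",
}

# Fine-grained codes for one science practice: indices of criteria judged satisfied.
fine_codes = {
    "Use the EPF to propose a mechanism (Nu + aldehyde or ketone)": {0, 1, 2},
    "Apply microscopic reversibility to predict a reverse mechanism": {0, 2},
    "Explain how to differentiate structures using spectral data": {0, 1, 2, 3},
}

def collapse(codes, mapping):
    """Each coarse LO inherits the union of criteria satisfied by its fine-grained LOs."""
    coarse = defaultdict(set)
    for fine_lo, satisfied in codes.items():
        coarse[mapping[fine_lo]] |= satisfied
    return dict(coarse)

print(collapse(fine_codes, fine_to_coarse))
# {'Propose reaction mechanisms': {0, 1, 2}, 'Interpret spectral data': {0, 1, 2, 3}}
```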


Fig. 8 Comparison of fine versus coarse grained analysis of learning outcomes.

This study did not specifically investigate the achieved components of the curriculum as these have been the focus of other studies (Flynn and Featherstone, 2017; Galloway et al., 2017, 2018; Webber and Flynn, 2018; Galloway et al., 2019; Lapierre and Flynn, 2020). For example, one study found that students in the new curriculum were better able to propose mechanisms for both familiar and unfamiliar reactions, and that students who drew mechanisms (when not asked to do so) outperformed students who did not draw mechanisms (Webber and Flynn, 2018). Students proficiently use the electron-pushing formalism (i.e., curved arrows), although some contexts remain challenging, such as intramolecular reactions (Flynn and Featherstone, 2017). Students work through mechanism questions in a process-oriented way, using concepts of nucleophiles and electrophiles to guide their thinking (Galloway et al., 2017). Studies of the ways in which students and experts categorize reactions have identified both static and process-oriented approaches (Galloway et al., 2018, 2019). In a larger class setting, students’ categorization methods became more advanced over time (Lapierre and Flynn, 2020).

Limitations

Our analysis of the enacted curriculum captured learning opportunities provided by the professor but we did not conduct classroom observations or collect data from students to determine what they were doing in and out of class. A laboratory course runs separately from but parallel to the course used in the proof of principle; the laboratory course was not used in the analysis but might have provided opportunities for students to achieve learning outcomes related to other aspects of the Framework.

The Framework was designed to have the three components integrated, creating 3D-learning (science practices, cross-cutting concepts, and core ideas). In our work, each SP, CC, and CI was analysed one dimension at a time, and not for the blending of the dimensions (i.e., the 3D nature of these dimensions); this decision was made to keep the analysis fine-grained and retain information that would be lost if we had only identified LOs with all three dimensions present.

The systems thinking learning outcomes represent a work in progress; they are currently being implemented and studied in many settings, including in our own research group.

The program-level LOs have not been analysed with respect to the five LO lenses or how they are enacted in courses, suggesting future directions and applications for this curriculum evaluation model.

Conclusions

In this work, we described a model for critically evaluating science curricula that analyses the course or program's learning outcomes through five lenses, using an overarching intended-enacted-achieved organization. The five lenses used for the analysis are: (i) the scientific Framework, analysed using the 3D-LAP, (ii) systems thinking, (iii) equity, diversity, and inclusion, (iv) professional skills, and (v) learning skills. As a proof of principle, we analysed one organic chemistry course for its intended and enacted components, with the achieved components being investigated and reported separately (Flynn, 2014; Stoyanovich et al., 2015; Bodé et al., 2016; Bodé and Flynn, 2016; Flynn and Featherstone, 2017; Galloway et al., 2017, 2018; Webber and Flynn, 2018; Galloway et al., 2019; Carle et al., 2020). The course we analyzed is part of a redesigned organic chemistry curriculum that takes a patterns and principles approach (mechanistic instead of functional group-based), which emphasizes scientific explanations over rote memorization (Flynn and Ogilvie, 2015). We anticipate that the evaluation model can be used across full chemistry and science programs, including in mapping exercises.

In the intended curriculum and with respect to the Framework, the findings starkly revealed some science practices, cross-cutting concepts, and core ideas that were well represented among the intended learning outcomes and others that did not appear at all. No learning outcomes explicitly addressed systems thinking or equity, diversity, and inclusion. Professional skills and learning skills were not addressed in the specific course's learning outcomes but were addressed to some extent in the chemistry program-level learning outcomes. Learning-to-learn skills were not part of the originally analysed version of the course. Since then, a Growth & Goals module has been integrated and piloted in that course, which includes aspects of metacognition (aligned with the course's intended learning outcomes), growth mindset (including resilience in the face of failure) (Dweck, 2006; 2015), goal-setting (Conzemius and O’Neill, 2006; Dallas, 2015), and self-regulated learning (Zimmerman, 1990; 2002).

The enacted curriculum was well aligned with the intended curriculum, as identified by analysing components that were taught, practiced, and assessed, both implicitly and explicitly.

Equal representation of every part of the lenses is not necessary or appropriate; for example, not every cross-cutting concept needs to appear in every section of every course. Some modules, courses (theory and laboratory), and programs may feature some aspects of the Framework more than others with additional variability arising based on learners’ needs (i.e., strengths and weaknesses), culminating in strong overall learning outcomes. These findings provide an important, evidence-based opportunity for discussing the curriculum as it stands to make decisions about what might be changed.

As next steps, the findings from this study will be used as a basis for conversations about the existing curriculum and ways it might be improved to better support students’ learning towards their goals as citizens and in their careers. Changes will further be informed by studies related to the achieved curriculum, in terms of learners’ reactions, learning, behaviour, and results (Kirkpatrick and Kirkpatrick, 2016). Aspects of educational equity, diversity, and inclusion (EDI) can be improved through assistive technologies and programmatic interventions, ensuring (and improving) communication and sustainability (Duerstock and Shingledecker, 2014), EDI training for faculty, staff, and graduate students, making diversity visible, etc. (Queen's University; National Research Council (NRC), 2012). Other curricula may also be considered, such as the chemical thinking curriculum (Talanquer and Pollard, 2017).

All the lenses for chemistry curriculum evaluation need additional research. The newer lenses (e.g., systems thinking and EDI) were included in this study, despite being less developed, because of their immense value and because they address important issues in chemistry and science education. One advantage of developing learning outcomes from the start is the ability to engage in a backward design educational process (Wiggins and McTighe, 2005), which is particularly possible for lenses such as EDI, systems thinking, and professional skills. While we have developed one learning skills module (O’Connor et al., 2020), other ways of equipping students with learning skills are possible.

Improved curricula are needed to meet learners’ needs in a rapidly changing world. We hope that this model for curriculum evaluation will prove useful to educators and administrators in their efforts to improve and reform curricula. Other educators and researchers may decide to use a different model, use different lenses in their model (e.g., accreditation criteria), or improve on the model described herein. If full curriculum re-designs do not seem feasible, this curricular evaluation model can help identify areas for iterative improvements, as we have demonstrated herein.

Conflicts of interest

There are no conflicts of interest to declare.

Appendix 1: new curricular models in chemistry with evaluations

Jewett et al. analysed common curricular challenges across institutions through an awareness-analysis-action model (Jewett et al., 2018). The chemical thinking curriculum (Talanquer and Pollard, 2010), first implemented at the University of Arizona, has been analysed against the curriculum's goals, successes, and challenges (Talanquer and Pollard, 2017). Another curriculum reform has been described in terms of its developmental model: Chemistry, Life, the Universe, and Everything (Cooper and Klymkowsky, 2013), from which studies have been published related to students’ learning (Williams et al., 2015; Underwood et al., 2016). Approaches have been documented that alter the typical course sequence, such as teaching bioorganic topics first, along with successes and problems with those approaches (Reingold, 2001). Other innovations have integrated new aspects into existing curricula and studied their effects, such as nuclear magnetic resonance (Fetterman, 2001; Webb et al., 2013), computational molecular modelling (Clauss and Nelsen, 2009), writing-to-teach (Vázquez et al., 2012), and writing-to-learn (Shultz and Gere, 2015; Finkenstaedt-Quinn et al., 2017).

Appendix 2: systems thinking learning outcomes

Learning outcomes related to systems thinking can be thought of from two perspectives: (i) demonstrating systems thinking skills using chemistry and (ii) demonstrating chemistry knowledge through systems thinking (Pazicni and Flynn, 2019). The knowledge and skills essential for systems thinking have been suggested by a few authors (Richmond, 1993, 1997; Assaraf and Orion, 2005; Calhoun et al., 2008; Orgill et al., 2019). The ones proposed below represent an adaptation of that work and focus specifically on knowledge and skills related to systems. A system is a model that possesses at least three characteristics: (i) components/parts, (ii) interconnections between the components, and (iii) a purpose (Arnold and Wade, 2015; Pazicni and Flynn, 2019). As an example, one such learning outcome is “to apply chemical bonding concepts to explain various aspects of toxicity, climate change, or the energy economy” (Pazicni and Flynn, 2019). By the end of a course/section/program, students should be able to:

• Identify, interpret, and explain the characteristics of a system, including components, boundaries, and processes—in particular those between disciplines (e.g., culture, gender, economics, chemistry, engineering).

• Identify relationships among the system's components (causal and correlational).

• Identify dynamic relationships within the system, including feedback loops.

• Predict and explain the emergent properties of the system, including intended/unintended consequences.

• Predict and explain the effect of changing components of a system (dynamic relationships), including intended/unintended consequences and strengths/weaknesses.

• Explain and predict how a system has changed and will change over time.

• Move between grain sizes to predict, interpret, and explain aspects of the system (e.g., global scale in chemistry to global scale in cultural impacts to molecular scale, or local, national, and international levels within a discipline/area).

Appendix 3: equity, diversity, and inclusion (EDI) framework for undergraduate chemistry programs

EDI-related learning outcomes

These learning outcomes were adapted from a number of sources, including a bias training module (Natural Sciences and Engineering Research Council of Canada, 2020) and Inferior, an evidence-based account of gender bias in science research (Saini, 2017).

By the end of the section, course, or program, the student should be able to:

• Define equity, diversity, and inclusion.

• Identify situations in which EDI issues have impacted scientific research.

• Propose methods to address EDI issues in current and upcoming research.

• Differentiate between conscious and unconscious bias.

• Demonstrate inclusive language and behaviour in courses and assignments.

EDI in the teaching and learning environment

The following approaches to incorporating EDI into university classes (teaching and learning environments) were identified by the Canadian Society for Chemistry's Working Group on Inclusion, Diversity, and Equity (Chemical Institute of Canada, 2018b) and are based on additional resources (American Chemical Society Committee on Chemists with Disabilities, 2001; TRC, 2012; Science and Engineering Leadership Initiative (SELI), 2019). The approaches fall into five categories: course and laboratory design, learning environment, connections, selection processes, and Indigenous students. Each category contains a series of opportunities to improve and address EDI considerations in a way that stands to impact an undergraduate program, as detailed below. For example, the course and laboratory design category suggests providing evidence of how principles of universal design are applied in a course; for the learning environment category, departments/institutions can describe the activities they offer aimed at involving and including equity-seeking groups; the connections category includes creating a webpage/section for EDI on department websites and identifying faculty/staff allies; the selection processes category speaks to the need for deliberate processes for nominating and selecting awardees, seminar speakers (role models), etc.; and the Indigenous students category explains how programs can help keep Indigenous students together in classes and ensure local access to elders and mentors (senior “cousins”). We anticipate that this EDI framework will continue to evolve and improve over time (Fig. 9).
Fig. 9 Equity, diversity, and inclusion framework for undergraduate science programs.

Course and laboratory design

• Provide evidence and/or tracking of related initiatives such as universal design (CAST, 2018), language, and academic accommodations.

• Address access, accommodations, and safety in the chemical laboratories.

• Provide appropriate training for faculty and staff on equity and diversity issues; departmental leaders are key persons who need to be educated on these issues.

Learning environment

• Departments should be able to describe what they are doing to ensure that their program is accessible and welcoming to all persons.

• Offer activities aimed at involving and including equity-seeking groups.

• Describe events that address EDI issues, e.g., conference symposia on EDI or integrated presentations/activities (e.g., Crudden, 2019).

Connections

• Provide evidence and/or tracking of student advising.

• Identify a selection of faculty and staff who are willing to serve as allies for, and mentors to, equity-seeking groups.

• Include a webpage for Equity and Diversity on the Department's website. The page could make clear that the Department welcomes all persons. Resources for faculty, staff, and students would be available on that page, including but not limited to resources for inclusive language, an LGBT+ glossary (LGBT+ = lesbian, gay, bisexual, transgender), a list of faculty and staff allies, links to relevant university resources on mental health, non-discrimination policies, the Equity Office, the Student Code of Conduct, a description of relevant activities in the department, and an invitation for students to contact appropriate lab staff to accommodate physical disabilities in the labs.

Selection processes

• Describe what efforts have been made to ensure that the selection of undergraduate awardees and ranking of scholarship applicants has been an equitable process.

• Describe how the principles of equity and diversity have been considered for speakers in the departmental seminar series, including speaker nominations and invitations.

• Describe the diversity of the faculty including a description of appointment and promotion processes at the university. What efforts have been made to ensure adherence to equity policies relevant to appointments and promotion?

• Describe the institution's recruitment and admission policies for students, both graduate and undergraduate, to determine to what extent they are equitable and what efforts are being made to attract and retain underrepresented groups.

Indigenous students

• If a university has multiple Indigenous students in the same program (e.g., a group of Indigenous pre-nursing students who must take Chemistry, Biology, Mathematics, Introductory Computer Science, etc.), a group of departments could ensure that the students get access to small class sizes for as many classes as possible and attempt to keep the students together in the same cohort. Because the students are (mostly) in the same classes, they get to know and support each other. For example, a cohort of six such students could take introductory Chemistry, Biology, and Mathematics at the same time, taught by a dedicated instructor.

• Recognizing that many Indigenous students coming from remote communities do not have the same opportunities for learning STEM disciplines, owing to the difficulties associated with attracting and retaining qualified STEM instructors, departments should be open to offering preparatory courses to small groups of Indigenous students prior to their first-year studies to assist in the transition to university-level STEM courses. Alternatively, the department, with the university, could make designated teaching assistants available to Indigenous students, funded using allocated university funds. Volunteer-driven learning communities can also provide needed assistance to first-year Indigenous students.

• Ensure access to elders and mentors (senior “cousins”) for Indigenous students to aid in the transition from small communities to the university environment.

• Offer/participate in summer workshops for Indigenous students and teachers employed at schools within Indigenous communities.

• Offer/participate in programs where a TA or instructor works with the community to provide a meaningful lab experience to Indigenous students, with the goal of training the teachers to improve and sustain the quality of STEM instruction.

• Provide personal, social, and cultural support to ensure the academic and personal success of Indigenous students.

• See additional considerations in the Calls to Action of the Truth and Reconciliation Commission of Canada (http://www.trc.ca).

Appendix 4: learning outcomes for the learning-to-learn module, “Growth & Goals” (Flynn, 2019)

By the end of the Growth & Goals module, the learner should be able to:

(1) Use the concept of self-regulated learning and its associated 3-phase learning cycle to:

(a) Describe self-regulated learning in your own words

(b) Describe each of the three phases of the self-regulated learning cycle in your own words

(c) Identify common myths about learning

(d) Self-assess study habits and thinking

(e) Rate personal feelings towards a course

(2) Use the concept of mindset to:

(a) Describe a growth and fixed mindset in your own words

(b) Identify growth and fixed mindset statements

(c) Transform fixed mindset statements into growth mindset statements

(d) Construct strategies to deal with failure and build resiliency

(3) Use goal-setting skills to:

(a) Identify and construct SMART goals

(b) Construct a personalized schedule for a university semester to achieve goals

(c) Define and refine your priorities and use them to set your own goals for a course or personal endeavour

(4) Use the concept of metacognition to:

(a) Rate your current ability towards the course's learning outcomes and provide an explanation for your rating

(b) Identify resources and strategies that you will use to reach your goals

(c) Explain to what extent the skills acquired from the module can be used in other settings

(d) Apply skills from the Growth & Goals module to other courses and life challenges

(e) Describe the course's intended learning outcomes in your own words

Appendix 5: Kirkpatrick model for the achieved curriculum – explanation

The Four-Level Kirkpatrick model was selected to evaluate the achieved curriculum (Kirkpatrick, 1996; Kirkpatrick and Kirkpatrick, 2016); academic adaptations of this model have been proposed previously (Praslova, 2010). Reaction (level 1) is “the degree to which participants find the training favourable, engaging, and relevant to their training [learning experience]” (Kirkpatrick, 1996). In the context of student learning in an academic setting, Reaction can refer to the degree to which students find the classroom and laboratory learning opportunities favourable, engaging, and relevant to thinking and functioning like a scientist. Learning (level 2) is “the degree to which participants acquire the intended knowledge, skills, attitude, confidence and commitment based on their participation in the training” (Kirkpatrick, 1996). Aspects of this level may be measured through assessment instruments, customized to individual courses, whose validity and reliability are pre-established. Behaviour (level 3) is “the degree to which participants apply [transfer] what they learned during training when they are back on the job” (Kirkpatrick, 1996), which can include how students transfer what they learned to other classes, lab experiments (course-based and/or research-based), or their careers. Results (level 4) is “the degree to which targeted outcomes occur as a result of the training and support and accountability package” (Kirkpatrick, 1996). Graduation rates, retention rates, recruitment methods, and diversity statistics are data that could provide a higher-level view of the Results level.

Related to Behaviours, Required drivers are the processes and systems that reinforce, monitor, encourage, and reward performance of critical behaviours (Kirkpatrick and Kirkpatrick, 2016). These Required drivers could include helping students see the relevance of SPs, CCs, CIs, knowledge, and skills from prior courses by explicitly applying prior course concepts when learning new concepts (activation of prior knowledge), through classroom activities and assessments.

Appendix 6: course context

Organic Chemistry I is offered in the winter semester of the first year of undergraduate studies. Organic Chemistry II is offered in the fall semester, after students have completed Organic Chemistry I. Intermediate Organic Chemistry (i.e., Organic Chemistry III) and Advanced Organic Synthesis and Reaction Mechanisms (i.e., Organic Chemistry IV) are also offered in the fall. Each course is 12 weeks long and includes 3 hours of class and a 1.5-hour discussion group per week. Organic Chemistry I also includes a laboratory component, which is 3 hours long and runs biweekly; Organic Chemistry II and III have separate laboratory courses associated with each theory course.

The [institution] uses an organic chemistry curriculum that is intended to help students develop expert-like thinking about organic chemistry, develop skills that equip them to interpret new structures and predict reactivity, and move away from rote memorization (Fig. 2) (Flynn and Ogilvie, 2015). We conducted studies to explore aspects of student learning in the new curriculum (Flynn, 2014; Stoyanovich et al., 2015; Bodé and Flynn, 2016; Bodé et al., 2016; Flynn and Featherstone, 2017; Galloway et al., 2017, 2019; Webber and Flynn, 2018; Carle et al., 2020); however, to date we had not conducted an overall curriculum evaluation.

Organic Chemistry I includes structure, properties, stereochemistry, and conformational analysis of organic compounds; the electron-pushing formalism/symbolism; acid–base chemistry; simple π bond electrophile reactions (e.g., 1,2-carbonyl addition reactions with Grignard reagents, imine reductions); and π bond nucleophile reactions (alkenes, alkynes, and aromatics).

Organic Chemistry II begins with Module 1—a review, then Module 2 comprises σ electrophile reactions (i.e., SN1, SN2, E1, E2 and oxidation reactions), Module 3 involves IR and 1H NMR spectroscopy, and Module 4 includes more advanced π bond electrophiles (e.g., 1,2-carbonyl addition reactions, acetals and imine formation, addition–elimination reactions, aldol reactions, etc.).

The section of the course analysed in the pilot study was taught using a flipped format (Flynn, 2015, 2017). In this flipped format, the “lectures” are given as short online videos and notes that students are expected to watch and read before coming to class; alternatively, they can read the textbook or use other sources. Students complete a pre-class test before the first class in each week and an assignment at the end of each week. Class time is used for interactive, problem-solving activities to help students achieve the intended learning outcomes.

Appendix 7: learning outcomes for Organic Chemistry II (course level)

Fine-grained learning outcomes

By the end of an Organic Chemistry II course, students should be able to:
Module 1 (N = 23). 1. Quickly (<5 seconds) calculate the formal charge on an atom.

2. Quickly (<2 seconds) indicate the direction of a dipole (δ+, δ−) of any bond.

3. Quickly (<10 seconds) translate between representations: draw a Lewis structure from a line structure, condensed structure, or name (and vice versa).

4. Quickly (<2 seconds) identify the hybridization of an atom.

5. Draw resonance structures for a given molecule.

6. Rank all the resonance structures for a molecule based on their contribution to the resonance hybrid.

7. Interpret and employ symbolism by either drawing a reaction mechanism (i.e., curved arrows), given the starting materials, intermediate and/or products or drawing intermediates/products given curved arrows.

8. Draw a reaction mechanism for reactions seen in CHM 1321/1721 (Organic Chemistry I).

9. Predict the product(s) for reactions seen in CHM 1321/1721 given starting materials.

10. Identify the most acidic proton, given a molecule(s).

11. Identify the most basic atom, given a molecule(s).

12. Predict the direction of an acid–base equilibrium.

13. Justify your prediction of the direction of an equilibrium using physical characteristics or pKa values (depending on the reactants).

14. Identify predominant form of compound at a given pH.

15. Identify factors required to compare the stability of (conjugate) bases.

16. Protonate an atom, including drawing a mechanism and products.

17. Deprotonate a molecule, including drawing a mechanism and products.

18. Decide whether a system is aromatic, anti-aromatic, or non-aromatic.

19. Explain why a system is aromatic, anti-aromatic, or non-aromatic.

20. Identify types and/or locations of molecular orbitals (MOs).

21. Use MO theory to portray a reaction.

22. Propose a retrosynthesis and synthesis using reaction types seen in CHM 1321/1721.

23. Demonstrate how a reaction fits into a patterns-of-mechanism framework.

Module 2: reactions of σ electrophiles—E1, E2, SN1, SN2, and oxidation reactions (N = 28). 1. Identify the leaving group, alpha (α) carbon, base, nucleophile, and solvent in a given reaction.

2. Indicate suitability of substrate, LG, B/Nu, solvent structure for a given reaction.

3. Draw and/or describe reactive conformation/orientation using an appropriate representation (e.g. Newman projection) to justify the stereochemical outcome of a reaction.

4. Convert an OH group into a better leaving group (two ways).

5. Decide whether a reaction is likely to proceed via an E1/SN1, E2, or SN2 mechanism.

6. Decide whether a rearrangement is likely to take place and/or draw rearranged product.

7. Draw a mechanism for an E1, SN1, E2, and SN2 reaction, given various starting materials.

8. Draw and label the reaction coordinate diagram for E1, SN1, E2, and SN2 reactions.

9. Associate reaction coordinate diagram with a mechanism.

10. Predict the major and/or minor products of the reaction, including stereochemical and regiochemical considerations.

11. Draw the orbitals involved in a given reaction.

12. Rank leaving groups based on their rate of reaction/reactivity.

13. Rank substrates based on their rate of reaction/reactivity.

14. Rank nucleophiles/bases based on their rate of reaction/reactivity.

15. Rank solvents based on their effect on rate of reaction.

16. Rank relative intermediate or TS energies/stabilities.

17. Associate a rate equation with mechanism.

18. Use rate equation to solve for an unknown (k, rate, etc.).

19. Explain and/or draw how ΔG or rate is related to Nu, E, solvent structure.

20. Draw a mechanism for an alcohol oxidation reaction (a type of E2 reaction).

21. Demonstrate link between E2 mechanism and oxidation reaction.

22. Identify oxidation/reduction reactions or oxidized/reduced species.

23. Select appropriate oxidizing/reducing reagent for a given transformation.

24. Provide the starting materials or products of an oxidation/reduction reaction.

25. Explain a reaction mechanism and why the reaction proceeds (or not).

26. Design a synthesis involving elimination, substitution, and CHM1321 reactions.

27. Demonstrate a retrosynthetic approach to designing a synthesis.

28. Demonstrate how a reaction fits into a patterns-of-mechanism framework.

Module 3: spectroscopy (N = 15). 1. Determine the degrees of unsaturation (DU), given a molecular formula.

2. Distinguish types of vibration (stretching vs. bending, symmetric vs. asymmetric, in-plane vs. out-of-plane) in infrared.

3. Describe the mechanism of radiation absorption and emission.

4. Identify the key functional group(s) present in the molecule, given IR data.

5. Justify IR signal position, shape, strength.

6. Identify structural features that lead to distinguishing spectral features, or signals indicative of unique structural features.

7. Identify relationship between vibrational frequency and bond strength, atom size (reduced mass).

8. Use the concept of shielding to predict the chemical shift of 1H NMR signals.

9. Justify chemical shift, multiplicity, and integration.

10. Use integration to determine number of protons represented by a signal.

11. Use principle of splitting to determine signal multiplicity or information about neighbouring protons (limited to n + 1 rule).

12. Propose structure given some of: molecular formula, degrees of unsaturation, infrared spectrum and/or 1H NMR spectra/spectral data, and/or reactants.

13. Explain how to differentiate between multiple structures using molecular formula, degrees of unsaturation, and spectral data.

14. Given a structure, predict multiplicity, integration, number of signals, relative signal intensity.

15. Corroborate structure, given spectra or reaction (e.g., assign signals).

Module 4: π electrophiles—carbonyl-based groups and analogues (N = 46).
Part a: reactions of π electrophiles bearing a leaving group (i.e., carboxylic acid derivatives). 1. Name carbonyl-containing structures; draw structures from names.

2. Use the electron-pushing formalism (EPF, curved arrows) to show reaction of a Nu with a π electrophile bearing a leaving group.

3. Use structural features of migrating groups to justify migratory aptitude (Baeyer–Villiger oxidation).

4. Use MOs to justify retention of stereochemistry in migrating group (Baeyer–Villiger oxidation).

5. Draw orbitals involved in reaction of a Nu with a π electrophile bearing a leaving group.

6. Use structural characteristics to identify potential nucleophiles/nucleophilic positions and leaving groups.

7. Use the EPF to demonstrate the mechanism for acid- and base-catalysed acyl transfer reactions.

8. Assess the capacity for an electrophile to undergo subsequent nucleophilic additions.

9. Use structural characteristics to assess/compare leaving group ability.

10. Rank carboxylic acid derivatives based on electrophilicity/reactivity.

11. Rank nucleophilicity.

12. Determine the direction of an equilibrium between two carboxylic acid derivatives based on their relative stabilities.

13. Demonstrate how to manipulate an equilibrium to promote reactants/products.

14. Identify/employ reagents that will convert a given carboxylic acid derivative to more activated forms (e.g., acid chlorides, anhydrides).

15. Use entropic reasoning to justify reaction outcome.

16. Apply the principle of microscopic reversibility to predict mechanism of a reverse reaction.

17. Apply umpolung reactivity to synthesis problems.


Part b: reactions of nucleophiles with π electrophiles bearing a hidden leaving group. 1. Use the electron-pushing formalism to show formation/hydrolysis of hemi/acetals and/or hemi/ketals.

2. Apply reductive amination to overcome problematic SN2 reactions with amines.

3. Apply the principle of microscopic reversibility to predict mechanism of a reverse reaction.

4. Demonstrate how to manipulate an equilibrium to promote reactants/products.


Part c: reactions of alpha carbon nucleophiles with electrophiles. 1. Use the EPF to demonstrate enol/ate formation under acidic and basic conditions.

2. Use the EPF to show how an alpha carbon nucleophile reacts with an electrophile.

3. Predict kinetic vs. thermodynamic enolate formation based on asymmetric substrate structure and reaction conditions (base structure).

4. Identify kinetic vs. thermodynamic enolate on a reaction coordinate diagram.

5. Identify conditions under which epimerization would occur; use the EPF to demonstrate epimerization and its stereochemical consequences.

6. Use substituent effects to assess relative pKa values of carbon acids.

7. Identify features of a base that form an enolate in a way that avoids self-condensation and minimizes its potential to act as a nucleophile.

8. Use the EPF to demonstrate how enols and enolates may act as nucleophiles.

9. Assess degree of (mono vs. poly) halogenation of enol based on (acidic or basic) conditions.

10. Use structural features to assess stability of keto vs. enol forms.

11. Demonstrate the use of enol/ates toward the formation of C–C bonds (alkylation, Dieckmann, inter- and intramolecular aldol, Claisen).

12. Identify factors that relate structure of carbonyl compounds to their reactivity (e.g., aldehyde vs. ketone).

13. Use the EPF to demonstrate preparation of a phosphorus ylide.

14. Use the EPF to demonstrate how to convert an aldehyde or ketone to the corresponding alkene using a phosphorus ylide (Wittig).

15. Identify structural characteristics of the reacting partners that dictate stereoselectivity.

16. Apply retrosynthetic analysis to identify C–C disconnections that could be associated with alkylation, Dieckmann, inter- and intramolecular aldol, Claisen.


Part d: synthesis. 1. Use mapping to identify starting material atoms that were integrated into the product.

2. Identify bonds broken/formed, atoms added/removed.

3. Identify positions where multiple regiochemical or stereochemical outcomes are possible.

4. Use reagent structure to anticipate regiochemical and stereochemical outcomes.

5. Propose a synthesis using reactions from Org I & II given specific starting materials.

6. Design retrosynthetic analysis.

7. Draw product given reactants and reagents.

8. Identify specific reactions/reagents to form given bonds or a particular structural motif.

9. Draw synthons for a bond or structural fragment, based on a complex starting material (e.g., Taxol).

Coarse-grained learning outcomes

Module 4: π electrophiles—carbonyls and analogues (N = 12). 1. Use the EPF to propose a mechanism for reactions involving a carbonyl group.

2. Connect structural characteristics to energy, reactivity, and function (rank relative reactivity, make predictions, justifications, etc.).

3. Identify relevant molecular orbitals in a reaction involving a carbonyl group.

4. Use orbitals to predict the outcome of a reaction involving a carbonyl group.

5. Construct molecular orbitals associated with carbonyl-containing compounds.

6. Determine the direction of an equilibrium involving carbonyl-containing compounds and their derivatives.

7. Describe how to manipulate an equilibrium to favour either products or reactants.

8. Describe and apply synthetic strategies to synthetic problems (e.g., mapping, drawing synthons, identifying bonds broken/formed).

9. Propose a synthesis for a target compound using reactions learned to date in the course.

10. Identify, predict, and justify the stereoselectivity or stereospecificity of a reaction.

11. Identify, predict, and justify the chemoselectivity of a reaction.

12. Identify, predict, and justify the regioselectivity or regiospecificity of a reaction.

Appendix 8: uOttawa chemistry program-level learning outcomes (2015)

Knowledge

• Describe the scientific theories and concepts in a given subject.

• Solve problems using the scientific theories and concepts in a given subject.

• Integrate knowledge from across components of the course and from prerequisite courses.

• Generate something new from their knowledge in a given subject (e.g., propose a new synthesis of a compound, design a new nanomaterial).

Scientific skills

• Define problems clearly, develop testable hypotheses, design and execute experiments, analyse data and draw appropriate conclusions.

• Make scientifically defensible arguments.

• Demonstrate computational and data-processing skills.

• Demonstrate numeracy and calculation skills, including error analysis, order-of-magnitude estimations, correct use of units, statistical analysis.

• Describe the strengths and limitations of experimental methods and results.

Laboratory skills

• Use and describe safe disposal techniques, comply with safety regulations, use MSDS sheets.

• Demonstrate appropriate laboratory skills in common laboratory techniques (e.g., syringe techniques, TLC).

• Demonstrate appropriate laboratory skills in using laboratory instruments (e.g., GCMS, UV-vis).

• Record observations and measurements of chemical properties, events, and changes in a reliable way.

• Design an experiment.

• Follow an experimental protocol.

• Apply scientific literacy skills to find and critically evaluate (quality and limitations of knowledge) specific information such as chemical properties (e.g., CRC handbook), experimental procedures (e.g., SciFinder), and conduct literature reviews.

• Organize information through spreadsheet use, reference management, etc.

Communication skills

• Present information in a clear and organized manner by a variety of means (e.g., oral presentation, poster, video).

• Write well-organized and concise reports in an appropriate style (diverse audiences, range of purposes).

• Use chemical structure drawing programs appropriately.

• Work in a group, including being an effective leader, team member, and working with a diverse group of peers.

• Define and implement solutions to problems related to working in a group.

• Communicate with peers, superiors, and subordinates in a professional manner (e.g., emails, cover letters).

Values

• Conduct themselves responsibly in a laboratory and course.

• Describe the role of chemistry in contemporary societal and global issues.

• Plan and organize effectively, manage time and assets, respect deadlines.

• Self-assess and self-direct their own learning (i.e., possess metacognitive abilities).

Appendix 9: reliability

First, the second rater was given a copy of the 3D-LAP and the meaning of the terms in the protocol was discussed. Next, the rater applied the 3D-LAP to a complete list of learning outcomes for Module 4 (Table 4, Stage 1). The two raters then discussed the criteria of the 3D-LAP in further detail and clarified any ambiguity in the wording of the LOs under investigation; the specific LOs were not discussed in the context of the model. The second rater applied the protocol a second time to the same set of LOs (Table 4, Stage 2). Finally, both raters discussed each LO in the context of the model and came to a consensus (Table 4, Stage 3). The Krippendorff's α values and percent agreement are tabulated in Table 4 (Krippendorff, 2011).
Table 4 Summary of Krippendorff's α and percent agreement for the inter-rater reliability analysis of Module 4

            Stage 1                  Stage 2                  Stage 3
Module 4    α        Agreement, %    α        Agreement, %    α        Agreement, %
Part A      0.687    90              0.898    97              1.000    100
Part B      0.526    83              0.733    91              1.000    100
Part C      0.537    84              0.713    91              1.000    100
Part D      0.471    83              0.705    92              1.000    100


The percent agreement between the raters began at moderate-to-high values and improved as the discussions became more detailed. The α values lagged behind percent agreement in the first two stages because of the larger proportion of ratings that identified a criterion as absent: when many criteria are not observed, there is a higher probability of chance agreement and, in turn, a correspondingly lower α value. This three-stage approach sheds light on how reliably the 3D-LAP may be applied in the context of a curriculum evaluation model. Importantly, it highlighted the need for clearly worded LOs and for all users to share a common understanding of the meaning of each criterion of the 3D-LAP.
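The chance-agreement effect described above can be illustrated with a short calculation. The following Python sketch (not the analysis script used in this study) computes percent agreement and Krippendorff's α for two raters coding a criterion as present/absent, using the standard coincidence-matrix formulation for nominal data, α = 1 − D_o/D_e (Krippendorff, 2011). The ratings are hypothetical: because most criteria are coded as absent, the two raters agree on 85% of items, yet α is only about 0.63, mirroring the gap between agreement and α seen in Stages 1 and 2 of Table 4.

```python
from collections import Counter
from itertools import product

def percent_agreement(rater_a, rater_b):
    """Simple percent agreement between two raters over the same items."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100.0 * matches / len(rater_a)

def krippendorff_alpha_nominal(rater_a, rater_b):
    """Krippendorff's alpha for two raters and nominal data (no missing ratings),
    computed from the coincidence matrix: alpha = 1 - D_o / D_e."""
    coincidence = Counter()
    for v1, v2 in zip(rater_a, rater_b):   # each unit contributes both ordered pairs
        coincidence[(v1, v2)] += 1
        coincidence[(v2, v1)] += 1
    categories = set(rater_a) | set(rater_b)
    n_c = {c: sum(coincidence[(c, k)] for k in categories) for c in categories}
    n = sum(n_c.values())                  # total pairable values = 2 x number of units
    # Observed and expected disagreement (nominal metric: any mismatch counts as 1)
    d_o = sum(coincidence[(c, k)] for c, k in product(categories, repeat=2) if c != k) / n
    d_e = sum(n_c[c] * n_c[k] for c, k in product(categories, repeat=2) if c != k) / (n * (n - 1))
    return 1.0 if d_e == 0 else 1.0 - d_o / d_e

# Hypothetical present(1)/absent(0) codes for 20 learning-outcome x criterion pairs;
# most criteria are coded absent, which inflates agreement expected by chance.
rater_1 = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1]
rater_2 = [1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 1]

print(f"Percent agreement: {percent_agreement(rater_1, rater_2):.0f}%")             # 85%
print(f"Krippendorff's alpha: {krippendorff_alpha_nominal(rater_1, rater_2):.3f}")  # ~0.633
```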

Acknowledgements

The University of Ottawa supported this work. We thank Tyler White for their work as the second rater.

References

  1. Advance HE, (2020), Advance HE's Equality Charters.
  2. American Chemical Society Committee on Chemists with Disabilities, (2001), in Miner D. L., Nieman R. and Swanson A. B. (ed.), Teaching Chemistry to Students with Disabilities: A Manual for High Schools, Colleges, and Graduate Programs, 4th edn, The American Chemical Society.
  3. Amos R. and Levinson R., (2019), Socio-scientific inquiry-based learning: An approach for engaging with the 2030 Sustainable Development Goals through school science, Int. J. Dev. Educ. Glob. Learn., 11(1), 29–49.
  4. Anderson J., (2017), Experts on the Future of Work, Jobs Training and Skills, Pew Res. Cent., https://www.pewresearch.org/internet/2017/05/03/the-future-of-jobs-and-jobs-training/.
  5. AODA, (2014), The Act (Accessibility for Ontarians with Disabilities Act).
  6. Arafeh S., (2016), Curriculum mapping in higher education: a case study and proposed content scope and sequence mapping tool, J. Furth. High. Educ., 40(5), 585–611.
  7. Archambault S. G. and Masunaga J., (2015), Curriculum Mapping as a Strategic Planning Tool, J. Libr. Adm., 55(6), 503–519.
  8. Arnold R. D. and Wade J. P., (2015), A Definition of Systems Thinking: A Systems Approach, Procedia Comput. Sci., 44, 669–678.
  9. Assaraf O. B.-Z. and Orion N., (2005), Development of system thinking skills in the context of earth system education, J. Res. Sci. Teach., 42(5), 518–560.
  10. Association of American Colleges, (2007), College Learning for the New Global Century.
  11. Australian Government Department of Education S. and E., (2013), Australian Qualifications Framework, 2nd edn.
  12. Bickman L., (2005), in Mathison S. (ed.), Encyclopedia of Evaluation, SAGE Publications.
  13. Biggs J., (1999), What the Student Does: teaching for enhanced learning, High. Educ. Res. Dev., 18(1), 57–75.
  14. Bodé N. E. and Flynn A. B., (2016), Strategies of Successful Synthesis Solutions: Mapping, Mechanisms, and More, J. Chem. Educ., 93(4), 593–604.
  15. Bodé N. E., Caron J., and Flynn A. B., (2016), Evaluating students’ learning gains and experiences from using nomenclature101.com, Chem. Educ. Res. Pract., 17(4), 1156–1173.
  16. Duerstock B. S. and Shingledecker C. A. (ed.), (2014), From college to careers: Fostering inclusion of persons with disabilities in STEM, The American Association for the Advancement of Science.
  17. Brookhart S. M., (2001), Successful Students’ Formative and Summative Uses of Assessment Information, Assess. Educ. Princ. Policy Pract., 8(2), 153–169.
  18. Calhoun J. G., Ramiah K., Weist E. M. G., and Shortell S. M., (2008), Development of a core competency model for the master of public health degree, Am. J. Public Health, 98(9), 1598–1607.
  19. Carle M., Visser R. and Flynn A. B., (2020), Evaluating students' learning gains, strategies, and errors using OrgChem101's module: organic mechanisms–mastering the arrow, Chem. Educ. Res. Pract., 21, 582–596.
  20. Carleton College, (2020), InTeGrate, Sci. Educ. Resour. Cent, https://serc.carleton.edu/integrate/index.html.
  21. CAST, (2018), Universal Design for Learning Guidelines version 2.2.
  22. Chemical Institute of Canada, (2018a), CSC President's Event fosters brainstorming session, Community Connect.
  23. Chemical Institute of Canada, (2018b), CSC President's Event fosters brainstorming session, Community Connect.
  24. Clauss A. D. and Nelsen S. F., (2009), Integrating Computational Molecular Modeling into the Undergraduate Organic Chemistry Curriculum, J. Chem. Educ., 86(8), 955.
  25. Conzemius A. and O’Neill J., (2006), The handbook for SMART school teams, Solutions Tree.
  26. Cooper M. M., (2013), Chemistry and the Next Generation Science Standards, J. Chem. Educ., 90(6), 679–680.
  27. Cooper M. M. and Klymkowsky M., (2013), Chemistry, Life, the Universe, and Everything: A New Approach to General Chemistry, and a Model for Curriculum Reform, J. Chem. Educ., 90(9), 1116–1122.
  28. Cooper M. M., Caballero M. D., Ebert-May D., Fata-Hartley C. L., Jardeleza S. E., Krajcik J. S., et al., (2015), Challenge faculty to transform STEM learning, Science, 350(6258), 281–282.
  29. Coppola B. P., (2015), An Inevitable Moment: US Brain Drain, Chang. Mag. High. Learn., 47(4), 36–45.
  30. Crudden C., (2019), Special Sessions|102nd Canadian Chemistry Conference and Exhibition, Can. Soc. Chem, https://www.ccce2019.ca/science-advocacy.
  31. Dallas J., (2015), SMART Goals: Everything you need to know about setting S.M.A.R.T. goals, Rex Vault Publishing.
  32. Dole J. A. and Sinatra G. M., (1998), Reconceptualizing change in the cognitive construction of knowledge, Educ. Psychol., 33(2/3), 109–128.
  33. Dweck C. S., (2006), Mindset: The new psychology of success, Random House.
  34. Dweck C. S., (2015), Carol Dweck Revisits the “Growth Mindset”.
  35. Edmondson K. M., (1995), Concept mapping for the development of medical curricula, J. Res. Sci. Teach., 32(7), 777–793.
  36. Ervin L., Carter B., and Robinson P., (2013), Curriculum mapping: not as straightforward as it sounds, J. Vocat. Educ. Train., 65(3), 309–318.
  37. Etkin N. (ed.), (2016), Making Chemistry Inclusive, Hayden-McNeil.
  38. Fetterman D. M., (2001), Foundations of Empowerment Evaluation, Sage Publications.
  39. Finkenstaedt-Quinn S. A., Halim A. S., Chambers T. G., Moon A., Goldman R. S., Gere A. R., and Shultz G. V., (2017), Investigation of the Influence of a Writing-to-Learn Assignment on Student Understanding of Polymer Properties, J. Chem. Educ., 94(11), 1610–1617.
  40. Flynn A. B., (2014), How do students work through organic synthesis learning activities? Chem. Educ. Res. Pract., 15(4), 747–762.
  41. Flynn A. B., (2015), Structure And Evaluation Of Flipped Chemistry Courses: Organic & Spectroscopy, Large And Small, First To Third Year, English And French, Chem. Educ. Res. Pract., 16, 198–211.
  42. Flynn A. B., (2017), Flipped Chemistry Courses: Structure, Aligning Learning Outcomes, and Evaluation, Online Approaches to Chemical Education, American Chemical Society, pp. 151–164.
  43. Flynn A. B., (2018), Growth & Goals: A module to help students take greater control of their learning.
  44. Flynn A. B., (2019), Flynn Research Group, Flynn Res. Gr. Website.
  45. Flynn A. B. and Featherstone R. B., (2017), Language of mechanisms: exam analysis reveals students’ strengths, strategies, and errors when using the electron-pushing formalism (curved arrows) in new reactions, Chem. Educ. Res. Pract., 18(1), 64–77.
  46. Flynn A. B. and Ogilvie W. W., (2015), Mechanisms before Reactions: A Mechanistic Approach to the Organic Chemistry Curriculum Based on Patterns of Electron Flow, J. Chem. Educ., 92(5), 803–810.
  47. Freeman S., Eddy S. L., McDonough M., Smith M. K., Okoroafor N., Jordt H., and Wenderoth M. P., (2014), Active learning increases student performance in science, engineering, and mathematics, Proc. Natl. Acad. Sci. U. S. A., 111(23), 8410–8415.
  48. Galloway K. R., Leung M. W., and Flynn A. B., (2019), Patterns of Reactions: a card sort task to investigate students’ organization of organic chemistry reactions, Chem. Educ. Res. Pract., 20(1), 30–52.
  49. Galloway K. R., Stoyanovich C., and Flynn A. B., (2017), Students’ interpretations of mechanistic language in organic chemistry before learning reactions, Chem. Educ. Res. Pract., 18(2), 353–374.
  50. Galloway K. R., Leung M. W., and Flynn A. B., (2018), A Comparison of How Undergraduates, Graduate Students, and Professors Organize Organic Chemistry Reactions, J. Chem. Educ., 95(3), 355–365.
  51. Gibbons R. E., Villafañe S. M., Stains M., Murphy K. L., and Raker J. R., (2018), Beliefs about learning and enacted instructional practices: An investigation in postsecondary chemistry education, J. Res. Sci. Teach., 55(8), 1111–1133.
  52. Grohs J. R., Kirk G. R., Soledad M. M., and Knight D. B., (2018), Assessing systems thinking: A tool to measure complex reasoning through ill-structured problems, Think. Ski. Creat., 28, 110–130.
  53. Harden R. M., (2001), AMEE Guide No. 21: Curriculum mapping: a tool for transparent and authentic teaching and learning. Med. Teach., 23(2), 123–137.
  54. Herrington D. G., Yezierski E. J., and Bancroft S. F., (2016), Tool trouble: Challenges with using self-report data to evaluate long-term chemistry teacher professional development, J. Res. Sci. Teach., 53(7), 1055–1081.
  55. Hubball H. and Burt H., (2007), Learning Outcomes and Program-level Evaluation in a Four-year Undergraduate Pharmacy Curriculum, Am. J. Pharm. Educ., 71(5), 90.
  56. Imansari N. and Sutadji E., (2017), A Conceptual Framework Curriculum Evaluation Electrical Engineering Education, Int. J. Eval. Res. Educ., 6(4), 265–269.
  57. Inclusion and Diversity Committee of the Royal Society of Chemistry, (2018a), Breaking the Barriers, Thomas Graham House.
  58. Inclusion and Diversity Committee of the Royal Society of Chemistry, (2018b), Diversity landscape of the chemical sciences, Thomas Graham House.
  59. Jewett S., Sutphin K., Gierasch T., Hamilton P., Lilly K., Miller K., et al., (2018), Awareness, Analysis, and Action: Curricular Alignment for Student Success in General Chemistry, J. Chem. Educ., 95(2), 242–247.
  60. Kellamis N. M. and Yezierski E. J., (2019), Applying the Next Generation Science Standards to Current Chemistry Classrooms: How Lessons Measure Up and How to Respond, J. Chem. Educ., 96(7), 1308–1317.
  61. Kirkpatrick D. L., (1996), Great Ideas Revisited. Techniques for Evaluating Training Programs. Revisiting Kirkpatrick's Four-Level Model, Train. Dev., 50(1), 54–59.
  62. Kirkpatrick J. D. and Kirkpatrick W. K., (2016), Kirkpatrick's Four levels of Training Evaluation, Association for Talent Development.
  63. Kolomitro K. and Gee K., (2015), Developing Effective Learning Outcomes: A Practical Guide.
  64. Klymkowsky M. W. and Cooper M. M., (2012), Now for the hard part: the path to coherent curricular design, Biochem. Mol. Biol. Educ., 40(4), 271–272.
  65. Krippendorff K., (2011), Computing Krippendorff's Alpha-Reliability.
  66. Lam B. H. and Tsui K. T., (2016), Curriculum mapping as deliberation – examining the alignment of subject learning outcomes and course curricula, Stud. High. Educ., 41(8), 1371–1388.
  67. Lapierre K. R. and Flynn A. B., (2020), An electronic card sort task to investigate students’ changing interpretations of organic chemistry reactions, J. Res. Sci. Teach., 57(1), 87–111.
  68. Laverty J. T., Underwood S. M., Matz R. L., Posey L. A., Carmel J. H., Caballero M. D., et al., (2016), Characterizing College Science Assessments: The Three-Dimensional Learning Assessment Protocol, PLoS One, 11(9), e0162333.
  69. Mahaffy P. G., Krief A., Hopf H., Mehta G., Matlin S. A., and Henning H., (2018), Reorienting chemistry education through systems thinking, Nat. Rev. Chem., 2, 126.
  70. Marsh C. J. and Willis G., (1994), Curriculum: Alternative approaches, ongoing issues, Merrill.
  71. Matlin S. A., Mehta G., and Hopf H., (2015), Chemistry embraced by all, Science, 347(6227), 1179.
  72. Matlin S. A., Mehta G., Hopf H., and Krief A., (2016), One-world chemistry and systems thinking, Nat. Chem., 8(5), 393–398.
  73. McGill T. L., Williams L. C., Mulford D. R., Blakey S. B., Harris R. J., Kindt J. T., et al., (2019), Chemistry Unbound: Designing a New Four-Year Undergraduate Curriculum, J. Chem. Educ., 96(1), 35–46.
  74. Mertens D. M., (2015), Research and Evaluation in Education and Psychology, 4th edn, SAGE Publications, Inc.
  75. Muntinga M. E., Krajenbrink V. Q., Peerdeman S. M., Croiset G., and Verdonk P., (2016), Toward diversity-responsive medical education: taking an intersectionality-based approach to a curriculum evaluation, Adv. Health Sci. Educ., 21, 541–559.
  76. National Academy of Sciences, National Academy of Engineering, and Institute of Medicine, (2010), Rising Above the Gathering Storm, Revisited, National Academies Press.
  77. National Research Council (NRC), (2004), in Confrey J. and Stohl V. (ed.), On Evaluating Curricular Effectiveness: Judging the Quality of K-12 Mathematics Evaluations, The National Academic Press.
  78. National Research Council (NRC), (2012), A Framework for K-12 Science Education.
  79. National Research Council (NRC), (2015), Next Generation Science Standards, http://nextgenscience.org.
  80. Natural Sciences and Engineering Research Council of Canada, (2020), Dimensions: equity, diversity, and inclusion Canada, https://www.nserc-crsng.gc.ca/NSERC-CRSNG/EDI-EDI/Dimensions_Dimensions_eng.asp.
  81. Natural Sciences and Engineering Research Council of Canada, (2020), Unconscious bias training module, Equity, Divers. Incl, https://www.nserc-crsng.gc.ca/NSERC-CRSNG/EDI-EDI/Dimensions_Dimensions_eng.asp.
  82. O’Connor E. K., Roy K., and Flynn A. B., (2020), Growth & Goals: a course-integrated open education resource to help students increase learning skills, Can. J. Scholarsh. Teach. Learn., submitted.
  83. O’Connor E. K., Roy K., and Flynn A. B., (2018), Structure, pilot, and evaluation of a new self-regulated learning, growth mindset, and metacognition module that is integrated in postsecondary courses in any level and discipline.
  84. Ontario Universities Council on Quality Assurance, (2019), Appendix 1: OCAV's Undergraduate and Graduate Degree Level Expectations—Ontario Universities Council on Quality Assurance, Qual. Assur. Framew, http://oucqa.ca/framework/appendix-1/.
  85. Orgill M. K., York S., and Mackellar J., (2019), Introduction to Systems Thinking for the Chemistry Education Community, J. Chem. Educ., 96(12), 2720–2729.
  86. Pazicni S. and Flynn A. B., (2019), Systems Thinking in Chemistry Education: Theoretical challenges and opportunities, J. Chem. Educ., 96(12), 2752–2763.
  87. Plaza C. M., Draugalis J. R., Slack M. K., Skrepnek G. H., and Sauer K. A., (2007), Curriculum mapping in program assessment and evaluation, Am. J. Pharm. Educ., 71(2), 20.
  88. Posner G. J., Strike K. A., Hewson P. W., and Gertzog W. A., (1982), Accommodation of a scientific conception: Toward a theory of conceptual change, Sci. Educ., 66(2), 211–227.
  89. Praslova L., (2010), Adaptation of Kirkpatrick's four level model of training criteria to assessment of learning outcomes and program evaluation in Higher Education, Educ. Assess., Eval. Acc., 22(3), 215–225.
  90. Pugh S. L., (2019), A Longitudinal View of Students’ Perspectives on Their Professional and Career Development, Through Optional Business Skills for Chemists Modules, During Their Chemistry Degree Programme, Research and Practice in Chemistry Education, Springer, Singapore, pp. 167–183.
  91. Queen's University E. O., Diversity and Equity Assessment and Planning (DEAP) Tool.
  92. Reid J. and Wilkes J., (2016), Developing and applying quantitative skills maps for STEM curricula, with a focus on different modes of learning, Int. J. Math. Educ. Sci. Technol., 47(6), 837–852.
  93. Reingold I. D., (2001), Bioorganic First: A New Model for the College Chemistry Curriculum, J. Chem. Educ., 78(7), 869.
  94. Richmond B., (1993), Systems thinking: critical thinking skills for the 1990s and beyond, Syst. Dynam. Rev., 9, 113–133.
  95. Richmond B., (1997), The “Thinking” in Systems Thinking: How Can We Make It Easier to Master? Syst. Thinker, 8(2), 1–5.
  96. Robley W., Whittle S., and Murdoch-Eaton D., (2005), Mapping generic skills curricula: outcomes and discussion, J. Furth. High. Educ., 29(4), 321–330.
  97. Royal Australian Chemical Institute, (2020), University Course Guide, Univ. Course Guide.
  98. Sadler T. D. and Zeidler D. L., (2005), The significance of content knowledge for informal reasoning regarding socioscientific issues: applying genetics knowledge to genetic engineering issues, Sci. Educ., 89(1), 71–93.
  99. Saini A., (2017), Inferior: how science got women wrong – and the new research that's rewriting the story, Beacon Press.
  100. Schultz M., O’Brien G., Schmid S., Lawrie G. A., Southam D. C., Priest S. J., et al., (2019), Improving the Assessment of Transferable Skills in Chemistry Through Evaluation of Current Practice, Research and Practice in Chemistry Education, Springer, Singapore, pp. 255–274.
  101. Science and Engineering Leadership Initiative (SELI), U. of D., (2019), Resources for Students with Disabilities in STEM Fields.
  102. Shultz G. V. and Gere A. R., (2015), Writing-to-Learn the Nature of Science in the Context of the Lewis Dot Structure Model, J. Chem. Educ., 92(8), 1325–1329.
  103. Smith M. K., Jones F. H. M., Gilbert S. L., and Wieman C. E., (2013), The Classroom Observation Protocol for Undergraduate STEM (COPUS): a new instrument to characterize university STEM classroom practices, CBE Life Sci. Educ., 12(4), 618–27.
  104. Spencer D., Riddle M., and Knewstubb B., (2012), Curriculum mapping to embed graduate capabilities, High. Educ. Res. Dev., 31(2), 217–231.
  105. Stains M., Harshman J., Barker M. K., Chasteen S. V., Cole R., DeChenne-Peters S. E., et al., (2018), Anatomy of STEM teaching in North American universities, Science, 359(6383), 1468–1470.
  106. Stake R. E., (1967), The Countenance of Educational Evaluation, Teach. Coll. Rec., 68, 523–540.
  107. Stoyanovich C., Gandhi A., and Flynn A. B., (2015), Acid–Base Learning Outcomes for Students in an Introductory Organic Chemistry Course, J. Chem. Educ., 92(2), 220–229.
  108. Stufflebeam D. L., (1983), The CIPP [Context Input Progress Product] model for program evaluation BT – Evaluation models, Evaluation models, Kluwer-Nijhoff.
  109. Sumsion J. and Goodfellow J., (2004), Identifying generic skills through curriculum mapping: a critical evaluation, High. Educ. Res. Dev., 23(3), 329–346.
  110. Talanquer V., (2018), Controlling intuition, The CERG Webinar Series #CERGiner.
  111. Talanquer V. and Pollard J., (2010), Let's teach how we think instead of what we know, Chem. Educ. Res. Pract., 11(2), 74–83.
  112. Talanquer V. and Pollard J., (2017), Reforming a Large Foundational Course: Successes and Challenges, J. Chem. Educ., 94(12), 1844–1851.
  113. Tertiary Education Quality and Standards Agency, Guidance Note: Course Design (including Learning Outcomes and Assessment), Guid. note Course Des.
  114. Tertiary Education Quality and Standards Agency, (2015), Higher Education Standards Framework.
  115. TRC, (2012), Truth and Reconciliation Commission of Canada: Calls to Action.
  116. Trygstad P. J., Banilower E. R., and Pasley J. D., (2016), Operationalizing the Science and Engineering Practices, Horizons Research, Inc.
  117. Uchiyama K. P. and Radin J. L., (2009), Curriculum Mapping in Higher Education: A Vehicle for Collaboration, Innov. High. Educ., 33(4), 271–280.
  118. Underwood S. M., Reyes-Gastelum D., and Cooper M. M., (2016), When do students recognize relationships between molecular structure and properties? A longitudinal comparison of the impact of traditional and transformed curricula, Chem. Educ. Res. Pract., 17, 365–380.
  119. Underwood S. M., Posey L. A., Herrington D. G., Carmel J. H., and Cooper M. M., (2018), Adapting Assessment Tasks To Support Three-Dimensional Learning, J. Chem. Educ., 95(2), 207–217.
  120. United Nations, (2015), Sustainable Development Goals, http://un.org/sustainabledevelopment.
  121. Van den Akker J., Gravemeijer K., McKenney S., and Nieveen N., (2006), Education Design Research, Routledge.
  122. Vázquez A. V., McLoughlin K., Sabbagh M., Runkle A. C., Simon J., Coppola B. P., and Pazicni S., (2012), Writing-To-Teach: A New Pedagogical Approach To Elicit Explanative Writing from Undergraduate Chemistry Students, J. Chem. Educ., 89(8), 1025–1031.
  123. Wang C.-L., (2015), Mapping or tracing? Rethinking curriculum mapping in higher education, Stud. High. Educ., 40(9), 1550–1559.
  124. Webb C., Dahl D., Pesterfield L., Lovell D., Zhang R., Ballard S., and Kellie S., (2013), Modeling Collaboration and Partnership in a Program Integrating NMR across the Chemistry Curriculum at a University and a Community and Technical College, J. Chem. Educ., 90(7), 873–876.
  125. Webber D. M. and Flynn A. B., (2018), How Are Students Solving Familiar and Unfamiliar Organic Chemistry Mechanism Questions in a New Curriculum? J. Chem. Educ., 95(9), 1451–1467.
  126. Wiggins G. P. and McTighe J., (2005), in McTighe J. (ed.), Understanding by design, Expanded 2, Association for Supervision and Curriculum Development.
  127. Williams L. C., Underwood S. M., Klymkowsky M. W., and Cooper M. M., (2015), Are Noncovalent Interactions an Achilles Heel in Chemistry Education? A Comparison of Instructional Approaches, J. Chem. Educ., 92(12), 1979–1987.
  128. World Economic Forum, (2016), The Future of Jobs: Employment, Skills and Workforce Strategy for the Fourth Industrial Revolution.
  129. World Economic Forum: Centre for the New Economy and Society, (2018), The Future of Jobs Report 2018 Insight Report Centre for the New Economy and Society.
  130. Yurtseven M. K., (2016), Decision Making And Systems Thinking: Educational Issues.
  131. Zeidler D. L., Sadler T. D., Applebaum S., and Callahan B. E., (2009), Advancing reflective judgment through socioscientific issues, J. Res. Sci. Teach., 46(1), 74–101.
  132. Zimmerman B. J., (1990), Self-Regulated Learning and Academic Achievement: An Overview, Educ. Psychol., 25(1), 3–17.
  133. Zimmerman B. J., (2002), Becoming a Self-Regulated Learner: An Overview, Theory Pract., 41(2), 64–70.
