Open Access Article
This Open Access Article is licensed under a
Creative Commons Attribution 3.0 Unported Licence

Physical chemistry students’ learning profiles and their relation to study-related burnout and perceptions of peer and self-assessment

Lauri J. Partanen *a, Liisa Myyry b and Henna Asikainen b
aSchool of Chemical Engineering, Department of Chemistry and Materials Science, Aalto University, Espoo, Finland. E-mail: lauri.partanen@aalto.fi
bFaculty of Educational Sciences, Centre for University Teaching and Learning, University of Helsinki, Helsinki, Finland

Received 7th July 2023, Accepted 5th December 2023

First published on 7th December 2023


Abstract

We explored chemical engineering students’ approaches to learning, study-related burnout, and perceptions of peer and self-assessment in a challenging physical chemistry thermodynamics course. Cluster analysis revealed three learning profiles based on students’ approaches to learning: students who scored high in both organised studying and the deep approach to learning, students who scored high in the unreflective approach to learning, and students who scored high in all three approaches. According to our findings, students who employed deep learning strategies and managed their time carefully experienced the least study-related burnout. These students also felt more efficacious when participating in assessment and had fewer negative experiences of both peer and self-assessment. Consequently, physical chemistry educators should adopt practices that facilitate a deeper approach to learning, including paying careful attention to course workload and utilising teaching methodologies, such as peer and self-assessment, that can foster the deep approach.


Introduction

Physical chemistry is often considered a challenging field of study among university chemistry students (Mammino, 2009). This is partly because a solid grasp of physicochemical concepts requires proficiency in mathematics (Tsaparlis, 2001), even though the transfer of mathematical skills into applied subjects is not straightforward (Hadfield and Wieman, 2010; Becker and Towns, 2012). Furthermore, success in physical chemistry is not solely dependent on mathematical ability (Derrick and Derrick, 2002; Tsaparlis, 2007), as logical thinking and conceptual understanding are crucial in mastering physicochemical topics. Indeed, past research has documented a prevalence of alternative conceptions (Tsaparlis, 2007; Tsaparlis and Papaphotis, 2009; Bain et al., 2014; Bain and Towns, 2016) that stem from previous instruction, or from the commonplace use of concepts like temperature and work (van Roon et al., 1994; Thomas and Schwenz, 1998; Loverude et al., 2002; Kautz et al., 2005).

Apart from the abstract and mathematical nature of concepts, physical chemistry also poses challenges due to factors such as instructor-centred pedagogical approaches, excessive course content, limited resources, and waning student motivation (Sözbilir, 2004). Consequently, many students enter physical chemistry courses with negative perceptions and low expectations of success (Nicoll and Francisco, 2001; Partanen, 2016). This is particularly concerning given that study-related burnout is prevalent among university students (Salmela-Aro and Read, 2017), including chemical engineers (Gomez Jimenez et al., 2021), and that courses perceived as excessively difficult can exacerbate exhaustion (Maslach et al., 2001).

Research has shown that the way students approach learning impacts their likelihood of experiencing burnout. For example, burnout is more prevalent among students relying on the surface approach towards learning (Asikainen et al., 2022). This approach emphasises memorising and leaves students struggling with a fragmented knowledge base (Entwistle and Ramsden, 1983; Asikainen et al., 2013), putting them at risk of either changing their major or dropping out completely (Lastusaari et al., 2019). Meanwhile, students employing the deep approach focus on understanding and using meaningful learning strategies. The deep approach is connected with less study-related burnout (Asikainen et al., 2020).

In addition to students’ approaches to learning, their perceptions of the teaching–learning environment impact study-related burnout (Meriläinen, 2014; Meriläinen and Kuittinen, 2014). Indeed, recent research has connected chemical engineering students’ perceptions of peer and self-assessment in a thermodynamics course with both their learning approaches and study-related burnout (Partanen et al., 2023). In self-assessment, learners are involved in judging their own learning, particularly their learning outcomes or achievements (Boud and Falchikov, 1989), whereas in peer assessment students evaluate peers of similar status (Topping, 1998). In the literature, peer and self-assessment have been associated with several benefits that are relevant in the physical chemistry context: on the one hand, peer assessment can enhance achievement, attitudes, and higher order thinking skills, such as critical thinking and problem-solving (Topping, 1998; Huisman et al., 2019; Hadzhikoleva et al., 2019), which are key competencies in physical chemistry. On the other hand, self-assessment has been shown to increase metacognitive knowledge (Mok et al., 2006) and to help students take more responsibility for their learning (Häsä et al., 2018). Specifically in physical chemistry, learning modules that include peer and self-assessment have been associated with enhanced learning outcomes (Partanen, 2018; Partanen, 2023) as well as improved attitudes (Partanen, 2020; Partanen, 2023).

Past studies have looked at different aspects of peer and self-assessment in the chemistry context such as its validity (Tsaparlis et al., 1999; Tashiro et al., 2021), implementation (Wenzel, 2007), and the ability to facilitate metacognition (Casselman and Atwood, 2017). Peer review has also been broadly used in the context of writing tasks, particularly laboratory reports (Margerum et al., 2007; Berry and Fawkes, 2010; Zwicky and Hands, 2016; Basso, 2020; Piccinno et al., 2023). However, there has been less research on students’ perceptions of peer and self-assessment. Even though student perceptions of the teaching–learning environment are known to be connected both to the learning approaches students employ (Richardson, 2005; Parpala et al., 2010; Herrmann et al., 2017) and the burnout they experience (Meriläinen, 2014; Meriläinen and Kuittinen, 2014), these connections have been little investigated at the course level. Thus, there is a lack of knowledge on how specific teaching practices like peer and self-assessment relate to students’ approaches to learning and especially to study-related burnout. In this study, we seek to understand the interplay between physical chemistry students’ learning approaches, study-related burnout, and their perceptions of peer and self-assessment in the context of a thermodynamics course at Aalto University in Finland. This course is widely considered challenging, making it a potential contributor to study-related burnout.

Literature review

Approaches to learning

University students have different strategies and aims in learning. One way to conceptualise these different ways of studying and learning is through the Students’ approaches to learning (SAL) framework (Entwistle and Ramsden, 1983; Lonka et al., 2004; Asikainen and Gijbels, 2017). At the core of this framework are the deep and surface approaches to learning (Entwistle and Ramsden, 1983; Marton and Säljö, 1984). The deep approach emphasises learning aimed at understanding, combined with the use of meaningful learning strategies, relating ideas to previous knowledge, and applying critical thinking. Meanwhile, the surface approach emphasises memorising, which often leaves the student with a fragmented knowledge base (Entwistle and Ramsden, 1983; Asikainen et al., 2013). Recent studies have further highlighted the lack of reflection in this approach, with calls for it to be renamed the unreflective approach (Lindblom-Ylänne et al., 2019). This is also the term we will adopt in this article. For example, when reading a text, a student who is applying the deep approach to learning concentrates on the meaning of the learned text, wonders how it relates to their previous knowledge and evaluates different viewpoints presented in the text. Meanwhile, a student who is relying on the unreflective approach tries to memorise the text as it is written and does not reflect on their learning or try to connect it to previous knowledge. Both approaches have been identified in the chemical engineering context (Jumari et al., 2017). In addition to these two, a third approach that focuses on time management skills and effort has been recognised. It has been named organised studying (Entwistle and McCune, 2004).

A student's approach to learning can vary from one context to another (Richardson, 2011). Which approach is adopted is influenced by both the individual's learning orientation and their perception of task requirements. For example, when facing a heavy workload or threatening situations, individuals are more likely to adopt the unreflective approach (Tait and Entwistle, 1996; Kyndt et al., 2011). Research also indicates that the deep and organised approaches are related to positive perceptions of the teaching–learning environment, while the unreflective approach is connected to more negative ones (Richardson, 2005; Parpala et al., 2010; Kyndt et al., 2011; Herrmann et al., 2017).

One important way that a teacher can influence students’ learning approaches is through assessment (Rust et al., 2005; Struyven et al., 2005). For example, repetitive assessment methods are linked to the unreflective approach while methods that require deeper understanding, such as essays, correlate with the deep approach (Struyven et al., 2005). Likewise, assessments that measure real-life competencies seem to motivate students to adopt deep learning strategies (Gulikers et al., 2006). The interaction between the teaching–learning environment and students’ learning approaches is bidirectional: for instance, students tend to prefer examinations that align with their preferred learning approach (Struyven et al., 2005) with students applying a deep approach favouring challenging courses (Halme et al., 2021). Studies have also found that the deep and organised approaches are associated with more positive perceptions of assessment's role in supporting learning, compared to the unreflective approach (Parpala et al., 2010). However, contrary results by Gijbels et al. (2008) associate assessment that supports learning with an increase in the unreflective approach.

Research has revealed that students tend to cluster into different combinations regarding their approaches to learning. For example, Parpala et al. (2010) identified four distinct profiles among university students from various disciplines: (1) organised students, (2) students applying the deep approach, (3) students applying the unreflective approach, and (4) unorganised students applying the deep approach, i.e., students who scored high in the deep approach but low in organised studying. These profiles have been found in later studies as well (Salmisto et al., 2016; Haarala-Muhonen et al., 2017; Asikainen et al., 2020), while different combinations have emerged in others. For instance, a recent study by Asikainen and Katajavuori (2022) identified three profiles: (1) students employing the deep approach, (2) organised students, and (3) students embodying the unreflective approach. Meanwhile, another recent study discovered the following profiles: (1) unorganised and unreflective students, (2) deep and unorganised students, (3) students representing a deep approach, and (4) all-high students (Parpala et al., 2021). The paradoxical all-high profile refers to students with dissonant or seemingly contradictory study strategies. It has been suggested that these students have not yet had sufficient time to develop a differentiated studying strategy (Freyer and Vermunt, 2018). On the other hand, Lindblom-Ylänne (2003) suggested that these students may have difficulties in changing or developing their study strategies even when they know that their current strategies are inefficacious.

Study-related burnout

Prolonged stress experienced during studies can escalate into study-related burnout. Study-related burnout encompasses three main components: study-related exhaustion, cynicism, and inadequacy (Schaufeli et al., 2001; Salmela-Aro et al., 2009; Salmela-Aro and Kunttu, 2010). Study-related exhaustion refers to feelings of being overwhelmed by a heavy workload that negatively impacts other aspects of life such as sleep and close relationships. Cynicism involves a decline in interest and an indifferent attitude towards studying. Inadequacy entails feelings of incompetence and a sense of poor achievement in academic pursuits (Salmela-Aro et al., 2009). Problematically, study-related burnout has been negatively associated with academic achievement (Madigan and Curran, 2021), study engagement (Salmela-Aro and Read, 2017), intrinsic motivation aimed at personal growth, and the development of intellectual capacity (Hyytinen et al., 2022).

There is rising concern about burnout among university students. For example, Hyytinen et al. (2022) revealed that as many as 30% of first-year students are already at risk of study-related burnout. Other studies have shown that while 7% of students in higher education were at severe burnout risk, 30% were classified as simultaneously engaged and exhausted (Salmela-Aro and Read, 2017). Burnout risk has also risen to a worrying level among chemical engineering students (Gomez Jimenez et al., 2021).

Students’ approaches to learning have been linked to wellbeing, as students’ perceptions of study workload interact with their processes of understanding (Hailikari et al., 2018). Notably, a perceived heavy workload can contribute to stress and ultimately lead to symptoms of exhaustion, which are characteristic of burnout (Maslach et al., 2001). Specifically, the unreflective approach is positively associated with study-related burnout (Asikainen et al., 2020; 2022), together with perceptions of workload in studies (Kyndt et al., 2011). Different learning profiles among students appear to further influence study-related burnout: unreflective learners, who exhibit high scores in the unreflective approach and relatively low scores in the deep approach and organised studying, tend to report higher levels of exhaustion, inadequacy, and cynicism. In contrast, students employing the deep approach tend to experience lower levels of exhaustion and inadequacy compared to both organised students and unorganised students applying a deep approach (Asikainen et al., 2020). This indicates that students lacking reflection practices are more susceptible to experiencing burnout. Furthermore, students with poor time management skills tend to experience more stress, exhaustion, and less interest in their studies (Heikkilä et al., 2012).

Perceptions of peer and self-assessment

Recently, Partanen et al. (2023) developed the Perceptions of peer and self-assessment (PPSA) instrument. The PPSA measures student perceptions of peer and self-assessment on three subscales: assessment as supporting learning, self-efficacy in assessment, and negative experiences of peer and self-assessment. Research indicates that students generally value engaging with the assessment process, and that their perceptions of peer and self-assessment tend to be positive (Hanrahan and Isaacs 2001; Wen and Tsai 2006; Andrade and Du 2007; Vickerman 2009; Carvalho, 2013; Mulder et al., 2014a; Micán and Medina 2017; Wanner and Palmer 2018; Andersson and Weurlander 2019; Andrade 2019). However, as hinted by the three PPSA subscales, a wide range of distinct peer and self-assessment perceptions has been identified in the literature. On the positive side, students report that participation in peer and self-assessment enhances motivation (Hanrahan and Isaacs 2001; Van Hattum-Janssen and Lourenço 2008; Planas Lladó et al., 2014; Gholami 2016). They also feel that peer assessment deepens understanding of the assessment criteria (Boud and Holmes 1981; Ndoye 2017; Wanner and Palmer 2018), fosters a sense of fairness in the assessment process (Ion et al., 2023), and facilitates better learning (Asikainen et al., 2014; Mulder et al., 2014a; Mulder et al., 2014b; Moneypenny et al., 2018; Partanen 2020). Peer and self-assessment are further perceived as developing critical thinking skills (van Helvoort 2012; Siow 2015; Andersson and Weurlander 2019), while self-assessment helps students take ownership of their learning (Lopez and Kossack 2007; Bourke 2014; Ndoye 2017).

On the negative side, students can find peer assessment intimidating or time-consuming (Hanrahan and Isaacs 2001; Sluijsmans et al., 2001; Moneypenny et al., 2018) while others express dissatisfaction with the notion of taking on the teacher's work (Van Hattum-Janssen and Lourenço 2008; Willey and Gardner 2010). Distrust towards the validity and fairness of peer assessment is also commonplace (Van Zundert et al., 2010; Kaufman and Schunn 2011; Patton 2012; Carvalho 2013; Wanner and Palmer 2018; Andersson and Weurlander 2019) as students are concerned with the competence and impartiality of their peer-assessors (Willey and Gardner 2010; Kaufman and Schunn 2011; Carvalho 2013; Mulder et al., 2014b) or their own ability to assess (Van Hattum-Janssen and Lourenço 2008). Especially when the evaluation criteria are unclear, peer and self-assessment can be frustrating and difficult (Hanrahan and Isaacs, 2001; Andrade and Du, 2007).

To date, the relationship between peer and self-assessment perceptions and wellbeing has received little research attention, but there are some indications of a potential connection. For one, there is the possible influence of peer support on wellbeing (John et al., 2018; Drysdale et al., 2022), suggesting that engaging in formative feedback and discussions with peers could enhance students' wellbeing. Self-assessment has also been specifically connected to self-efficacy (Panadero et al., 2017; Nieminen et al., 2021) which, in turn, is associated with wellbeing. Therefore, it can be hypothesised that self-assessment may have a positive impact on students' wellbeing. With regard to learning approaches, there is evidence that the teaching–learning environment, and specifically assessment practices, can impact students’ approaches to learning, as discussed before. However, few studies have looked specifically at peer or self-assessment, although there is some evidence that self-grading can promote the deep approach (Nieminen et al., 2021) and that peer assessment can promote deeper learning (Asikainen et al., 2014). There is also recent correlational evidence that feelings of self-efficacy in self- and peer assessment are related to the deep approach, while negative perceptions of peer and self-assessment correlate with the unreflective approach (Partanen et al., 2023).

Research questions

Our goal is to elucidate the connections between physical chemistry students’ learning approaches, study-related burnout, and perceptions of peer and self-assessment within the context of a challenging physical chemistry course. As evidenced by the literature review, a nuanced understanding of the impact of learning approaches requires a focus on learning profiles, i.e., clusters of the different learning approaches. Consequently, in this study we employ cluster analysis to investigate how students with dissimilar learning profiles differ in their peer and self-assessment perceptions and study-related burnout. Our research questions are:

(1) What learning profiles are present among chemical engineering students in a thermodynamics course?

(2) How are the learning profiles related to student experiences of peer and self-assessment and study-related burnout?

Methods

Course background

The data were collected from a bachelor-level chemical thermodynamics course organised between 2019 and 2021 at Aalto University in Finland. The course is an obligatory part of the chemical engineering curriculum.

During the course, peer and self-assessment were used for six sets of course problems, and these constituted approximately 10% of the overall course grade. The peer and self-assessment scheme shared commonalities with the peer assessment learning sessions proposed by O'Moore and Baldock (2007): the deadlines for each problem set were communicated to the students at the beginning of the course, while the problems themselves were made available at least one week before the deadline. During the period leading up to the deadline, students had the opportunity to attend up to three or four walk-in study halls where they could collaborate with their peers to solve the problems. Teaching staff were also present in these study halls to offer assistance and guidance. Each problem set typically consisted of 2–3 problems, whose subtasks spanned the second and third knowledge categories and the third through fifth cognitive processing categories of the Revised Bloom's taxonomy (Anderson and Krathwohl 2001).

The students submitted their solutions digitally through the course's online learning platform, where they also performed the peer and self-assessment. Each student was responsible for assessing their own solutions and those of two anonymous peers. To facilitate the assessment process, clear instructions, an assessment rubric, and model solutions were provided. Two sample subtasks and their assessment rubrics are provided in Appendix 1. The assessment criteria and practices were determined solely by the course instructor, without input from the students.

The peer and self-assessments included both grading and a formative component: along with assigning numerical marks, students were required to write feedback and justifications for their evaluations in an open response field. For instance, they had to explain any deductions they made based on the assessment rubric. While the presence of text in the open response fields was automatically checked, the content and quality of the student feedback were not examined.

The final mark for each student was calculated as an average of the three assessments they received. When the minimum and maximum marks differed significantly, a course instructor assessed the solutions. Students received credit based on both the quantity and quality of the numerical assessments they provided. The course instructor offered general feedback to the students, highlighting trends and observations based on the solutions submitted. While peer and self-assessment are used in certain courses at the university, they are still considered relatively novel and uncommon teaching methods.
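The averaging and moderation logic described above can be sketched in a few lines of Python. This is an illustration only: the `max_spread` threshold and the exact rule for routing a submission to the instructor are hypothetical, since the course's criterion for a "significant" difference between marks is not specified in the text.

```python
def aggregate_mark(marks, max_spread=2.0):
    """Average the assessments a student received (self plus two peers),
    and flag the submission for instructor review when the minimum and
    maximum marks differ by more than a threshold.

    The threshold value is a hypothetical illustration; the course's
    actual moderation criterion is not specified.
    """
    spread = max(marks) - min(marks)
    final_mark = sum(marks) / len(marks)
    return final_mark, spread > max_spread
```

For example, `aggregate_mark([4, 5, 5])` averages to about 4.67 without triggering review, while `aggregate_mark([1, 5, 5])` would be routed to the instructor under this sketch's threshold.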

Participants

The primary data set was gathered from an end-of-course survey at the 2020 and 2021 Chemical thermodynamics courses at Aalto University. In addition, results from an end-of-course survey in 2019 were used as a test set to verify the clustering of the 2020–2021 primary data set.

In 2021, 139 students initially enrolled in the course. Of the 127 students who did not drop out, 106 (83%) responded to the end-of-course survey. In 2020, 155 students initially enrolled, of whom 149 did not drop out. Of these, 124 (83%) responded to the end-of-course survey and provided research permission. In 2019, the corresponding numbers were 149, 119, and 114 (96%). Most (78%) of the participants in this study were in their second year, and about half (55%) were female. The mean age was 21.6 years, with a range of 19–43 years.

Measures

The surveys in 2020 and 2021 included measures for learning approaches, study-related burnout, and student perceptions of peer and self-assessment. In contrast, the 2019 survey included only the learning approach measure. The three measures used in the primary data set are summarised below. Example items are provided in parentheses. The individual items for each measure are also available in Appendix 2. All measures utilised a Likert scale ranging from 1 (completely disagree) to 5 (completely agree). The end-of-course surveys also contained components that were not relevant to this study.
Learning approaches. The learning approaches were gauged with a 12-item HowUlearn questionnaire (Parpala and Lindblom-Ylänne 2012). The questionnaire encompassed three dimensions: deep approach (“I have carefully looked for evidence to reach my own conclusions about what I have been studying during the course.”), unreflective approach (“Much of what I have learned seems nothing more than unrelated bits and pieces.”) and organised studying (“I organised my study time carefully to make the best use of it.”). It should be noted that a questionnaire on approaches to learning specific to the chemistry context exists (Lastusaari et al., 2016; Lastusaari et al., 2019). This questionnaire divides the approaches into more precise dimensions, including practical approaches to laboratory learning. We chose to employ the general measure because the laboratory context was not relevant for this study. In addition, we wanted our results to be comparable with earlier research on learning profiles.
Study-related burnout. An instrument developed by Salmela-Aro et al. (2009) was employed for the study-related burnout. The instrument consisted of nine items making up three subscales: exhaustion (“I feel overwhelmed by the work related to my studies.”), cynicism (“I feel a lack of study motivation and often think of giving up.”), and inadequacy (“I often have feelings of inadequacy in my studies.”).
Perceptions of peer and self-assessment. Students’ perceptions of peer and self-assessment were measured with the third version of the PPSA instrument (Partanen et al., 2023). The questionnaire contained 15 items for both peer and self-assessment, which were divided into three analogous factors for peer and self-assessment using exploratory factor analysis in our previous study (Partanen et al., 2023). For peer assessment perceptions, the three factors were peer assessment as supporting learning (“Peer assessment supported my learning in the course”), negative experiences of peer assessment (“Peer assessment increased the workload too much”), and self-efficacy in peer assessment (“I was able to assess the performance of my peers with the provided assessment rubrics and model solutions”). In the following, the acronyms PSL, PNE, and PSE are used for these factors, respectively, while SSL, SNE, and SSE denote the corresponding self-assessment factors.
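All three instruments are built from Likert-scored subscales, and scores for such subscales are conventionally computed as the mean of the constituent items. The sketch below illustrates this convention with hypothetical item identifiers; the groupings are placeholders, not the instruments' actual item-to-subscale keys (those are listed in Appendix 2).

```python
# Hypothetical item groupings for a nine-item, three-subscale
# instrument; the real mapping is given in Appendix 2.
SUBSCALES = {
    "exhaustion": ["item1", "item4", "item7"],
    "cynicism":   ["item2", "item5", "item8"],
    "inadequacy": ["item3", "item6", "item9"],
}

def subscale_scores(responses, subscales=SUBSCALES):
    """Compute each subscale score as the mean of its Likert items.

    responses: mapping from item id to a rating on the 1-5 scale.
    """
    return {name: sum(responses[item] for item in items) / len(items)
            for name, items in subscales.items()}
```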

Analysis

Statistical analyses were carried out using both SPSS and R. First, we conducted confirmatory factor analyses on the scales measuring learning approaches and study-related burnout. Then, we used the learning approach data to create cluster profiles. Clustering was done using the k-means algorithm with a Euclidean distance measure. The number and structure of clusters were determined using the NbClust (Charrad et al., 2014) and cluster packages in R, while the AMOS program was used for the confirmatory factor analyses. There was no evidence of response sets in the data.
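The clustering itself was performed in R; purely as an illustration of the k-means step with a Euclidean distance measure, a minimal standard-library Python analogue might look like the following. The three-dimensional toy points in the usage example stand in for students' (unreflective, deep, organised) scores.

```python
import math
import random

def kmeans(points, k, iters=100, seed=0):
    """Minimal k-means with Euclidean distance.

    Illustrates only the clustering step; the actual analysis used
    the k-means implementation in R with the NbClust and cluster
    packages. Returns (centroids, labels).
    """
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialise from the data
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            dists = [math.dist(p, c) for c in centroids]
            clusters[dists.index(min(dists))].append(p)
        # Recompute centroids as cluster means (keep empty ones put).
        new_centroids = []
        for old, members in zip(centroids, clusters):
            if members:
                new_centroids.append(tuple(sum(coord) / len(members)
                                           for coord in zip(*members)))
            else:
                new_centroids.append(tuple(old))
        if new_centroids == centroids:  # converged
            break
        centroids = new_centroids
    labels = [min(range(k), key=lambda j, p=p: math.dist(p, centroids[j]))
              for p in points]
    return centroids, labels

# Toy (unreflective, deep, organised) scores: two clear groups.
scores = [(3.8, 2.5, 2.4), (3.9, 2.6, 2.3),
          (2.0, 4.1, 4.0), (2.1, 4.0, 3.9)]
cents, labels = kmeans(scores, 2)
```

In practice one would standardise the subscale scores first and rerun the algorithm from several random initialisations; both refinements are omitted here for brevity.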

Ethical considerations

Based on national guidelines from the Finnish Advisory Board on Research Integrity, the research did not need a prior ethical review statement from the ethical board as all participants were legally competent adults, and the study did not utilise sensitive personal information, risk the physical integrity of the participants, or cause mental harm or threats of safety beyond everyday life. Informed consent was provided at the beginning of the course each year through an online form containing a description of the research, data management practices and an assurance that individual responses would not be identified from the data. The course homepage further contained detailed information on the research and the full privacy statement.

Since the course instructor was part of the research team, careful steps were taken to avoid any perception that the students should participate in the research because of the instructor's involvement. While participants received credit worth approximately 1% of the total course marks for responding to the end-of-course survey, it was emphasised that they would receive this credit irrespective of whether they provided research permission. The participants were informed that they could revoke their research permission at any time with no detrimental effects on their course performance, and that their responses would have no effect on the course grade. There were also additional tasks that the students could complete during the course to make up for the credit lost by declining to answer the survey.

Results

Descriptive statistics

Confirmatory factor analyses were conducted on the scales measuring approaches to learning and study-related burnout. The three-factor solution for approaches to learning with the deep, unreflective and organised scales resulted in an acceptable model (χ2 = 116.590, p < 0.001, CFI = 0.928, RMSEA = 0.075). Similarly, for study-related burnout a model with three factors (inadequacy, cynicism, and exhaustion) resulted in an acceptable fit (χ2 = 44.931, p = 0.006, CFI = 0.981, RMSEA = 0.062). The PPSA was not investigated this way because the primary data set contained participants from the original exploratory factor analysis used to design the questionnaire (Partanen et al., 2023).

Table 1 presents the descriptive statistics of the variables in this study for the primary 230-student sample. It also includes the correlation coefficients between the variables. Only coefficients that were statistically significant at the p < 0.05 level are displayed. As expected, the correlations among variables within the same instrument, such as learning approaches, were generally stronger than those between variables from different instruments. Specifically, correlations above 0.45 or below −0.45 were observed between the subscales of the study-related burnout instrument, as well as between the deep and organised approaches to learning. Regarding the peer and self-assessment perception subscales, such correlations were found between the learning enhancing experiences of peer and self-assessment (PSL and SSL), the negative experiences of peer and self-assessment (PNE and SNE), the peer and self-assessment self-efficacies (PSE and SSE), peer assessment self-efficacy and learning enhancing experiences of peer assessment (PSE and PSL), and self-assessment self-efficacy and the learning enhancing experiences of self-assessment (SSE and SSL). Notably, inter-instrument correlations of this magnitude were only found between the unreflective approach to learning and the exhaustion and inadequacy aspects of study-related burnout.

Table 1 Mean values and standard deviations for the studied variables together with the statistically significant (p < 0.05) correlation coefficients for the primary N = 230 sample
Mean (sd) U D O EXH CYN INA PSL PNE PSE SSL SNE
PSL = peer assessment as supporting learning-factor, PNE = negative experiences of peer assessment-factor, PSE = self-efficacy in peer assessment-factor, SSL = self-assessment as supporting learning-factor, SNE = negative experiences of self-assessment-factor, SSE = self-efficacy in self-assessment-factor.
Unreflective (U) 3.27 (0.77)
Deep (D) 3.21 (0.66) −0.20
Organised (O) 2.85 (0.81) −0.17 0.46
Exhaustion (EXH) 2.92 (1.00) 0.48
Cynicism (CYN) 2.29 (1.09) 0.37 −0.29 −0.31 0.45
Inadequacy (INA) 3.25 (1.10) 0.53 −0.13 −0.24 0.66 0.65
PSL 3.08 (0.83) 0.15 0.15
PNE 2.33 (0.69) 0.22 0.19 0.21 0.18 −0.32
PSE 3.84 (0.73) −0.16 0.19 0.46 −0.44
SSL 3.22 (0.76) 0.17 0.17 0.77 −0.28 0.40
SNE 2.40 (0.73) 0.32 −0.15 0.18 0.19 0.21 −0.29 0.63 −0.29 −0.33
SSE 3.84 (0.70) −0.23 0.19 −0.15 −0.14 0.44 −0.42 0.75 0.49 −0.41
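The significance filtering used for Table 1 can be sketched as follows. This snippet uses synthetic, purely illustrative data (not the study's responses; the scale means and spreads only loosely mimic the reported ones) and keeps only pairwise Pearson coefficients with p < 0.05:

```python
import numpy as np
from scipy import stats

# Hypothetical illustration: display only those pairwise Pearson
# correlations that are statistically significant at p < 0.05,
# mirroring how Table 1 is filtered.
rng = np.random.default_rng(0)
n = 230  # primary sample size reported in the study
deep = rng.normal(3.2, 0.7, n)
organised = 0.5 * deep + rng.normal(1.3, 0.6, n)  # built to correlate with 'deep'
unreflective = rng.normal(3.3, 0.8, n)            # independent by construction

data = {"Deep": deep, "Organised": organised, "Unreflective": unreflective}
names = list(data)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        r, p = stats.pearsonr(data[a], data[b])
        if p < 0.05:  # non-significant coefficients are suppressed
            print(f"{a} vs {b}: r = {r:.2f}")
```

With these synthetic variables, only the deliberately constructed deep–organised correlation survives the filter reliably; any other printed pair would be a chance finding, which is exactly the risk the p < 0.05 cutoff controls at the 5% level per test.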


Cluster determination

We employed multiple cluster indices to determine the optimal number of clusters in our analysis. Initially, we utilised the learning approach data from our test set of 114 students from 2019. Based on the evaluation of 30 clustering indices within the NbClust R-package (Charrad et al., 2014), the 2- and 3-cluster solutions received support from 7 and 8 indices, respectively. Solutions with higher numbers of clusters were always supported by fewer than 3 indices.

For the primary research data, Table 2 presents the cluster indices and the corresponding optimal numbers of clusters. Consistent with the findings from the test data, the 2- and 3-cluster solutions were predominant, with the 2-cluster solution receiving support from 10 indices and the 3-cluster solution from 5. Again, solutions with a greater number of clusters were supported by fewer than 3 indices. To further assess the stability of our cluster solution, we repeated the clustering using 4 different distance measures besides the Euclidean distance and 7 different clustering methods besides k-means, resulting in 40 different method-distance measure pairs. The 2- and 3-cluster solutions were consistently favoured, with most pairs preferring the 2-cluster solution.
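The index-based selection described above was performed with the NbClust R-package. As a rough sketch of the same idea, the snippet below (using Python and scikit-learn, which were not part of the study's toolchain, and synthetic three-profile data) compares k-means solutions using one such index, the average silhouette width:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Illustrative sketch: evaluate k-means solutions for k = 2..6 with the
# silhouette index, one of the 30 indices NbClust considers.
rng = np.random.default_rng(1)
# Three artificial "profiles" in (unreflective, deep, organised) space,
# loosely inspired by the cluster means in Table 3.
X = np.vstack([
    rng.normal([2.3, 3.7, 3.3], 0.4, (60, 3)),   # high deep and organised
    rng.normal([3.7, 3.3, 3.5], 0.4, (70, 3)),   # all high
    rng.normal([3.5, 2.8, 2.2], 0.4, (100, 3)),  # high unreflective
])

for k in range(2, 7):
    labels = KMeans(n_clusters=k, n_init=10, random_state=1).fit_predict(X)
    print(f"k = {k}: silhouette = {silhouette_score(X, labels):.3f}")
```

A higher average silhouette indicates better-separated, more cohesive clusters; in practice the different indices often disagree, which is why NbClust tallies the number of indices supporting each candidate solution.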

Table 2 Sample values for fit indices used to determine the number of clusters. For each index, a star marks the value that supports the 2- or 3-cluster solution
Indexa 2 Clusters 3 Clusters
a See Table 1 in Charrad et al. (2014) for references of the individual indices.
KL 11.8848* 0.1519
CH 112.5318* 107.533
McClain 0.6960* 1.1895
Silhouette 0.2886* 0.2821
Cindex 0.3293 0.3237*
CCC 31.7747* 20.7020
DB 1.4234 1.2763*
Ratkowsky 0.3911* 0.3907
PtBiserial 0.3879 0.4304*
Gamma 0.4557 0.5524*
SDindex 3.6302 2.9521*


Table 3 displays the cluster sizes, means and standard deviations for the different learning approaches from the primary data set. Analogous results were obtained with the test data. We see that both 2- and 3-cluster solutions have reasonable numbers of students in each cluster. The 2-cluster solution appears to contain one cluster where the students are high in both deep and organised approaches to learning and lower in the unreflective approach. In contrast, the second cluster consists of students high in the unreflective approach. The 3-cluster solution mainly splits the first of these clusters into two, resulting in a new “All high”-cluster as indicated in Table 3. Since the 3-cluster solution appears to contain more information with reasonable cluster sizes, we chose it for further analysis. Fig. 1 visualises the cluster means for the learning approaches.

Table 3 Mean values and standard deviations for the unreflective, deep and organised approaches to learning for the 2- and 3-cluster solutions, and the number of members in each cluster
High organised and deep All high High unreflective
2-Cluster
N 110 120
Unreflective 2.94 (0.81) 3.56 (0.59)
Deep 3.59 (0.51) 2.86 (0.59)
Organised 3.45 (0.57) 2.30 (0.58)
3-Cluster
N 61 69 100
Unreflective 2.33 (0.45) 3.72 (0.48) 3.52 (0.58)
Deep 3.70 (0.48) 3.32 (0.55) 2.84 (0.61)
Organised 3.26 (0.67) 3.47 (0.47) 2.16 (0.50)



Fig. 1 Cluster means for the three clustering variables.

Cluster comparison

After the three clusters were identified, we examined how cluster members scored on the study-related burnout scale and how they perceived peer and self-assessment. Fig. 2 illustrates the cluster means for the subscales of the study-related burnout and peer and self-assessment perception measures. From Fig. 2, it is evident that the “High organised and deep” group experienced substantially lower levels of study-related burnout on all subscales. Meanwhile, the “All high” and “High unreflective” groups are similar with respect to cynicism and inadequacy, but the “High unreflective” group reported less exhaustion.
Fig. 2 Cluster means for the study-related burnout and peer and self-assessment perception subscales. EXH = exhaustion, CYN = cynicism, INA = inadequacy, PSL = peer assessment as supporting learning-factor, PNE = negative experiences of peer assessment-factor, PSE = self-efficacy in peer assessment-factor, SSL = self-assessment as supporting learning-factor, SNE = negative experiences of self-assessment-factor, SSE = self-efficacy in self-assessment-factor.

We further investigated the trends in the study-related burnout means using one-way ANOVAs and Dunnett's T3 post hoc tests. Table 4 displays the mean values and standard deviations of the three study-related burnout facets for the different clusters. On the right, it also includes the effect sizes for the means that differed statistically significantly according to Dunnett's T3 post hoc test. The ANOVA was statistically significant for all three subscales of study-related burnout. From Table 4, we observe that the differences in exhaustion are statistically significant between the “All high” and the two other groups, but not between the “High organised and deep” and “High unreflective” groups. In contrast, for both cynicism and inadequacy the differences between the “High organised and deep” and the two other groups are statistically significant. With the exception of the medium-sized Cohen's d value of 0.49 between the “All high” and “High unreflective” clusters, all effect sizes in Table 4 are between large and very large according to the interpretation guidelines of Sawilowsky (2009).
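As a hedged illustration of this comparison procedure (not the study's actual computation: the scores below are synthetic draws generated from the exhaustion means and standard deviations reported in Table 4), the snippet runs a one-way ANOVA and computes a pooled-standard-deviation Cohen's d. Dunnett's T3 post hoc tests require more specialised tooling and are omitted here:

```python
import numpy as np
from scipy import stats

# Synthetic group scores drawn from the reported exhaustion means/sds.
rng = np.random.default_rng(2)
g1 = rng.normal(2.54, 1.01, 61)   # "High organised and deep"
g2 = rng.normal(3.34, 1.01, 69)   # "All high"
g3 = rng.normal(2.88, 0.88, 100)  # "High unreflective"

# One-way ANOVA across the three clusters.
F, p = stats.f_oneway(g1, g2, g3)
print(f"F(2, {len(g1) + len(g2) + len(g3) - 3}) = {F:.3f}, p = {p:.4f}")

def cohens_d(a, b):
    """Cohen's d using the pooled standard deviation of the two groups."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * np.var(a, ddof=1)
                  + (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2)
    return (np.mean(b) - np.mean(a)) / np.sqrt(pooled_var)

print(f"d (High organised & deep vs All high) = {cohens_d(g1, g2):.2f}")
```

With the reported group means roughly 0.8 pooled standard deviations apart, the ANOVA comes out significant and the effect size lands near the large range of Sawilowsky's guidelines, matching the pattern in Table 4.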

Table 4 Cluster comparison for the exhaustion, cynicism, and inadequacy subscales of study-related burnout
Mean (sd) Effect sizebc
All high High unreflective
a The ANOVA results for the exhaustion, cynicism and inadequacy subscales were statistically significant with F(2,227) = 11.579, p < 0.001, F(2,227) = 18.581, p < 0.001, and F(2,227) = 24.113, p < 0.001, respectively. b Effect sizes for the mean difference between the two clusters indicated on the left and above. Only statistically significant effect sizes are displayed. c Dunnett's T3 post hoc test significances: p < 0.001, ***; 0.001 < p < 0.01, **; and 0.01 < p < 0.05, *.
Exhaustiona
High organised & deep 2.54 (1.01) 0.79***
All high 3.34 (1.01) 0.49**
High unreflective 2.88 (0.88)
Cynicisma
High organised & deep 1.62 (0.80) 0.79*** 1.04***
All high 2.41 (1.16)
High unreflective 2.61 (1.04)
Inadequacya
High organised & deep 2.48 (1.10) 0.92*** 1.08***
All high 3.46 (1.02)
High unreflective 3.57 (0.93)


Differences between students’ peer and self-assessment perceptions are also evident in Fig. 2. Specifically, students in the “High organised and deep” cluster exhibit equally or more positive perceptions of learning and efficacy, along with fewer negative experiences of both peer and self-assessment, compared to the other clusters. Conversely, the “All high” and “High unreflective” groups demonstrate similar efficacy and negative perceptions of peer and self-assessment. However, when it comes to peer and self-assessment supporting learning, the “All high” students’ scores are closer to those of the “High organised and deep” group. This distinction, together with the differences observed in Fig. 1, underscores how the three-cluster solution adds depth to the interpretation of the data. As anticipated from the correlations in Table 1, students’ perceptions of peer assessment closely mirror those of self-assessment.

As before, the observed differences in cluster means from Fig. 2 were further studied using ANOVA and Dunnett's T3 post hoc tests. The results are summarised in Tables 5 and 6, which display the numerical ANOVA results, the effect sizes and the significances of the post hoc tests. For peer assessment, the ANOVA revealed main effects for the self-efficacy and negative experiences subscales, while the outcome for the supporting learning subscale was nonsignificant. A similar pattern was observed for self-assessment. In both peer and self-assessment the “High organised and deep” group scored significantly lower in negative experiences and higher in self-efficacy compared to the other groups. These differences corresponded mostly to medium or large effect sizes (Sawilowsky, 2009), with larger effect sizes observed for self-assessment.

Table 5 Cluster comparison for the subscales of students’ peer assessment perceptions
Mean (sd) Effect sizebc
All high High unreflective
a The ANOVA results were statistically significant for the self-efficacy and negative experiences subscales, with F(2,227) = 5.751, p = 0.004 and F(2,227) = 4.726, p = 0.010, respectively, and nonsignificant for the supporting learning subscale with F(2,227) = 2.325, p = 0.100. b Effect sizes for the mean difference between the two clusters indicated on the left and above. Only statistically significant effect sizes are displayed. c Dunnett's T3 post hoc test significances: p < 0.001, ***; 0.001 < p < 0.01, **; and 0.01 < p < 0.05, *.
Peer assessment as supporting learninga
High organised & deep 3.17 (0.92)
All high 3.20 (0.88)
High unreflective 2.95 (0.73)
Negative experiences of peer assessmenta
High organised & deep 2.11 (0.67) 0.46* 0.43*
All high 2.47 (0.85)
High unreflective 2.36 (0.54)
Self-efficacy in peer assessmenta
High organised & deep 4.10 (0.70) 0.50* 0.52**
All high 3.73 (0.77)
High unreflective 3.75 (0.68)


Table 6 Cluster comparison for the subscales of students’ self-assessment perceptions
Mean (sd) Effect sizebc
All high High unreflective
a The ANOVA results were statistically significant for the self-efficacy and negative experiences subscales, with F(2,227) = 8.142, p < 0.001 and F(2,227) = 14.039, p < 0.001, respectively, and nonsignificant for the supporting learning subscale with F(2,227) = 2.697, p = 0.070. b Effect sizes for the mean difference between the two clusters indicated on the left and above. Only statistically significant effect sizes are displayed. c Dunnett's T3 post hoc test significances: p < 0.001, ***; 0.001 < p < 0.01, **; and 0.01 < p < 0.05, *.
Self-assessment as supporting learninga
High organised & deep 3.38 (0.82)
All high 3.25 (0.87)
High unreflective 3.10 (0.62)
Negative experiences of self-assessmenta
High organised & deep 2.00 (0.62) 0.74*** 0.88***
All high 2.56 (0.85)
High unreflective 2.54 (0.61)
Self-efficacy in self-assessmenta
High organised & deep 4.14 (0.62) 0.58** 0.65***
All high 3.73 (0.77)
High unreflective 3.73 (0.63)


Finally, we also investigated the correlations between students’ perceptions of peer and self-assessment and study-related burnout within each cluster. For the “High organised and deep” cluster, a statistically significant correlation of 0.31 was found between the exhaustion and negative experiences of self-assessment variables. Meanwhile, for the “All high” cluster both exhaustion and cynicism were significantly correlated with negative experiences of peer assessment with correlation coefficients of 0.27 and 0.29, respectively. Lastly, both exhaustion and inadequacy were correlated with perceptions of peer assessment as supporting learning in the “High unreflective” cluster. These correlation coefficients were 0.22 and 0.27, respectively. All of these correlations were relatively modest.

Discussion

The first goal of our study was to examine the different learning profiles among students in a chemical engineering thermodynamics course. This course is generally considered challenging, making it a potentially important contributor to study-related burnout among aspiring chemical engineers. Consequently, we also explored how different learners experience burnout and how they perceive peer and self-assessment. This was done to elucidate the interaction between the teaching–learning environment, burnout, and students’ approaches to learning, with the hope of eventually using the first to influence the latter two.

We found three learning profiles in our data: (1) “High organised and deep”, (2) “All high” and (3) “High unreflective”. Of these, the “All high” group had the highest means for the unreflective approach to learning and organised studying, while the “High organised and deep” group had the highest mean for the deep approach. Profiles similar to our “High organised and deep” and “High unreflective” ones have been found in previous studies (Parpala et al., 2010), including with engineering students (Salmisto et al., 2016), but these studies did not include the “All high” profile. In contrast, a recent study by Parpala et al. (2021) did find an “All high” profile with elevated scores for all the approaches to learning.

Interestingly, the unreflective students formed the largest group in this study. This contradicts earlier general-level studies, where the “High unreflective” group is usually the smallest (Parpala et al., 2010; Asikainen and Katajavuori, 2022). As these studies measured learning approaches at a general level, the higher prevalence of the unreflective approach could simply signal the challenging nature and high workload of the thermodynamics course. Alternatively, there is some evidence that students’ discipline can also influence the adopted learning approach, with less deep and more unreflective approaches typically encountered in fields like science and economics (Baeten et al., 2010). Regardless, it appears that qualitatively different results can emerge at the course level compared to the general one. This is interesting, as studies exploring learning profiles at the course level are relatively scarce. These results also provide potential corroboration that the adoption of the deep and unreflective approaches depends on the course context and the learning environment (Postareff et al., 2018).

Our results showed that the “High organised and deep” group experienced almost every aspect of burnout, i.e., exhaustion, cynicism, and inadequacy, substantially less than students in the other profiles. That is to say, chemical engineering students who aim to understand and use meaningful learning strategies in physical chemistry and who manage their time effectively experience less exhaustion, cynicism, and inadequacy. These results line up with earlier studies showing that first-year students applying a deep approach experience fewer burnout symptoms than students utilising the unreflective approach (Asikainen et al., 2020). In addition, both our results and those of Parpala et al. (2021) indicate that the most exhaustion was suffered by the “All high” students. The high score in exhaustion set the “All high” group apart from the “High unreflective” group, justifying the use of the three-cluster solution over the two-cluster one.

Who are the “All high” students? Parpala et al. (2021) suggested that these students may have difficulties with their learning strategies while being aware of how they should develop their studying (Lindblom-Ylänne, 2003). Upon observing a similar group in their study of first-year students, Freyer and Vermunt (2018) further suggested that these are students who have not yet had sufficient time to develop a differentiated study strategy. Perhaps some of our students might also still be learning study skills, as most course participants were starting their second year. It should also be noted that the thermodynamics course differed in several regards from what the students had come to expect from their first year. For example, in addition to the mathematics-heavy approach of physical chemistry, the students had to operate in a student-centred learning environment. The course also included somewhat unfamiliar assessment practices in the form of peer and self-assessment. As undergraduates may be more used to traditional assessment (Alt and Raichel, 2020), the new learning environment may require students to modify their learning strategies, which can increase the workload (Baeten et al., 2010) and lead to exhaustion.

Our study also uncovered variation in how students with different learning profiles perceive peer and self-assessment. Students representing the “High organised and deep” profile scored higher in efficacy experiences and lower in negative experiences of both peer and self-assessment, with larger differences observed for self-assessment. This is in line with previous research linking both the deep approach to learning and organised studying with positive perceptions of assessment, and the unreflective approach with negative perceptions (Parpala et al., 2010; Herrmann et al., 2017). Because peer and self-assessment can feel difficult for students (Hanrahan and Isaacs 2001; Andrade and Du 2007; Van Hattum-Janssen and Lourenço 2008), these results also agree with the finding that students utilising the deep approach tend to prefer challenging courses (Halme et al., 2021). The fact that self-efficacy is also positively related to the deep approach and negatively to the unreflective approach to learning (Prat-Sala and Redford, 2010) further lends credence to our results. In short, students who employ deep learning strategies and manage their time carefully not only experience less study-related burnout but also feel more efficacious when participating in assessment and have fewer negative experiences.

In sum, our study associated multiple benefits with belonging to the “High organised and deep” learning approach group when studying physical chemistry, particularly in terms of study-related burnout. As the learning environment impacts students’ approaches to learning, it would be worthwhile for physical chemistry instructors to employ instructional practices that facilitate the adoption of the deep approach. Based on the review by Baeten et al. (2010), this could be achieved through a reasonable overall workload, student-centred teaching approaches, teacher involvement and orientation towards changing students’ conceptions, and assessment that encourages deep learning and resembles students’ future practice. For example, Process Oriented Guided Inquiry Learning has recently served as the basis for several physical chemistry laboratory modules (Hunnicutt et al., 2015; Phillips et al., 2019; Cole et al., 2020; Partanen, 2023), and there is some evidence indicating that it can facilitate the adoption of the deep approach while undermining the unreflective one (Joshi and Lau, 2023). Regardless, the deep approach is hard to induce (Baeten et al., 2013). For example, the “High unreflective” group emerged as the largest in the thermodynamics course, despite our reliance on student-centred teaching and on peer and self-assessment, which have been shown to help foster a deeper approach to learning (Lynch et al., 2012; Nieminen et al., 2021). In fact, our results are reminiscent of the findings of Gijbels et al. (2008), where students ended up adopting increasingly unreflective approaches to learning even though they experienced what appeared to be a student-centred and constructivist learning environment and perceived the assessment as demanding deeper levels of understanding.
Two explanations can be offered for this paradoxical outcome. First, some students might have had successful experiences with assessment focusing on lower-level cognitive skills in their prior education, which makes the confrontation with assessment demanding higher-order thinking skills stressful and arduous. Second, as students in the thermodynamics course had to perform several novel self-study and assessment activities, the total workload could feel high.

In our study, self-efficacy and a lack of negative experiences in assessment were connected with the profile high in the deep approach, while the profiles high in the unreflective approach tended to hold more negative perceptions of assessment. Given the result that self-assessment can induce the deep approach (Nieminen et al., 2021), these associations could mean that to bolster this effect, the teacher needs to facilitate assessment self-efficacy while minimising negative experiences. This underscores the need for clear assessment instructions, sufficient tools, and support, while also monitoring how students perceive the assessment tasks and their workload. Peer and self-assessment are associated with various benefits relevant to physical chemistry learning, including gains in critical thinking, metacognitive knowledge, and problem solving (Topping, 1998; Mok et al., 2006; Huisman et al., 2019; Hadzhikoleva et al., 2019), better learning outcomes (Partanen, 2018; Partanen, 2023), and improved attitudes (Partanen, 2020; Partanen, 2023). Consequently, their use in physical chemistry education is cautiously recommended. Further research is needed to explore the role that challenging courses play in university students’ wellbeing and the impact of peer and self-assessment in promoting wellbeing.

Limitations

This study has some limitations. First, we only analysed self-report data from a single measurement point with cluster analysis. Thus, no causal relationships can be derived between the studied variables. In addition, changes in the course between the study years could have impacted the results. In the clustering, we tried to mitigate this by separating our data into the 2019 test set and the primary data set from 2020 and 2021. We also compared the yearly means for the learning approach and study-related burnout variables, which were similar for both 2020 and 2021. A third limitation was that all data were collected from a single thermodynamics course at one Finnish university. Finally, we gathered the bulk of the data during the COVID-19 pandemic, when most teaching was conducted remotely, which might limit the generalisability of our findings. In future studies, data should be gathered from multiple physical chemistry courses at several universities to ensure the generalisability of the results, and multiple time points could be used to analyse causal relationships.

Conclusions and practical implications

We investigated different learning profiles and their connections to study-related burnout and students’ perceptions of peer and self-assessment in a challenging chemical engineering thermodynamics course. According to our findings, students who employ deep learning strategies and manage their time carefully experience less study-related burnout. They also feel more efficacious when participating in assessment and have fewer negative experiences of both peer and self-assessment. In light of these findings, physical chemistry educators might consider adopting practices that help foster a deeper approach to learning, like peer and self-assessment. Such practices give students an active role in their learning and let them build understanding themselves by relating ideas and constructing critical arguments. At the same time, teachers should pay careful attention to the course workload and be mindful not to adopt too many novel instructional approaches in a single course.

Conflicts of interest

The authors declare no conflicts of interest.

Appendices

Appendix 1: Sample problem questions and grading guidelines

Below we provide two sample problem subtasks (1.1 and 2.1) and their assessment rubrics. Both are parts of larger problem sets with general overarching themes.
Problem 1. Solar power is one of the most promising renewable energy sources to replace climate-damaging fossil fuels. However, the challenge with solar power is its intermittent availability. For this reason, its large-scale implementation requires reliable and clean energy storage methods. These include, for example, the electrochemical decomposition of water into hydrogen and oxygen with the help of a suitable catalyst in a reaction:
2H2O(l) → 2H2(g) + O2(g)

The generated hydrogen gas has a high energy density and its combustion forms only water.

(For more info, see https://science.sciencemag.org/content/355/6321/eaad4998).

Subtask 1.1 Calculate the heat exchanged in the reaction at 1 bar and 25 °C. Based on the reaction equation, what can you conclude about the work done in this reaction? Use your conclusion to compare the change in the internal energy to the change in enthalpy. (3 points).

Assessment rubric 1.1 (Start from zero points and add points if the following conditions are met:)

+1p. Enthalpy calculated correctly using enthalpies of formation and the result is within 5 kJ mol−1 of the model answer. (OR there must be some kind of justification why the result is minus two times the formation enthalpy of liquid water.)

+1p. The student has understood that work is being done in the process because the amount of gas increases. (The numerical calculation of the work is also sufficient as justification.)

+1p. The student has realized that the change in internal energy is smaller than the change in enthalpy, because work is negative. (A numerical calculation of these is also sufficient as a justification.)
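A minimal sketch of the model answer for subtask 1.1 follows. It assumes the tabulated standard enthalpy of formation of liquid water (−285.83 kJ mol−1), a textbook value not given in the problem statement itself:

```python
# Worked sketch of subtask 1.1, assuming the standard enthalpy of
# formation of H2O(l) is -285.83 kJ/mol (tabulated value, not supplied
# in the problem).
R = 8.314            # gas constant, J/(mol K)
T = 298.15           # 25 C in kelvin
dHf_H2O_l = -285.83  # kJ/mol

# 2 H2O(l) -> 2 H2(g) + O2(g): the products are elements in their
# reference states, so dH = -2 * dHf(H2O, l). The positive sign means
# heat is absorbed, as expected for decomposing water.
dH = -2 * dHf_H2O_l        # kJ per mole of reaction

# Three moles of gas are formed from a liquid, so the system does
# expansion work against the surroundings: w = -dn_gas * R * T.
dn_gas = 3
w = -dn_gas * R * T / 1000  # kJ (negative: work done by the system)

# At constant pressure q = dH, so dU = q + w = dH - dn_gas*R*T < dH.
dU = dH + w

print(f"dH = {dH:.1f} kJ, w = {w:.1f} kJ, dU = {dU:.1f} kJ")
```

The numbers reproduce the rubric's logic: the enthalpy change is +571.7 kJ, the work is a small negative correction of about −7.4 kJ, and the internal energy change (about 564.2 kJ) is therefore smaller than the enthalpy change.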

Problem 2. Your employer at the meteorological institute has asked you to find out how air affects the freezing point of water. For this, you have sealed two water samples in containers of the same volume. One container holds only water and the other a mixture of water and air at 1 atm. Air contains 80% nitrogen and 20% oxygen by volume. The cryoscopic constant of water is 1.86 K kg mol−1, and at 0 °C the Henry's law constant in water is 5.45 × 109 Pa for nitrogen and 2.55 × 109 Pa for oxygen.

Subtask 2.1 Qualitatively compare the freezing points of water in the containers. Carefully justify your answer using the chemical potential. Draw a picture!

Assessment rubric 2.1 (Start from zero points and add points if the following conditions are met:)

+1p. The student has explained (or shown through a picture) that impurities such as dissolved gases in the solution lead to a lowering of the freezing point.

+2p. The student has explained that the point where the chemical potentials of the solid and the liquid are equal (= the freezing point) moves to a lower temperature as the chemical potential of the solution decreases, which leads to a lower freezing point. (A picture like the one depicted in the model answer is sufficient, so long as the student has also described what is in the picture as part of their solution.) If the explanation is incomplete but the answer somehow connects the lowering of the freezing point to the lowering of the chemical potential of the solution relative to the pure solvent, you can give +1p.
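Although subtask 2.1 is qualitative, the constants supplied in Problem 2 allow a rough estimate of how large the effect is. The sketch below is not part of the course materials; it assumes the molar mass of water (18.015 g mol−1, not stated in the problem) and the ideal-dilute relations x = p/K for Henry's law and b ≈ x/M for converting mole fraction to molality:

```python
# Rough estimate of the freezing-point depression caused by dissolved
# air, using the constants from Problem 2. Assumes M(H2O) = 18.015 g/mol
# and ideal-dilute behaviour throughout.
p_total = 101325.0            # 1 atm in Pa
K_N2, K_O2 = 5.45e9, 2.55e9   # Henry's law constants at 0 C, Pa
Kf = 1.86                     # cryoscopic constant of water, K kg/mol
M_water = 18.015e-3           # molar mass of water, kg/mol

# Henry's law: mole fraction of each dissolved gas is x = p_gas / K.
x_N2 = 0.80 * p_total / K_N2
x_O2 = 0.20 * p_total / K_O2

# In a dilute solution the solute molality is approximately x / M_water.
b = (x_N2 + x_O2) / M_water   # mol of dissolved gas per kg of water

# Cryoscopic relation: dTf = Kf * b.
dTf = Kf * b

print(f"molality = {b:.2e} mol/kg, dTf = {dTf:.2e} K")
```

The depression comes out on the order of a few millikelvin, consistent with the qualitative answer that the air-containing sample freezes at a (very slightly) lower temperature because the dissolved gases lower the chemical potential of the liquid.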

Appendix 2: Questionnaire items used in the study

All measures utilised a Likert scale ranging from 1 (completely disagree) to 5 (completely agree). The end-of-course surveys also contained components that were not relevant to this study.

Approaches to learning (Parpala and Lindblom-Ylänne 2012).

The students were instructed to think about their teaching and learning in the Thermodynamics course.

1. Deep approach

1.1 Ideas and perspectives I came across during my studies have made me contemplate them afterwards.

1.2 I have carefully looked for evidence to reach my own conclusions about what I am studying in the course.

1.3 During my studies in this course I have tried to relate new material to my previous knowledge.

1.4 I have tried to relate what I learned in this course to what I have learned in other courses.

2. Unreflective approach

2.1 The things I need to learn have seemed so complicated that I have had difficulties in understanding them.

2.2 I have often had to repeat things in order to learn them.

2.3 I have often had trouble in making sense of the things I have to learn.

2.4 Much of what I have learned seems nothing more than unrelated bits and pieces.

3. Organised approach

3.1 Overall, I have been systematic and organised in my studying.

3.2 I have organised my study time carefully to make the best use of it.

3.3 I have planned my studies in this course so that I can fit everything in.

3.4 I have put a lot of effort into my studies in this course.

Study-related burnout (Salmela-Aro et al., 2009).

The students were instructed to pick the Likert option that best described the current situation in their studies.

4. Exhaustion

4.1 I feel overwhelmed by the work related to my studies.

4.2 During my free time I worry over matters related to my studies.

4.3 The pressure of my studies causes problems in my close relationships with others.

4.4 I often sleep badly because of matters related to my studies.

5. Cynicism

5.1 I feel a lack of study motivation and often think of giving up.

5.2 I feel that I am losing interest in my studies.

5.3 I am continually wondering whether my studies have any meaning.

6. Inadequacy

6.1 I often have feelings of inadequacy in my studies.

6.2 I used to expect I would achieve much more in my studies than I expect now.

Perceptions of peer assessment (Partanen et al., 2023).

7. Peer assessment as supporting learning-factor

7.1 Peer assessment supported my learning in the course.

7.2 Peer assessment motivated me to engage more deeply with course tasks.

7.3 Peer assessment made me participate more in course activities.

7.4 Peer assessment helped me recognise my errors.

7.5 Peer assessment increased my motivation to study in the course.

7.6 Peer assessment made me review the course contents more.

8. Negative experiences of peer assessment-factor

8.1 Peer assessment is unfair because the effort students put into it varies a lot.

8.2 The criteria for peer assessment felt unjust when compared to the tasks.

8.3 I felt that I could not assess others’ work reliably.

8.4 I feel that many students were not able to assess their peers’ work.

8.5 Peer assessment increased the workload too much.

8.6 In peer assessment, the students are forced to do the teacher's work as well.

9. Self-efficacy in peer assessment-factor

9.1 I was able to assess the performance of my peers with the provided assessment rubrics and model solutions.

9.2 I was able to apply the assessment rubric to mark my peers’ solutions.

9.3 I understood how the tasks were supposed to be solved when I compared them to the assessment rubric and model solutions.

Perceptions of self-assessment (Partanen et al., 2023)

10. Self-assessment as supporting learning-factor

10.1 Self-assessment supported my learning in the course.

10.2 Self-assessment made me review the course contents more.

10.3 Self-assessment made me participate more in course activities.

10.4 Self-assessment helped me to recognise my errors.

10.5 Self-assessment motivated me to engage more deeply with course tasks.

10.6 Self-assessment increased my motivation to study in the course.

11. Negative experiences of self-assessment-factor

11.1 Self-assessment increased the workload too much.

11.2 Self-assessment was unpleasant and stressful.

11.3 Performing self-assessment felt frustrating.

11.4 It was difficult to assess my performance if I had not mastered the tasks myself.

11.5 I felt that I could not assess my work reliably.

11.6 There is little benefit to self-assessment compared to the workload.

12. Self-efficacy in self-assessment-factor

12.1 I was able to apply the assessment rubric to mark my solutions.

12.2 I was able to assess my performance with the provided assessment rubrics and model solutions.

12.3 I understood how the tasks were supposed to be solved when I compared them to the assessment rubric and model solutions.

References

  1. Alt D. and Raichel N., (2020), Higher education students’ perceptions of and attitudes towards peer assessment in multicultural classrooms, Asia-Pac. Educ. Res., 29, 567–580 DOI:10.1007/s40299-020-00507-z.
  2. Anderson L. W. and Krathwohl D. R., (2001), A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives, New York: Longman.
  3. Andersson M. and Weurlander M., (2019), Peer review of laboratory reports for engineering students, Eur. J. Eng. Educ., 44(3), 417–428 DOI:10.1080/03043797.2018.1538322.
  4. Andrade H. L., (2019), A critical review of research on student self-assessment, Front. Educ., 4, 1–13 DOI:10.3389/feduc.2019.00087.
  5. Andrade H. and Du Y., (2007), Student responses to criteria-referenced self-assessment, Assess. Eval. High. Educ., 32(2), 159–181 DOI:10.1080/02602930600801928.
  6. Asikainen H. and Gijbels D., (2017), Do students develop towards more deep approaches to learning during studies? A systematic review on the development of students' deep and surface approaches to learning in higher education, Educ. Psychol. Rev., 29, 205–234 DOI:10.1007/s10648-017-9406-6.
  7. Asikainen H. and Katajavuori N., (2022), First-year experience in the COVID-19 situation and the association between students’ approaches to learning, study-related burnout and experiences of online studying, Soc. Sci., 11(9), 390 DOI:10.3390/socsci11090390.
  8. Asikainen H., Parpala A., Virtanen V. and Lindblom-Ylänne S., (2013), The relationship between student learning process, study success and the nature of assessment: a qualitative study, Stud. Educ. Eval., 39(4), 211–217 DOI:10.1016/j.stueduc.2013.10.008.
  9. Asikainen H., Virtanen V., Postareff L. and Heino P., (2014), The validity and students' experiences of peer assessment in a large introductory class of gene technology, Stud. Educ. Eval., 43, 197–205 DOI:10.1016/j.stueduc.2014.07.002.
  10. Asikainen H., Salmela-Aro K., Parpala A. and Katajavuori N., (2020), Learning profiles and their relation to study-related burnout and academic achievement among university students, Learn. Individ. Differ., 78, 101781 DOI:10.1016/j.lindif.2019.101781.
  11. Asikainen H., Nieminen J. H., Häsä J. and Katajavuori N., (2022), University students’ interest and burnout profiles and their relation to approaches to learning and achievement, Learn. Individ. Differ., 93, 102105 DOI:10.1016/j.lindif.2021.102105.
  12. Baeten M., Kyndt E., Struyven K. and Dochy F., (2010), Using student-centred learning environments to stimulate deep approaches to learning: Factors encouraging or discouraging their effectiveness, Educ. Res. Rev., 5(3), 243–260 DOI:10.1016/j.edurev.2010.06.001.
  13. Baeten M., Struyven K. and Dochy F., (2013), Student-centred teaching methods: can they optimise students’ approaches to learning in professional higher education? Stud. Educ. Eval., 39(1), 14–22 DOI:10.1016/j.stueduc.2012.11.001.
  14. Bain K. and Towns M. H., (2016), A review of research on the teaching and learning of chemical kinetics, Chem. Educ. Res. Pract., 17, 246–262 DOI:10.1039/C5RP00176E.
  15. Bain K., Moon A., Mack M. R. and Towns M. H., (2014), A review of research on the teaching and learning of thermodynamics at the university level, Chem. Educ. Res. Pract., 15, 320–335 DOI:10.1039/C4RP00011K.
  16. Basso A., (2020), Results of a peer review activity in an organic chemistry laboratory course for undergraduates, J. Chem. Educ., 97(11), 4073–4077 DOI:10.1021/acs.jchemed.0c00373.
  17. Becker N. and Towns M., (2012), Students’ understanding of mathematical expressions in physical chemistry contexts: an analysis using Sherin's symbolic forms, Chem. Educ. Res. Pract., 13, 209–220 DOI:10.1039/C2RP00003B.
  18. Berry D. E. and Fawkes K. L., (2010), Constructing the components of a lab report using peer review, J. Chem. Educ., 87(1), 57–61 DOI:10.1021/ed8000107.
  19. Boud D. and Falchikov N., (1989), Quantitative studies of student self-assessment in higher education: a critical analysis of findings, High. Educ., 18(5), 529–549 DOI:10.1007/bf00138746.
  20. Boud D. J. and Holmes W. H., (1981), Self and peer marking in an undergraduate engineering course, IEEE Trans. Educ., 24(4), 267–274 DOI:10.1109/TE.1981.4321508.
  21. Bourke R., (2014), Self-assessment in professional programmes within tertiary institutions, Teach. High. Educ., 19(8), 908–918 DOI:10.1080/13562517.2014.934353.
  22. Carvalho A., (2013), Students' perceptions of fairness in peer assessment: evidence from a problem-based learning course, Teach. High. Educ., 18(5), 491–505 DOI:10.1080/13562517.2012.753051.
  23. Casselman B. L. and Atwood C. H., (2017), Improving general chemistry course performance through online homework-based metacognitive training, J. Chem. Educ., 94(12), 1811–1821 DOI:10.1021/acs.jchemed.7b00298.
  24. Charrad M., Ghazzali N., Boiteau V. and Niknafs A., (2014), NbClust: an R Package for determining the relevant number of clusters in a data set, J. Stat. Softw., 61(6), 1–36 DOI:10.18637/jss.v061.i06.
  25. Cole R. S., Muniz M., Harvey E., Sweeney R. and Hunnicutt S., (2020), How should apples be prepared for a fruit salad? A guided inquiry physical chemistry experiment, J. Chem. Educ., 97(12), 4475–4481 DOI:10.1021/acs.jchemed.0c00517.
  26. Derrick M. E. and Derrick F. W., (2002), Predictors of success in physical chemistry, J. Chem. Educ., 79(8), 1013–1016 DOI:10.1021/ed079p1013.
  27. Drysdale M. T. B., McBeath M. L. and Callaghan S. A., (2022), The feasibility and impact of online peer support on the well-being of higher education students, J. Ment. Health Train. Educ. Pract., 17(3), 206–217 DOI:10.1108/JMHTEP-02-2021-0012.
  28. Entwistle N. and McCune V., (2004), The conceptual bases of study strategy inventories, Educ. Psychol. Rev., 16, 325–345 DOI:10.1007/s10648-004-0003-0.
  29. Entwistle N. and Ramsden P., (1983), Understanding student learning, New York: Croom Helm.
  30. Fryer L. K. and Vermunt J. D., (2018), Regulating approaches to learning: Testing learning strategy convergences across a year at university, Br. J. Educ. Psychol., 88(1), 21–41 DOI:10.1111/bjep.12169.
  31. Gholami H., (2016), Self-assessment and learner autonomy, Theory Pract. Lang. Stud., 6(1), 46–51 DOI:10.17507/tpls.0601.06.
  32. Gijbels D., Segers M. and Struyf E., (2008), Constructivist learning environments and the (im)possibility to change students’ perceptions of assessment demands and approaches to learning, Instr. Sci., 36, 431–443 DOI:10.1007/s11251-008-9064-7.
  33. Gomez Jimenez S. G., Lizardo Perez A. C., Pulido Tellez A. and Rodriguez Bastarmerito R., (2021), Factors that provoke burnout of chemical engineering students in Universidad Juárez Autónoma de Tabasco, ICERI2021 Proc., 6896–6905 DOI:10.21125/iceri.2021.1560.
  34. Gulikers J. T. M., Bastiaens T. J., Kirschner P. A. and Kester L., (2006), Relations between student perceptions of assessment authenticity, study approaches and learning outcomes, Stud. Educ. Eval., 32(4), 381–400 DOI:10.1016/j.stueduc.2006.10.003.
  35. Haarala-Muhonen A., Ruohoniemi M., Parpala A., Komulainen E. and Lindblom-Ylänne S., (2017), How do the different study profiles of first-year students predict their study success, study progress and the completion of degrees?, High. Educ., 74, 949–962 DOI:10.1007/s10734-016-0087-8.
  36. Hadfield L. C. and Wieman C. E., (2010), Student interpretations of equations related to the first law of thermodynamics, J. Chem. Educ., 87(7), 750–755 DOI:10.1021/ed1001625.
  37. Hadzhikoleva S., Hadzhikolev E. and Kasakliev N., (2019), Using peer assessment to enhance higher order thinking skills, TEM J., 8(1), 242–247 DOI:10.18421/TEM81-34.
  38. Hailikari T., Tuononen T. and Parpala A., (2018), Students' experiences of the factors affecting their study progress: differences in study profiles, J. Furth. High. Educ., 42(1), 1–12 DOI:10.1080/0309877X.2016.1188898.
  39. Halme M., Myyry L. and Pirttilä-Backman A.-M., (2021), Business and social science students’ course preferences and learning approaches, Front. Educ., 6, 529197 DOI:10.3389/feduc.2021.529197.
  40. Hanrahan S. J. and Isaacs G., (2001), Assessing self- and peer-assessment: the students' views, High. Educ. Res. Dev., 20(1), 53–70 DOI:10.1080/07294360123776.
  41. Häsä J., Nieminen J. H., Rämö J., Pesonen H. and Pauna M., (2018), Determining your own grade: student perceptions on self-assessment in a large mathematics course, Proc. INDRUM, 366–367.
  42. Heikkilä A., Lonka K., Nieminen J. and Niemivirta M., (2012), Relations between teacher students' approaches to learning, cognitive and attributional strategies, well-being, and study success, High. Educ., 64, 455–471 DOI:10.1007/s10734-012-9504-9.
  43. Herrmann K. J., Bager-Elsborg A. and Parpala A., (2017), Measuring perceptions of the learning environment and approaches to learning: validation of the learn questionnaire, Scand. J. Educ. Res., 61(5), 526–539 DOI:10.1080/00313831.2016.1172497.
  44. Huisman B., Saab N., van den Broek P. and van Driel J., (2019), The impact of formative peer feedback on higher education students’ academic writing: a meta-analysis, Assess. Eval. High. Educ., 44(6), 863–880 DOI:10.1080/02602938.2018.1545896.
  45. Hunnicutt S. S., Grushow A. and Whitnell R., (2015), Guided-inquiry experiments for physical chemistry: the POGIL-PCL Model, J. Chem. Educ., 92(2), 262–268 DOI:10.1021/ed5003916.
  46. Hyytinen H., Tuononen T., Nevgi A. and Toom A., (2022), The first-year students' motives for attending university studies and study-related burnout in relation to academic achievement, Learn. Individ. Differ., 97, 102165 DOI:10.1016/j.lindif.2022.102165.
  47. John N. M., Page O., Martin S. C. and Whittaker P., (2018), Impact of peer support on student mental wellbeing: a systematic review, MedEdPublish, 7(170), 1–13 DOI:10.15694/mep.2018.0000170.1.
  48. Joshi N. and Lau S.-K., (2023), Effects of process-oriented guided inquiry learning on approaches to learning, long-term performance, and online learning outcomes, Interact. Learn. Environ., 31(5), 3112–3127 DOI:10.1080/10494820.2021.1919718.
  49. Jumari N. F., Yusof K. M. and Phang F. A., (2017), How do first-year Malaysian chemical engineering students approach learning? Chem. Eng. Trans., 56, 1009–1014 DOI:10.3303/CET1756169.
  50. Kaufman J. H. and Schunn C. D., (2011), Students’ perceptions about peer assessment for writing: their origin and impact on revision work, Instr. Sci., 39, 387–406 DOI:10.1007/s11251-010-9133-6.
  51. Kautz C. H., Heron P. R. L., Loverude M. E. and McDermott L. C., (2005), Student understanding of the ideal gas law, part I: a macroscopic perspective, Am. J. Phys., 73, 1055–1063 DOI:10.1119/1.2049286.
  52. Kyndt E., Dochy F., Struyven K. and Cascallar E., (2011), The direct and indirect effect of motivation for learning on students' approaches to learning through the perceptions of workload and task complexity, High. Educ. Res. Dev., 30(2), 135–150 DOI:10.1080/07294360.2010.501329.
  53. Lastusaari M., Laakkonen E. and Murtonen M., (2016), ChemApproach: validation of a questionnaire to assess the learning approaches of chemistry students, Chem. Educ. Res. Pract., 17, 723–730 DOI:10.1039/C5RP00216H.
  54. Lastusaari M., Laakkonen E. and Murtonen M., (2019), Persistence in studies in relation to learning approaches and first-year grades: a study of university chemistry students in Finland, Chem. Educ. Res. Pract., 20, 452–467 DOI:10.1039/C8RP00244D.
  55. Lindblom-Ylänne S., (2003), Broadening an understanding of the phenomenon of dissonance, Stud. High. Educ., 28(1), 63–77 DOI:10.1080/03075070309306.
  56. Lindblom-Ylänne S., Parpala A. and Postareff L., (2019), What constitutes the surface approach to learning in the light of new empirical evidence? Stud. High. Educ., 44(12), 2183–2195 DOI:10.1080/03075079.2018.1482267.
  57. Lonka K., Olkinuora E. and Mäkinen J., (2004), Aspects and prospects of measuring studying and learning in higher education, Educ. Psychol. Rev., 16, 301–323 DOI:10.1007/s10648-004-0002-1.
  58. Lopez R. and Kossack S., (2007), Effects of recurring use of self-assessment in university courses, Int. J. Learn., 14(4), 203–216 DOI:10.18848/1447-9494/CGP/v14i04/45277.
  59. Loverude M. E., Kautz C. H. and Heron P. R. L., (2002), Student understanding of the first law of thermodynamics: relating work to the adiabatic compression of an ideal gas, Am. J. Phys., 70(2), 137–148 DOI:10.1119/1.1417532.
  60. Lynch R., McNamara P. M. and Seery N., (2012), Promoting deep learning in a teacher education programme through self- and peer-assessment and feedback, Eur. J. Teach. Educ., 35(2), 179–197 DOI:10.1080/02619768.2011.643396.
  61. Madigan D. J. and Curran T., (2021), Does burnout affect academic achievement? A meta-analysis of over 100 000 students, Educ. Psychol. Rev., 33, 387–405 DOI:10.1007/s10648-020-09533-1.
  62. Mammino L., (2009), Teaching physical chemistry in disadvantaged contexts: challenges, strategies and responses, in Gupta-Bhowon, M., Jhaumeer-Laulloo, S., Li Kam Wah, H. and Ramasami, P. (ed.) Chemistry education in the ICT age, Dordrecht: Springer.
  63. Margerum L. D., Gulsrud M., Manlapez R., Rebong R. and Love A., (2007), Application of calibrated peer review (CPR) writing assignments to enhance experiments with an environmental chemistry focus, J. Chem. Educ., 84(2), 292–295 DOI:10.1021/ed084p292.
  64. Marton F. and Säljö R., (1984), Approaches to learning, in Marton F., Hounsell D. and Entwistle N. (ed.), The experience of learning, Edinburgh: Scottish Academic Press.
  65. Maslach C., Schaufeli W. B. and Leiter M. P., (2001), Job burnout, Annu. Rev. Psychol., 52(1), 397–422 DOI:10.1146/annurev.psych.52.1.397.
  66. Meriläinen M., (2014), Factors affecting study-related burnout among Finnish university students: teaching-learning environment, achievement motivation and the meaning of life, Qual. High. Educ., 20(3), 309–329 DOI:10.1080/13538322.2014.978136.
  67. Meriläinen M. and Kuittinen M., (2014), The relation between Finnish university students’ perceived level of study-related burnout, perceptions of the teaching–learning environment and perceived achievement motivation, Pastor. Care Educ., 32(3), 186–196 DOI:10.1080/02643944.2014.893009.
  68. Micán A. D. and Medina L. C., (2017), Boosting vocabulary learning through self-assessment in an English language teaching context, Assess. Eval. High. Educ., 42(3), 398–414 DOI:10.1080/02602938.2015.1118433.
  69. Mok M. M. C., Lung C. L., Cheng D. P. W., Cheung R. H. P. and Ng M. L., (2006), Self-assessment in higher education: experience in using a metacognitive approach in five case studies, Assess. Eval. High. Educ., 31(4), 415–433 DOI:10.1080/02602930600679100.
  70. Moneypenny D. B., Evans M. and Kraha A., (2018), Student perceptions of and attitudes towards peer review, Am. J. Distance Educ., 32(4), 236–247 DOI:10.1080/08923647.2018.1509425.
  71. Mulder R., Baik C., Naylor R. and Pearce J., (2014a), How does student peer review influence perceptions, engagement and academic outcomes? A case study, Assess. Eval. High. Educ., 39(6), 657–677 DOI:10.1080/02602938.2013.860421.
  72. Mulder R. A., Pearce J. M. and Baik C., (2014b), Peer review in higher education: student perceptions before and after participation, Act. Learn. High. Educ., 15(2), 157–171 DOI:10.1177/14697874145273.
  73. Ndoye A., (2017), Peer/self assessment and student learning, Int. J. Teach. Learn. High. Educ., 29(2), 255–269.
  74. Nicoll G. and Francisco J. S., (2001), An investigation of the factors influencing student performance in physical chemistry, J. Chem. Educ., 78(1), 99–102 DOI:10.1021/ed078p99.
  75. Nieminen J. H., Asikainen H. and Rämö J., (2021), Promoting deep approach to learning and self-efficacy by changing the purpose of self-assessment: a comparison of summative and formative models, Stud. High. Educ., 46(7), 1296–1311 DOI:10.1080/03075079.2019.1688282.
  76. O'Moore L. M. and Baldock T. E., (2007), Peer assessment learning sessions (PALS): an innovative feedback technique for large engineering classes, Eur. J. Eng. Educ., 32(1), 43–55 DOI:10.1080/03043790601055576.
  77. Panadero E., Jonsson A. and Botella J., (2017), Effects of self-assessment on self-regulated learning and self-efficacy: four meta-analyses, Educ. Res. Rev., 22, 74–98 DOI:10.1016/j.edurev.2017.08.004.
  78. Partanen L., (2016), Student oriented approaches in the teaching of thermodynamics at universities – developing an effective course structure, Chem. Educ. Res. Pract., 17, 766–787 DOI:10.1039/C6RP00049E.
  79. Partanen L., (2018), Student-centred active learning approaches to teaching quantum chemistry and spectroscopy: quantitative results from a two-year action research study, Chem. Educ. Res. Pract., 19, 885–904 DOI:10.1039/C8RP00074C.
  80. Partanen L., (2020), How student-centred teaching in quantum chemistry affects students’ experiences of learning and motivation – a self-determination theory perspective, Chem. Educ. Res. Pract., 21, 79–94 DOI:10.1039/C9RP00036D.
  81. Partanen L., (2023), A guided inquiry learning design for a large-scale chemical thermodynamics laboratory module, J. Chem. Educ., 100(1), 118–124 DOI:10.1021/acs.jchemed.2c00387.
  82. Parpala A. and Lindblom-Ylänne S., (2012), Using a research instrument for developing quality at the university, Qual. High. Educ., 18(3), 313–328 DOI:10.1080/13538322.2012.733493.
  83. Parpala A., Lindblom-Ylänne S., Komulainen E., Litmanen T. and Hirsto L., (2010), Students' approaches to learning and their experiences of the teaching-learning environment in different disciplines, Br. J. Educ. Psychol., 80(2), 269–282 DOI:10.1348/000709909X476946.
  84. Parpala A., Katajavuori N., Haarala-Muhonen A. and Asikainen H., (2021), How did students with different learning profiles experience ‘normal’ and online teaching situation during COVID-19 spring? Soc. Sci., 10(9), 337 DOI:10.3390/socsci10090337.
  85. Partanen L. J., Virtanen V., Asikainen H. and Myyry L., (2023), Chemical engineering students’ perceptions of peer and self-assessment, Eur. J. Eng. Educ. DOI:10.1080/03043797.2023.2250753.
  86. Patton C., (2012), ‘Some kind of weird, evil experiment’: student perceptions of peer assessment, Assess. Eval. High. Educ., 37(6), 719–731 DOI:10.1080/02602938.2011.563281.
  87. Phillips J. A., Jones G. H. and Iski E. V., (2019), Using a guided-inquiry approach to teach Michaelis – Menten kinetics, J. Chem. Educ., 96(9), 1948–1954 DOI:10.1021/acs.jchemed.9b00031.
  88. Piccinno T. F., Basso A. and Bracco F., (2023), Results of a peer review activity carried out alternatively on a compulsory or voluntary basis, J. Chem. Educ., 100(2), 489–495 DOI:10.1021/acs.jchemed.2c00229.
  89. Planas Lladó A., Feliu Soley L., Fraguell Sansbelló R. M., Arbat Pujolras G., Pujol Planella J., Roura-Pascual N., Suñol Martínez J. J. and Montoro Moreno L., (2014), Student perceptions of peer assessment: an interdisciplinary study, Assess. Eval. High. Educ., 39(5), 592–610 DOI:10.1080/02602938.2013.860077.
  90. Postareff L., Mattsson M. and Parpala A., (2018), The effect of perceptions of the teaching-learning environment on the variation in approaches to learning – Between-student differences and within-student variation, Learn. Individ. Differ., 68, 96–107 DOI:10.1016/j.lindif.2018.10.006.
  91. Prat-Sala M. and Redford P., (2010), The interplay between motivation, self-efficacy, and approaches to studying, Br. J. Educ. Psychol., 80(2), 283–305 DOI:10.1348/000709909X480563.
  92. Richardson J. T. E., (2005), Students’ perceptions of academic quality and approaches to studying in distance education, Br. Educ. Res. J., 31(1), 7–27 DOI:10.1080/0141192052000310001.
  93. Richardson J. T. E., (2011), Approaches to studying, conceptions of learning and learning styles in higher education, Learn. Individ. Differ., 21(3), 288–293 DOI:10.1016/j.lindif.2010.11.015.
  94. Rust C., O’Donovan B. and Price M., (2005), A social constructivist assessment process model: how the research literature shows us this could be best practice, Assess. Eval. High. Educ., 30(3), 231–240 DOI:10.1080/02602930500063819.
  95. Salmela-Aro K. and Kunttu K., (2010), Study burnout and engagement in higher education, Unterrichtswissenschaft, 38, 318–333.
  96. Salmela-Aro K. and Read S., (2017), Study engagement and burnout profiles among Finnish higher education students, Burn. Res., 7, 21–28 DOI:10.1016/j.burn.2017.11.001.
  97. Salmela-Aro K., Kiuru N., Leskinen E. and Nurmi J.-E., (2009), School burnout inventory (SBI): reliability and validity, Eur. J. Psychol. Assess., 25(1), 48–57 DOI:10.1027/1015-5759.25.1.48.
  98. Salmisto A., Postareff L. and Nokelainen P., (2016), Relations between civil engineering students’ approaches to learning, perceptions of the development of professional skills and perceived workload, Ann. Conf. Eur. Soc. Eng. Educ., 2016.
  99. Sawilowsky S., (2009), New effect size rules of thumb, J. Mod. Appl. Stat. Methods, 8(2), 597–599 DOI:10.22237/jmasm/1257035100.
  100. Schaufeli W. B., Bakker A. B., Hoogduin K., Schaap C. and Kladler A., (2001), On the clinical validity of the Maslach burnout inventory and the Burnout measure, Psychol. Health, 16(5), 565–582 DOI:10.1080/08870440108405527.
  101. Siow L.-F., (2015), Students' perceptions on self- and peer-assessment in enhancing learning experience, Malays. Online J. Educ. Sci., 3(2), 21–35.
  102. Sluijsmans D. M. A., Moerkerke G., Van Merriënboer J. J. G. and Dochy F. J. R., (2001), Peer assessment in problem based learning, Stud. Educ. Eval., 27(2), 153–173 DOI:10.1016/S0191-491X(01)00019-0.
  103. Sözbilir M., (2004), What makes physical chemistry difficult? Perceptions of Turkish chemistry undergraduates and lecturers, J. Chem. Educ., 81(4), 573–578 DOI:10.1021/ed081p573.
  104. Struyven K., Dochy F. and Janssens S., (2005), Students’ perceptions about evaluation and assessment in higher education: a review, Assess. Eval. High. Educ., 30(4), 325–341 DOI:10.1080/02602930500099102.
  105. Tait H. and Entwistle N., (1996), Identifying students at risk through ineffective study strategies, High. Educ., 31, 97–116 DOI:10.1007/BF00129109.
  106. Tashiro J., Parga D., Pollard J. and Talanquer V., (2021), Characterizing change in students' self-assessments of understanding when engaged in instructional activities, Chem. Educ. Res. Pract., 22, 662–682 DOI:10.1039/D0RP00255K.
  107. Thomas P. L. and Schwenz R. W., (1998), College physical chemistry students' conceptions of equilibrium and fundamental thermodynamics, J. Res. Sci. Teach., 35(10), 1151–1160 DOI:10.1002/(SICI)1098-2736(199812)35:10<1151::AID-TEA6>3.0.CO;2-K.
  108. Topping K., (1998), Peer assessment between students in colleges and universities, Rev. Educ. Res., 68(3), 249–276 DOI:10.3102/003465430680032.
  109. Tsaparlis G., (2001), Towards a meaningful introduction to the Schrödinger equation through historical and heuristic approaches, Chem. Educ. Res. Pract., 2, 203–213 DOI:10.1039/B1RP90023D.
  110. Tsaparlis G., (2007), Teaching and learning physical chemistry: a review of educational research, in Ellison M. D. and Schoolcraft T. A. (ed.) Advances in teaching physical chemistry, ACS Symposium Series, vol. 973, American Chemical Society, pp. 75–112 DOI:10.1021/bk-2008-0973.ch007.
  111. Tsaparlis G. and Papaphotis G., (2009), High-school students’ conceptual difficulties and attempts at conceptual change: the case of basic quantum chemical concepts, Int. J. Sci. Educ., 31(7), 895–930 DOI:10.1080/09500690801891908.
  112. Tsaparlis G., Zoller U., Fastow M. and Lubezky A., (1999), Students’ self-assessment in chemistry examinations requiring higher- and lower-order cognitive skills, J. Chem. Educ., 76(1), 112–113 DOI:10.1021/ed076p112.
  113. Van Hattum-Janssen N. and Lourenço J. M., (2008), Peer and self-assessment for first-year students as a tool to improve learning, J. Prof. Iss. Eng. Educ. Pract., 134(4), 346–352 DOI:10.1061/(ASCE)1052-3928(2008)134:4(346).
  114. van Helvoort A. A. J., (2012), How adult students in information studies use a scoring rubric for the development of their information literacy skills, J. Acad. Librariansh., 38(3), 165–171 DOI:10.1016/j.acalib.2012.03.016.
  115. van Roon P. H., van Sprang H. F. and Verdonk A. H., (1994), Work and ‘heat’: on a road towards thermodynamics, Int. J. Sci. Educ., 16(2), 131–144 DOI:10.1080/0950069940160203.
  116. Van Zundert M., Sluijsmans D. and van Merriënboer J., (2010), Effective peer assessment processes: research findings and future directions, Learn. Instr., 20(4), 270–279 DOI:10.1016/j.learninstruc.2009.08.004.
  117. Vickerman P., (2009), Student perspectives on formative peer assessment: an attempt to deepen learning? Assess. Eval. High. Educ., 34(2), 221–230 DOI:10.1080/02602930801955986.
  118. Wanner T. and Palmer E., (2018), Formative self-and peer assessment for improved student learning: the crucial factors of design, teacher participation and feedback, Assess. Eval. High. Educ., 43(7), 1032–1047 DOI:10.1080/02602938.2018.1427698.
  119. Wen M. L. and Tsai C.-C., (2006), University students' perceptions of and attitudes toward (online) peer assessment, High. Educ., 51, 27–44 DOI:10.1007/s10734-004-6375-8.
  120. Wenzel T. J., (2007), Evaluation tools to guide students' peer-assessment and self-assessment in group activities for the lab and classroom, J. Chem. Educ., 84(1), 182–186 DOI:10.1021/ed084p182.
  121. Willey K. and Gardner A., (2010), Investigating the capacity of self and peer assessment activities to engage students and promote learning, Eur. J. Eng. Educ., 35(4), 429–443 DOI:10.1080/03043797.2010.490577.
  122. Zwicky D. A. and Hands M. D., (2016), The effect of peer review on information literacy outcomes in a chemical literature course, J. Chem. Educ., 93(3), 477–481 DOI:10.1021/acs.jchemed.5b00413.

This journal is © The Royal Society of Chemistry 2024