‘What does the term Critical Thinking mean to you?’ A qualitative analysis of chemistry undergraduate, teaching staff and employers' views of critical thinking

S. M. Danczak *, C. D. Thompson and T. L. Overton
School of Chemistry, Monash University, Victoria 3800, Australia. E-mail: Stephen.danczak@monash.edu

Received 19th December 2016 , Accepted 13th February 2017

First published on 13th February 2017

Good critical thinking is important to the development of students and a valued skill in commercial markets and wider society. There has been much discussion regarding the definition of critical thinking and how it is best taught in higher education, generally between philosophers, cognitive psychologists and education researchers. This study examined the perceptions of critical thinking held by 470 chemistry students from an Australian university, 106 chemistry teaching staff and 43 employers of chemistry graduates. An open-ended questionnaire was administered to these groups, qualitatively analysed and subsequently quantified. When asked to define critical thinking, respondents identified themes such as ‘analysis’, ‘critique’, ‘objectivity’, ‘problem solving’, ‘evaluate’ and ‘identification of opportunities and problems’. Student respondents described the smallest number of themes whereas employers described the largest. When asked where critical thinking was developed during the study of chemistry, students overwhelmingly described practical environments and themes around inquiry-based learning, whereas teaching staff commonly identified critiques, research, projects and, to some extent, practical environments. This research highlights that there is only a limited shared understanding of the definition of critical thinking and of where it is developed in the study of chemistry. The findings in this article will be of interest to higher education teaching practitioners in science and chemistry, those interested in the development of graduate attributes and higher order cognitive skills (HOCS), and those interested in student and employer perspectives.


Introduction

The development of critical thinking is a long-standing goal of education at all levels (primary, secondary and tertiary) and a virtue valued by wider society. The importance of critical thinking is commonly referred to in literature discussions of employability and transferable skills (Ghulam and David, 1999; Leggett et al., 2004; Sarkar et al., 2016). In Australia, the Group of Eight universities list critical thinking and reasoning among the attributes of their science graduates (Australian National University, 2015; Monash University, 2015; The University of Adelaide, 2015; The University of Melbourne, 2015). The need for continued improvement in graduate employees' higher order thinking can be seen in national initiatives (for example in the USA, UK, Canada and Australia) developing minimum learning requirements referred to as threshold learning outcomes (Wilson et al., 1997; Pithers and Soden, 2000; Tapper, 2004; Jones et al., 2011).

As the need for innovation, and for anticipating and leading change, continues to grow, employers recognise the importance of critical thinking and critical reflection (Desai et al., 2016). It has become an expectation that graduates are able to demonstrate a range of transferable skills such as critical thinking (Lowden et al., 2011). In a survey of 400 US employers, 92% of respondents rated critical thinking as ‘important’ or ‘very important’ in an undergraduate degree, and it was the fifth most applied skill in the workplace (Jackson, 2010a).

A recent study commissioned by the Office of the Chief Scientist of Australia surveyed 1065 employers representing a range of industries (Prinsley and Baranyai, 2015). Over 80% of respondents indicated critical thinking as ‘important’ or ‘very important’ as a skill or attribute in the workplace. Critical thinking was considered the second most important skill or attribute behind active learning. In 2012, Graduate Careers Australia found that of the 45% of chemistry graduates available for full-time or part-time employment, only 66% had obtained employment in a chemistry related field (Graduate Careers Australia, 2015). These findings suggest that skills which may transfer to a range of employment settings, such as critical thinking, are worth developing at the tertiary level.

The definition of critical thinking

The definition of critical thinking is frequently discussed in the literature, particularly among philosophers, psychologists and education researchers. From a philosophical perspective a comprehensive dialogue regarding critical thinking emerged in the form of the Delphi report (Facione, 1990). This report summarised a year-long discussion between 47 academics from philosophy, education, social sciences and physical sciences. They arrived at a general consensus that critical thinking is ‘purposeful, self-regulatory judgement which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgement is based’ (Facione, 1990, p. 2).

The report concluded that a person who exhibits good critical thinking is in possession of a series of cognitive skills and dispositions. The consensus of the Delphi experts was that a good critical thinker is proficient in the skills of interpretation, analysis, evaluation, inference, explanation and self-regulation (Facione, 1990). Furthermore, the report stated that a good critical thinker demonstrates a series of dispositions which are required for the individual to utilise the aforementioned skills. According to the report a ‘good critical thinker, is habitually disposed to engage in, and to encourage others to engage in, critical judgement’ (Facione, 1990, p. 12). These dispositions were later categorised into inquisitiveness, open-mindedness, systematicity, analyticity, truth seeking, critical thinking self-confidence and maturity (Facione, 1990).

Cognitive psychology and education research take a more evidence-based approach to defining critical thinking and the skills and dispositions it encompasses. The term critical thinking itself is often used to describe a set of cognitive skills, strategies or behaviours that increase the likelihood of a desired outcome (Halpern, 1996; Tiruneh et al., 2014). Dressel and Mayhew (1954) suggested it is educationally useful to define critical thinking as the sum of specific behaviours which can be observed in student acts: identifying central issues, recognising underlying assumptions, evaluating evidence or authority, and drawing warranted conclusions.

Psychologists typically explored and defined critical thinking via a series of reasoning schemas: conditional reasoning, statistical reasoning, methodological reasoning and verbal reasoning (Nisbett et al., 1987; Lehman and Nisbett, 1990). Halpern (1993) refined the cognitive psychologists' definition of critical thinking as the thinking required to solve problems, formulate inferences, calculate likelihoods and make decisions, and listed a series of skills and dispositions required for good critical thought. The skills are verbal reasoning, argument analysis, thinking as hypothesis testing, understanding and applying likelihood, uncertainty and probability, decision making and problem solving (Halpern, 1998). The dispositions are a willingness to engage and persist with complex tasks, habitual planning and resistance to impulsive actions, flexibility or open-mindedness, a willingness to self-correct and abandon non-productive strategies, and an awareness of the social context in which thoughts become actions (Halpern, 1998). Glaser (1984) further elaborated on the awareness of context to suggest that critical thinking requires proficiency in metacognition.

In science education there is often an emphasis on critical thinking as a skill set (Bailin, 2002). There are concerns that, from a pedagogical perspective, many of the skills or processes commonly ascribed to critical thinking are difficult to observe and therefore difficult to assess. Consequently, Bailin suggests that the concept of critical thinking should explicitly focus on adherence to criteria and standards to reflect ‘good’ critical thinking (Bailin, 2002, p. 368).

Recent literature has lent support to the notion that there are several useful and equally valid definitions of critical thinking (Moore, 2013). This work identified themes such as ‘critical thinking: as judgement; as scepticism; as originality; as sensitive reading; or as rationality.’ The emphasis placed on these themes depended on the teaching practitioner's context.

Can critical thinking be taught?

How critical thinking is taught, and the extent to which critical thinking skills may be transferable between disciplines, is a highly contentious issue. The teaching of critical thinking dates back to ancient Greece, to the philosophies of Socrates and Plato (Mann, 1979). Studies in the early to mid-twentieth century suggested that students are only able to think critically within their specialised discipline (Thorndike and Woodworth, 1901a, 1901b, 1901c; Inhelder and Piaget, 1958; Wason, 1966) and, therefore, that teaching critical thinking in an abstract environment provides no educational benefit to the student's capacity to think critically (McPeck, 1981).

In later years, cognitive psychology lent evidence to the argument that critical thinking could be developed within a specific discipline and that those reasoning skills were, at least to some degree, transferable to situations encountered in daily life (Lehman et al., 1988; Lehman and Nisbett, 1990). This led to a more pragmatic view that the best critical thinking occurs within one's area of expertise, termed domain specificity (Ennis, 1990), but that critical thinking can still be effectively developed with or without content-specific knowledge (McMillan, 1987; Ennis, 1989). However, the extent to which the development of critical thinking depends on content-specific knowledge continues to be debated (Moore, 2011; Davies, 2013).

Attempts to teach critical thinking are common in the chemistry education literature. These range from writing exercises (Oliver-Hoyo, 2003; Martineau and Boisvert, 2011; Stephenson and Sadler-Mcknight, 2016), inquiry-based projects (Gupta et al., 2015), flipped lectures (Flynn, 2011) and open-ended practicals (Klein and Carney, 2014) to gamification (Henderson, 2010) and work integrated learning (WIL) (Edwards et al., 2015). While this literature shows that critical thinking is being developed, it seldom discusses the perceptions of the students.

This study aimed to identify the perceptions of critical thinking of chemistry students, teaching staff and employers. The study investigated how each of these groups define critical thinking and where students and teaching staff believed critical thinking was developed during the study of chemistry.


Method

The research aims were achieved via qualitative analysis of open-ended questionnaire data collected in either paper or digital formats.

Data collection instrument

An open-ended questionnaire was designed and administered to all three year levels of Monash University undergraduate chemistry students in 2015. The questionnaire contained questions regarding demographic data (age and gender) and two open-ended fields asking ‘What does the term “Critical Thinking” mean to you?’ (Q1) and ‘Can you provide an example of when you have had the opportunity to develop your critical thinking while studying chemistry?’ (Q2a). All participants were informed that their participation was voluntary and anonymous and would in no way affect their academic records. Participants were provided with an explanatory statement outlining these terms and all procedures were in accordance with Monash University Human Research Ethics Committee (MUHREC) regulations (project number CF15/560 – 2015000258).

A similar questionnaire was administered in hard copy to the teaching associates (TAs) and academics within the School of Chemistry at Monash University and via an online format to a different cohort from a range of institutions. The questionnaire consisted of items asking participants to identify teaching activities undertaken within the previous year, and at which year levels they taught these activities. They were asked open-ended questions which aligned with the student questionnaire: ‘What does the term “Critical Thinking” mean to you?’ (Q1) and ‘Can you provide an example of when you have provided students with the opportunity to develop their critical thinking while studying chemistry?’ (Q2b).

Employers were contacted directly via email and provided with a link to an online questionnaire. The questionnaire consisted of one open-ended question, ‘What does the term “Critical Thinking” mean to you?’ (Q1), and three demographic questions regarding the country in which the participant's organisation was based, the sector in which their business operated and the highest qualification the participant held.

Student participants

The first year cohort was drawn from two units, Chemistry I and Advanced Chemistry I, which together enrolled approximately 1000 students. Chemistry I is a general chemistry course with a mixed cohort of science students (880 in 2015), some of whom had completed chemistry in high school and some of whom had studied science but not chemistry. Advanced Chemistry I consisted only of students who had completed chemistry in high school and covers the same content as Chemistry I with additional time in laboratory sessions. All first year participants were provided with the questionnaire at the conclusion of a compulsory laboratory safety induction session during orientation in the first week of semester one. This data was considered a representative random sample of first year chemistry students as the induction session was a prerequisite for all students commencing study in the first year chemistry laboratory.

The second year cohort consisted of 359 students from Synthetic Chemistry I, a course focused on organic and inorganic synthetic techniques from practical and theoretical perspectives. This course is a core unit for any student pursuing a chemistry major. Participants were provided with the questionnaire at the end of a practical session during the first two weeks of semester one. The practical activity conducted within this time was known to typically take students only three of the four hours allocated to the session. As the activity was a compulsory part of this core unit, the cohort could be considered a representative random sample of second year chemistry major students.

Finally, the third year cohort was drawn from 84 students studying Advanced Inorganic Chemistry. This course builds on the theoretical knowledge and practical skills developed in Synthetic Chemistry I, focusing specifically on inorganic chemistry. Students completing a chemistry major typically undertook this unit, but alternative courses were available. Participants were provided with the questionnaire during practical sessions in the first four weeks of semester and encouraged to complete it during the session. Since the activities in these sessions were very demanding and time was generally scarce, the sampling was regarded as convenient. Furthermore, as not all chemistry majors undertook Advanced Inorganic Chemistry, the data obtained from this cohort may be non-representative.

Teaching staff participants

Forty-seven TAs from Chemistry I and Advanced Chemistry I were provided with the questionnaire at the conclusion of a compulsory laboratory safety induction during orientation of semester one. As such, the data obtained from this cohort could be considered a representative random sample of TAs who taught at first year undergraduate level.

The senior TA and academic cohort consisted of academic staff and TAs with several years of teaching experience, who typically taught chemistry courses other than Chemistry I or Advanced Chemistry I. Twelve individuals were approached during semester one of 2015 and asked to return the questionnaire via unlabelled internal mail.

Finally, an online academic cohort consisted of around 300 members of a chemistry education email discussion group, predominantly from the UK and Europe. These participants received a link to an online version of the questionnaire sent via a third party.

All TAs and academic staff were advised that their participation was voluntary and that they could opt out by not completing the questionnaire, in accordance with MUHREC regulations. All senior TAs, academics and online academics were known to highly value the scholarship of teaching, increasing the likelihood of their participation. Consequently, this would be considered a non-representative, convenience sample of experienced teaching staff.

Employer participants

Participants were drawn from a list of respondents known to have previously participated in similar qualitative research (Sarkar et al., 2016). Over 200 employers on this list were contacted, but the response rate was low (21%). The data from this cohort was a convenience sample, non-representative of all employers of chemistry graduates. All participants were informed that their participation was voluntary and that they could opt out by not completing the questionnaire, in accordance with MUHREC regulations.

Research theoretical framework

With respect to the open-ended questions posed to the various groups, a realist world view was adopted (Edmunds and Brown, 2013). The philosophical framework informing this work was constructivism (Matthews, 1993), which postulates that individuals construct their meaning of concepts such as critical thinking from their experiences and interactions with the world around them (Lemanski and Overton, 2011). The underpinning assumption was that there may be many truths regarding critical thinking which may be gleaned from data collected in a qualitative manner.

The data was analysed qualitatively, with no prior assumptions regarding the number of ways in which individuals may think about critical thinking. The qualitative analysis was then quantified to identify whether there were any common ways in which individuals experienced critical thinking. The nature of these commonalities was not assumed; however, a retrospective comparison with the literature informed the inferences drawn from the data.

Data analysis

The responses from 470 undergraduate, 40 first year TA, 12 senior TA and academic, 55 online academic and 43 employer questionnaires were transcribed verbatim. This data was then imported into the qualitative analysis tool NVivo version 10.

The questionnaire data for each cohort was imported into NVivo as seven separate ‘sources’: first year students (A), second year students (B), third year students (C), TAs (D), senior TAs and academics (E), online academics (F) and employers (G). These cohorts were then merged into three major groups: students (A, B and C), teaching staff (D, E and F) and employers (G).

Six chemistry education researchers working within the Chemistry Education Research Group (CERG) at Monash University were provided with a random selection of 10% of all responses to Q1 and Q2a/Q2b. They were asked to identify key words suggesting emergent themes in each question, and from these emergent themes ‘codes’ were generated by the primary researcher for participants' responses (Bryman and Burgess, 1994). After this initial review, the responses were studied in greater detail to determine whether there were any hidden themes that the initial analysis had failed to identify. A third review of the emergent themes within each question was then conducted and, using a redundancy approach, similar themes were combined. This resulted in 21 unique themes for Q1 and 19 unique themes for Q2a/Q2b to be used in coding all responses.

The data from the emergent themes of each question was then analysed quantitatively. The total number of responses within each theme per group was determined using NVivo's ‘Matrix Coding’ function and exported to Microsoft Excel. The number of participants describing a specific theme within each group was then expressed as a percentage: the number of responses coded to a theme within a group divided by the total number of participants from that group who answered the given question. These percentages were then presented graphically.
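The percentage calculation can be sketched in a few lines. The theme counts below are hypothetical placeholders for the Matrix Coding export, not the study's data; only the respondent totals (470 students, 106 teaching staff) come from the text.

```python
# Sketch of the percentage calculation: responses coded to a theme within a
# group, divided by the number of participants in that group who answered the
# question. Theme counts here are illustrative placeholders.

theme_counts = {
    "students": {"analysis": 120, "problem solving": 95},
    "teaching staff": {"analysis": 45, "problem solving": 30},
}
respondents = {"students": 470, "teaching staff": 106}  # answered Q1


def theme_percentages(counts, totals):
    """Express each theme's response count as a percentage of the group's
    respondents for that question."""
    return {
        group: {
            theme: round(100 * n / totals[group], 1)
            for theme, n in themes.items()
        }
        for group, themes in counts.items()
    }


print(theme_percentages(theme_counts, respondents))
```

With the placeholder counts above, 120 of 470 students mentioning ‘analysis’ would be reported as 25.5%.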


Results

A total of 470 students, 106 teaching staff and 43 employers responded to the question ‘What does the term “Critical Thinking” mean to you?’ (Q1). 410 of these students and 86 of the teaching staff also responded to the question ‘Can you provide an example of when you have had the opportunity to develop your critical thinking while studying chemistry?’ (Q2a) or, in the case of the teaching staff, ‘Can you provide an example of when you have provided students with the opportunity to develop their critical thinking while studying chemistry?’ (Q2b).

Table 1 shows the gender distribution and median age of students who chose to provide this data. There was a slightly larger proportion of male students, by 12 percentage points. The median age of students was 19 years, which is typical of first or second year Australian undergraduate university students.

Table 1 Demographic data of all undergraduate student participants
Student cohort Gender with which the students identify Median student age
Male Female
1st year students 53% (n = 151) 47% (n = 132) 18 (n = 216)
2nd year students 59% (n = 107) 41% (n = 73) 19 (n = 129)
3rd year students 57% (n = 13) 43% (n = 10) 20 (n = 18)
All undergraduates 56% (n = 271) 44% (n = 215) 19 (n = 363)

Table 2 shows the teaching activities and year levels taught by the various cohorts within the teaching staff group. Respondents were able to select multiple teaching activities and year levels taught. The TA cohort typically taught first year laboratory sessions whereas senior TAs and academics all taught at various year levels via laboratory, tutorial and lecture activities.

Table 2 Teaching activities and year levels taught by respondents
Teaching staff cohort
TAs Senior TAs/academics Online academic
Total participants per cohort: TAs (n = 40), senior TAs/academics (n = 12), online academics (n = 54).
Teaching activity Laboratory n = 30 n = 9 n = 46
Tutorial n = 2 n = 8 n = 44
Lectures n = 0 n = 11 n = 50
Year levels taught No experience n = 12 n = 0 n = 0
1st year n = 30 n = 6 n = 48
2nd year n = 8 n = 10 n = 44
3rd year n = 1 n = 10 n = 44
Hons/M/PhD n = 0 n = 3 n = 48

Table 3 provides the demographic data for employers. The respondents' main offices were predominantly in Australia and the respondents themselves generally held a tertiary level qualification, with 40% holding a PhD. The most common sector in which respondents worked was chemical, pharmaceutical or petrochemicals (16%). There was also reasonable representation of respondents from science or life sciences (14%), development, innovation or manufacturing (12%) and government (12%).

Table 3 Demographic data of employer participants: country of main office, industry sector and highest qualification held
Country Sector Qualification
a These employers identified multiple sectors and were thus coded according to both themes. b Chemical, pharmaceuticals or petrochemicals. c Development, innovation or manufacturing. d Science or life-science. e Health, medical or pathology. f Environment or conservation. g Fast moving consumer goods.
Australia 72% (n = 31) Chemicalb 16% (n = 7) PhD 40% (n = 17)
UK 26% (n = 11) Developmentc 12% (n = 5) Masters 21% (n = 9)
Belgium 2% (n = 1) Scienced 14% (n = 6) Grad. dip. 5% (n = 2)
Government 12% (n = 5) Post-grad cert. 2% (n = 1)
Healthe 9% (n = 4) Bachelors 30% (n = 13)
Environmentf 7% (n = 2) High school 2% (n = 1)
FMCGg 7% (n = 2)
Mining 5% (n = 2)
Consulting 5% (n = 2)
Education 5% (n = 2)
Chemical and Developmenta 5% (n = 2)
Chemical and FMCGa 2% (n = 1)
Government and Environmenta 2% (n = 1)
Other 5% (n = 2)

The 21 themes generated in response to the question: ‘What does the term “Critical Thinking” mean to you?’ (Q1) can be found in Table 4 along with a definition and brief quote to illustrate the meaning attributed to these themes. The quantitative analysis found in Fig. 1 describes the frequency with which each of these themes was expressed by students, teaching staff and employers.

Table 4 Themes emerging in responses to Q1
Theme Definition Example
Analysis Information, data or evidence analysed or broken down. “Ability to unpack complex situations…”
Application of knowledge What is known or learnt is applied in some way. “Evaluate…from first principles and personal knowledge…”
Arriving at an outcome The end product of critical thinking. E.g. conclusion, argument or course of action. “…form a valid, informed opinion.”

“…an appropriate solution.”

Context (macro) Implication of an outcome with much greater boundaries at an organisational or societal level. “Ethical and economical solution.”

“Outside aspects and factors…”

Creative ‘Creative thinking’ or discussed innovation. “…imaginative generation of ideas.”
Critique Identify assumptions, reasoning, arguments or presumed facts and determining credibility, validity and reliability. “…question the concepts…”

“…challenging the evidence…”

Decision making Used in ‘making a decision’ or for example ‘arriving at a decision’. “…make an informed decision…”
Evaluate Attributing value to a stimulus. Appraising, determining value or identifying meaning. “Reflecting on the meaning…”

“…filtering of that info…”

Identification of opportunities and problems Appropriate questioning to understand a problem. Identification of potential issues or opportunities. “identify where intervention will have the most impact”
Interpretation of information Engaging with a stimulus and understanding that information. “…interpreting the data…”

“…understand concepts…”

Lateral thinking Use of the term ‘lateral thought’ and ‘out of the box’. “… (Thinking) in an abstract manner.”
Logical approach Application of a logic, reasonable or rational thought process. “…reasoned judgements…”

“…finding a rational truth…”

Objectivity Taking an unbiased approach. Sceptical or open minded. “…consider various points of view…”
Problem solving Problem and/or something that needs to be resolved. “…work through a problem…”
Productivity Thinking which in some way has a constructive use, e.g. efficient. “…where intervention will have the most impact…”
Reflection Metacognitive processes of ‘why am I thinking what I'm thinking?’ “Thinking about your thinking”
Research Collection of (experimental) data, evidence or information. “…gathering information or data…”
Systematic approach How thoughts are organised. Order of operations. “…arrange it (information) in a way that it informs outcomes.”
Testing Exploring and testing knowledge, evidence, claims or arguments. “…draw conclusions based on hypothesis testing.”
Under pressure Time constraint or when stakes are high. “…under pressure situations…”
Understanding the local context An action or opinion is required and has some sort of impact. “…what must be done in a situation…”

Fig. 1 Percentage of responses describing a given theme by cohort in response to Q1.

It is important to note that a single response may be coded to multiple themes or, in some instances, to none at all. Table 5 provides a breakdown of how many responses contain a given number of themes. For example, 87 responses from the first year cohort contain only a single theme whereas 11 responses from employers contain three themes. The mean number of themes per response, or coding density, was determined for each cohort and each group. Students described a mean of 1.73 themes per response, teaching staff a mean of 2.75 and employers a mean of 3.98.

Table 5 Number of responses describing a given number of themes in response to Q1
No. themes 0a 1 2 3 4 5 6 7 8 9 Mean
Cohort/group Number of responses which described the above number of themes
a Responses coded to zero themes either did not make sense or could not be given meaning without further investigation (see Discussion).
1st year 30 87 90 37 19 2 1 1 1.79
2nd year 32 55 51 28 12 1 1.64
3rd year 5 6 7 3 3 1.71
Students 67 148 148 68 34 3 1 1 1.73
TAs 5 10 15 7 3 1.83
Sen TAs/academics 1 2 4 5 3.00
Online 4 12 13 17 6 2 3.28
Teaching staff 6 11 26 21 24 6 2 2.75
Employers 1 8 11 8 7 4 3 1 3.98
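The coding densities above follow directly from the theme-count distributions; a minimal sketch, using the first year row of Table 5, is:

```python
# Coding density: mean number of themes per response, computed from a
# distribution mapping number of themes -> number of responses.
# The counts below are the first year row of Table 5.

first_year = {0: 30, 1: 87, 2: 90, 3: 37, 4: 19, 5: 2, 6: 1, 7: 1}


def coding_density(distribution):
    """Weighted mean: sum(themes * responses) / total responses."""
    total = sum(distribution.values())
    weighted = sum(n_themes * n_resp for n_themes, n_resp in distribution.items())
    return weighted / total


print(round(coding_density(first_year), 2))  # 1.79, matching Table 5
```

The same calculation over the pooled student, teaching staff and employer rows yields the group means of 1.73, 2.75 and 3.98 reported above.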

In response to the questions ‘Can you provide an example of when you have had the opportunity to develop your critical thinking while studying chemistry?’ (Q2a) and ‘Can you provide an example of when you have provided students with the opportunity to develop their critical thinking while studying chemistry?’ (Q2b), 19 themes were generated. Table 6 contains these themes, their definitions and brief excerpts to convey the meaning attributed to them. The quantitative analysis found in Fig. 2 describes the frequency with which each of these themes was expressed in student and teaching staff responses.

Table 6 Themes emerging from responses to Q2a/Q2b
Theme Definition Example
Algorithmic problem solving All the data is provided and the solutions are known. “Solving chemical equations.”
Application of knowledge Use of knowledge, usually developed within a course. “…fundamental knowledge…”
Assessing knowledge Formative or summative feedback assessments. “Weekly tests & semester exams.”
Creating an argument Generate hypothesis, opinion, argument or conclusion. “…justify their (students) choices…”
Critiquing Decide quality of experimental data, method or argument. “…assess aspects of experimental design.”
Developing knowledge Developing specific content knowledge. “…perform lab task & understand the theory…”
Discussion method Engaged in dialogue with students, TAs or academics. “…what would happen if…?”
Engaging with experimental data Engaging with data generated in the lab or from research. “…did not achieve expected results…”
Experimental design Developing an experimental method in a laboratory setting. “…create their (students) own experiments.”
Inquiry based learning Often described as an ‘IDEA prac’. “…we had to come up with a method for a prac”
Leading questions Thought processes guided by open-ended questions. “…prompt them (students) with questions …”
Lecture environment Activity taking place in a lecture or as part of a lecture. “Reading materials before and after lectures…”
Open-ended problem solving Not all the data provided, ill defined or unknown solution. “…direct answers may not be able to be found…”
Practical environment Activities taking place in/or as result of a laboratory. “…making paracetamol…”
Project work Project which occurred over an extended time period. “…problem to solve for the semester…”
Research Use of the term ‘research’ or indicating research “…research projects…”
Safety Safe procedures in a laboratory environment. “…handling dangerous chemicals.”
Testing Experimentation or test of a hypothesis. “Not just assuming our hypothesis is right.”
Tutorial environment ‘Tutorials’, ‘tutes’ or ‘tutorial questions’. “…through discussions in tutorial.”
Writing Lab reports, essays or literature reviews. “Writing lab reports with discussion (section).”

Fig. 2 Percentage of responses describing a given theme by cohort in response to Q2a/Q2b.

Once again, a single response could be coded to multiple themes or to none at all. Table 7 shows how many responses contained a given number of themes. For example, 108 first year responses were coded to a single theme compared to only two Senior TA/Academic responses. Students described an average of 1.32 themes per response and teaching staff an average of 2.25 themes per response.

Table 7 Number of responses describing a given number of themes for Q2a/Q2b
No. themes 0a 1 2 3 4 5 6 Mean
Cohort/group Number of responses which described the above number of themes
a Responses were coded to zero themes because they either did not make sense, could not be given meaning without further investigation, or stated that the respondent had either not studied or taught chemistry before (see Discussion).
1st year 53 108 42 10 1 1.06
2nd year 7 70 89 7 1 1.57
3rd year 0 6 4 2 3 2.13
Students 60 184 135 19 5 1.32
TAs 3 9 11 3 1.54
Sen TAs/acad. 0 2 5 2 2 2.36
Online acad. 0 14 11 16 6 4 1 2.58
Teaching staff 3 25 27 21 8 4 1 2.25
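The cohort means in the final column of Table 7 can be reproduced directly from the counts of responses. A minimal sketch in Python (the counts are transcribed from the table rows; zero-theme responses are included in the denominator, consistent with footnote a):

```python
def mean_themes(counts):
    """Weighted mean number of themes per response.

    counts[i] is the number of responses coded to i themes;
    responses coded to zero themes still count toward the
    denominator, as in Table 7.
    """
    total_responses = sum(counts)
    total_themes = sum(i * c for i, c in enumerate(counts))
    return total_themes / total_responses

# Counts transcribed from Table 7 (themes 0, 1, 2, ...)
students = [60, 184, 135, 19, 5]
teaching_staff = [3, 25, 27, 21, 8, 4, 1]

print(round(mean_themes(students), 2))        # 1.32
print(round(mean_themes(teaching_staff), 2))  # 2.25
```

Applying the same calculation to each cohort row recovers the remaining means (e.g. 1.06 for first years, 2.58 for online academics).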


Discussion

This study aimed to collect chemistry undergraduate students', teaching staff's and employers' views on their definitions of critical thinking and, in the case of the students and teaching staff, where they believed critical thinking was developed when studying chemistry at university. Many clear patterns emerged from the qualitative analysis. However, the representation and limitations of the data set must be considered before making any generalisations.

Data representation and limitations

All student questionnaires were conducted in a laboratory environment, whereas teaching staff completed the questionnaire in a number of environments: in the laboratory, in their office or online. This may have impacted the number of themes per response, or coding density, of each cohort for a given question. Table 5 illustrates that when responding to Q1 students typically described at least one theme, whereas teaching staff and employers on average described at least three themes. This lower coding density may be due to maturity, or to students having had too little exposure to critical thinking activities to be able to articulate what critical thinking is. Alternatively, since students typically received this questionnaire at the end of a two to three hour laboratory activity, they may have been less inclined to write lengthy responses.

A similar pattern of coding density can be observed between TAs (D) and Senior TAs and Academics (E), Online Academics (F) and Employers (G). It would appear that those participants who were approached directly or online made a concerted effort to respond to the questions, as can be seen in Tables 5 and 7, where at least three themes were typically described by cohorts E, F and G. Again it is worth considering the experience that cohort D has with critical thinking. The majority of this cohort were on semester-long contracts and only had teaching experience in a first year laboratory environment (Table 2). It is possible these participants do not exercise their critical thinking skills as frequently as academics, who routinely engage in activities, such as peer reviewing journal submissions, that exercise these skills. This aligns with the constructivist notion that an individual creates their meaning of a given construct from their environment (Lemanski and Overton, 2011) and, in this research, from how participants believe that construct is applied in their daily lives.

With respect to demographic data, there was a slightly larger representation of students identifying as male compared to female. This was observed in all student cohorts; however, it is important to note that there was a slightly larger number of female students enrolled in chemistry at Monash University than male students. As can be seen from Table 1, the median age for students was nineteen years old. This value was skewed slightly as a result of the large numbers of respondents from the first and second year cohorts.

Larger samples of first and second year students and first year TAs were obtained due to the environments in which the questionnaire was conducted (namely compulsory laboratory sessions). Aside from the slightly larger number of male student respondents, there can be some confidence that the data obtained represent a random sample of the respective cohorts and that the findings may be generalisable.

Obtaining data from senior TAs, academics and employers was far more difficult and consequently the data collected was more reflective of non-representative convenience sampling. Therefore, the findings herein may have limited generalisability with respect to senior TAs, academics and employers.

Defining critical thinking

As can be seen from Table 5, over sixty responses were attributed no themes, for one of two reasons. Some respondents attempted to demonstrate their wit rather than their understanding of the term critical thinking, with responses such as “means a lot”, “not annoying your TA” or “thinking critically”. More commonly, responses were attributed no themes because they defined critical thinking as “thinking on a much deeper level” or “thinking in a complex manner”. The meaning of these responses was unclear and difficult to code. Whilst it could be argued that many themes, such as ‘analysis’, would benefit from probing via focus groups (Edmunds and Brown, 2013, pp. 22–24), this is particularly true of terms such as ‘deep thinking’. Consequently, these responses were not coded.

The theme ‘analysis’ was frequently expressed by all groups (students, teaching staff and employers). At least 20% of all responses identified analysis as part of the meaning of critical thinking. In the case of the student group it was, in fact, the most common theme, with just over 25% of respondents using it to define critical thinking. The term analysis or analysing was commonly used to describe interaction with some sort of intellectual stimulus, whether an idea, data or a problem. Many responses referred to ‘analysing something’, suggesting a breadth of critical thinking.

Students strongly identified with three other themes: ‘critique’, ‘objectivity’ and ‘problem solving’. Problem solving was the second most commonly expressed theme by student respondents, with just over 23% of responses describing it. The link between critical thinking and problem solving appears to be a common association made by students (Tapper, 2004). Critique and objectivity were each identified in approximately 17% of responses. The relatively smaller number of themes described by students is not altogether surprising, as other qualitative studies have shown students often have difficulty conceptualising critical thinking (Duro et al., 2013).

Teaching staff most commonly described the themes ‘critique’ (40%) and ‘evaluate’ (42%) when defining critical thinking. A similar emphasis on interpreting information via analysis and evaluation has been observed in other recent studies (Duro et al., 2013; Desai et al., 2016). Teaching staff were much more goal orientated than students, with 28% of responses describing ‘arriving at an outcome’. Outcomes were very task orientated, akin to Barnett's (1997) ‘critical being’: either developing a plan relating to experimental design or arriving at a conclusion as a result of experimental data. For example:

“The ability to examine evidence, come to a conclusion based on that evidence…”

Teaching staff also commonly described the themes ‘application of knowledge’, ‘logical approach’, ‘objectivity’ and ‘problem solving’, each in approximately 20% of responses. It is worth noting that students and teaching staff expressed the theme of ‘objectivity’ with similar frequencies (18% and 19%, respectively). Of all three groups, teaching staff used the theme of problem solving the least when defining critical thinking (18%). While only 14% of teaching staff respondents described the theme of ‘interpreting information’, this was still more frequent than in the student (11%) and employer (9%) groups.

As can be seen from Table 5, employers typically described the largest number of themes in their responses. ‘Problem solving’ was the most common theme, expressed by over 44% of employers. Employers were goal orientated, much like teaching staff, commonly describing themes of ‘application of knowledge’ (19%), ‘objectivity’ (30%), ‘logical approach’ (21%), ‘evaluate’ (30%) and ‘arriving at an outcome’ (33%). Arriving at an outcome encompassed a wide breadth of examples in employer responses. However, there was some focus on using evidence to inform a conclusion which would lead to a course of action for the organisation to take:

“…a necessary approach to solving or answering problems, developing a product or process.”

Employers expressed four themes unique to their group: ‘context (macro)’ (12%), ‘creative’ (19%), ‘systematic approach’ (21%) and ‘identification of opportunities and problems’ (35%). The latter focused on the use of critical thinking as a method of uncovering what is not immediately apparent:

“To consider the problem to expose route cause(s) in a rationale and logical manner and apply lateral thinking to seek solutions to the problem.”

The above response also includes in its definition of critical thinking:

“The ability of a person to identify a problem that does not have a readily available or off the shelf solution.”

This is an excellent example of responses identifying creativity in conjunction with the theme of problem identification. The general sentiment of employers was that critical thinking is important to innovation within the organisation, and is suggestive of what Jackson (2010b) refers to as ‘Pro-c creativity’, the creativity associated with a professional environment.

Furthermore, employers were unique in describing critical thinking with the theme of ‘context (macro)’. This theme reflects employers' recognition that critical thinking applies on a much broader social scale. For example:

“…understand the implications from an organisational perspective.”

“…collaborating the thoughts and views of others to gain a clearer insight of the real challenge.”

Employers acknowledged that the results of critical thinking can have an impact in commercial and societal contexts. While students and teaching staff have a somewhat more internalised definition of critical thinking, employers appear to apply critical thinking more socially, as seen in some of the literature (Desai et al., 2016).

One of the most interesting features of this data was that the terms ‘judgement’ and ‘inference’, found in the Delphi definition of critical thinking (Facione, 1990), were seldom used by respondents. In fact, below are the only two student responses to use the term ‘judgement’:

“Not taking things at face value and giving topics considerable thought and analysis before coming to a conclusion/judgement on it.” – First year respondent

“Analysis of a problem to make a judgement.” – Second year respondent

It is worth noting that a similarly small minority of respondents used the term ‘opinion’ in their definition of critical thinking:

“Ability to objectively analyse, process and form an opinion of a particular subject.”

And a slightly larger number of respondents used the term ‘conclusion’:

“A skill to understand a thing more clearly and make conclusion.”

When the Delphi report describes core critical thinking skills, the terms ‘judgement’ and ‘opinion’ are used somewhat synonymously. Similarly, ‘drawing conclusions’ is explicitly stated as a sub-skill of ‘inference’ (Facione, 1990, p. 10). This suggests that the larger number of respondents using ‘opinion’ or ‘conclusion’ may in fact have been referencing ‘judgement’ or ‘inference’. However, without further probing of what respondents meant by ‘conclusion’ or ‘opinion’, this is not a certainty.

There is also very little emphasis on self-regulation or the metacognitive processes typically associated with ‘good’ critical thinking (Glaser, 1984; Bailin, 2002). Perhaps this is implied when respondents described the theme of ‘objectivity’:

“Thinking about situations with an open view point and analysing what you're doing.”

What is very clear from this data is the emphasis on problem solving in the definition of critical thinking, a very prominent feature of the student and employer data. With respect to the students, this may be due to the perception that scientific facts are unquestionable and to the algorithmic problem solving pedagogies commonly employed in science education (Zielinski, 2004; DeWit, 2006; Cloonan and Hutchinson, 2011). The theme was slightly less common among teaching staff. Its prominence among employers might be due to the fact that employers are typically adept at reflecting on open-ended problems and identifying any parameters or approximations required (Randles and Overton, 2015). This experience with open-ended problems may also explain the description of the theme of ‘identification of problems and opportunities’, which was unique to employers.

Interestingly, the Delphi report does not consider problem solving an element of critical thinking; instead it proposes that problem solving and critical thinking are ‘closely related forms of higher-order thinking’ (Facione, 1990, p. 5). Similarly, Halpern suggests that certain behaviours are associated with critical thinking or problem solving, but that these higher order cognitive skills are not mutually exclusive (Halpern, 1996, pp. 317–363). This cognitive psychology view is more reflective of the data that emerged from respondents in this study, which might otherwise be considered to contain misconceptions with respect to critical thinking.

Regardless of this interpretation, it would be interesting to ask students, teachers and professionals from other disciplines to define critical thinking. It is quite possible that a greater emphasis on judgement would occur in the humanities, commerce or the arts, and perhaps there would be less use of the theme of problem solving. For example, when a group of business academics was asked to describe which critical thinking skills were important to graduates entering the workforce within their discipline, 47% of responses described problem solving and 34% described analysis (Desai et al., 2016).

The other interesting feature of this data is the points of difference between groups and what these may be attributed to. For example, teaching staff emphasised the themes of ‘critique’ and ‘evaluate’. A common aspect of an academic's role is peer review and academic writing, so it is not surprising that these themes arise so frequently. Likewise, the frequency with which employers described themes around identification, innovation and context is reflective of a competitive commercial environment. Given the respondents' association between critical thinking and problem solving, these perceptions around evaluation and identifying problems could also reflect behaviours typical of expert open-ended problem solvers (Randles and Overton, 2015). Both employers and teaching staff have a goal oriented definition of critical thinking, which may be a product of maturity and/or their exposure to professional environments. Again, this may be an example of constructivism (Lemanski and Overton, 2011).

As can be seen in Table 8, all groups used themes around analysis, critiquing, objectivity and problem solving to define critical thinking. In addition, teaching staff and employers used themes relating to the application of knowledge, arriving at an outcome, evaluation and using a logical approach. Employers further expanded their definition to include themes regarding creativity, considering the broader context, taking a systematic approach and identifying opportunities and problems. These themes regarding the definition of critical thinking can be synthesised thus:

Table 8 Common themes emerging by group in responses to Q1
Theme Students (%) Teaching staff (%) Employers (%)
Expressed as increments of >10% for ease of readability and to highlight similarities and differences between groups.
Analysis >20 >20 >30
Application of knowledge >20 >10
Arriving at an outcome >20 >30
Context (macro) >10
Creative >10
Critique >10 >40 >20
Evaluation >40 >20
Identifying opportunities… >30
Logical approach >10 >20
Objectivity >10 >10 >20
Problem solving >20 >10 >40
Systematic approach >20

To analyse and critique objectively when solving a problem. – Students

To analyse, critique and evaluate through the logical and objective application of knowledge to arrive at an outcome when solving a problem. – Teaching staff

To analyse, critique and evaluate problems and opportunities through the logical, systematic, objective and creative application of knowledge so as to arrive at an outcome and recognise the large scale context in which these problems and opportunities occur. – Employers
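The ‘increments of >10%’ convention noted under Table 8 can be sketched as a small rounding rule. Here `band` is a hypothetical helper name, and flooring each frequency to the nearest multiple of ten is an assumption consistent with the frequencies quoted in the text (e.g. “just over 25%” for students' ‘analysis’ appears as >20):

```python
def band(percentage):
    # Hypothetical helper: express an exact theme frequency as a
    # ">N0" increment, as in Table 8's footnote. Assumes the table
    # floors each frequency to the nearest multiple of ten
    # (frequencies falling exactly on a multiple of ten do not
    # occur in the data quoted in the text).
    return f">{int(percentage // 10) * 10}"

print(band(25.3))  # students' 'analysis', "just over 25%"
print(band(44.0))  # employers' 'problem solving', "over 44%"
print(band(18.0))  # teaching staff 'problem solving', 18%
```

Under this rule the quoted frequencies map to the >20, >40 and >10 bands shown in Table 8.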

While there are some similarities between the definitions of critical thinking, it would be inaccurate to suggest that there is a shared definition. Furthermore, the depth to which critical thinking was defined appears to reflect the constructivist phenomenon. Employers most closely reflect definitions found in the literature (Facione, 1990; Halpern, 1996; Tiruneh et al., 2014). Employers appear to have a broader definition of critical thinking, and this may be related to the fact that employers work in very broad contexts and draw on a range of experiences, going beyond chemistry to deal with issues such as budgets, policies and human resources.

Where is critical thinking developed while studying chemistry at university?

Much like for Q1, some responses were not attributed to any themes (see Table 7); these were predominantly in the student group. While some respondents continued to demonstrate their aptitude for comedy rather than seriously engaging with the questionnaire, the majority of uncoded responses stated that the respondent had not previously studied chemistry. Similarly, ten TAs were just commencing their honours research year and had no teaching experience at the time of the questionnaire.

With respect to the teaching staff, the wording of the question they received must be considered to put the responses in context: ‘Can you provide an example of when you have provided students with the opportunity to develop their critical thinking while studying chemistry?’ (Q2b) This wording elicited responses drawn from the respondents' recent teaching activities, which may differ from where the respondent believes students develop their critical thinking most. For example, many TAs from cohort A only have practical experience to draw on, whereas cohorts B and C also have lecture and/or tutorial activities to base their responses on (Table 2). Conversely, some respondents from cohorts B and C only had lecture or tutorial experience to draw on.

When asked to provide an example of where they believed they developed their critical thinking while studying chemistry, 45% of students identified an activity relating to a practical environment. The second most common theme was ‘inquiry based learning’ (17%). Most interestingly, 36% of second year students and 14% of third year students specifically mentioned ‘IDEA pracs’. These practicals were guided inquiry activities the students performed as part of their first year laboratory program (Rayner et al., 2013). The fact that students identified these activities, in some cases after two years, demonstrates the effectiveness of inquiry-based learning in developing transferable skills such as critical thinking.

It is important to recognise that students did not identify activities that make the teaching of critical thinking explicit. Students in other studies identified courses on scientific communication as opportunities where critical thinking was explicitly taught (Tapper, 2004). Beyond these courses, much as for the students in the current study, the development of critical thinking became more implicit and students became dependent on feedback from writing activities (Tapper, 2004; Duro et al., 2013). It is clear from the literature that, without a deliberate effort to make critical thinking goals explicit in discipline specific courses, students find it difficult to conceptualise critical thinking and perceive it as an intuitive skill that develops over time (Tapper, 2004; Beachboard and Beachboard, 2010; Duro et al., 2013; Loes et al., 2015).

Teaching staff also identified practical environments (26%) as contexts in which they developed students' critical thinking. However, four additional themes were also prominent in their responses: ‘application of knowledge’ (21%), ‘critique’ (33%), ‘project work’ (21%) and ‘research’ (19%). These themes are reflective of activities described in recent literature designed to elicit higher order cognitive skills (Cowden and Santiago, 2016; Stephenson and Sadler-Mcknight, 2016; Toledo and Dubas, 2016). Critique activities ranged from critiquing experimental design to writing literature reviews:

“I may provide students with some experimental evidence and they need to evaluate whether these are consistent with specific mechanisms.”

“Choosing and researching a topic to conduct a literature review on. Writing a review to include critical appraisal of the information covered.”

“Research paper-based assessments in which students are asked to locate and extract information, analyse data and critically assess aspects of experimental design.”

“…paper analysis which requires use of many variables in understanding change factors and outcomes in reaction.”

The ‘application of knowledge’ most often described activities taking place in a lecture environment and, in some instances, in a practical environment. The themes of ‘project work’ and ‘research’ often described activities in practical environments. Many of these responses focused on final year research projects:

“Mainly this comes from the crucial role of the research project, generally in the final year of study when the student has had the opportunity to build up their knowledge base across a broad range of chemistry.”

The above statement would suggest that critical thinking can only be achieved with a solid foundation of discipline specific knowledge. While it holds true that an individual is a better critical thinker within their area of discipline specific knowledge (McPeck, 1981; Moore, 2011), it is not true that a large body of knowledge is a necessary prerequisite for developing critical thinking (Ennis, 1989; Davies, 2013).

According to this data, students and teaching staff show some limited agreement that critical thinking is developed in a practical environment. However, that is where the similarities end. Despite teaching staff believing that they develop critical thinking through the application of knowledge, this is not apparent to the students.

Implications for practice

The methods described by teaching staff reflect what Ennis (1989) described as the immersion approach, whereby subject matter is covered in great depth but the critical thinking goals are implicit. It would appear the more overt approaches suggested by Ennis (1989) and McMillan (1987) would assist students in recognising when they are being taught critical thinking. This could also help students to articulate more thoroughly what critical thinking is.

Teaching staff commonly acknowledged that students develop their critical thinking in active environments, in accordance with the literature (Biggs, 2012). However, the research projects the respondents commonly described are often elective subjects or offered as vacation internships; places on these are limited and will only become scarcer as student numbers continue to grow. It would be useful to determine whether teaching staff believe project work is an opportunity to measure student critical thinking, or whether it is better measured via other activities (if at all), and to compare this to the literature (Desai et al., 2016).

A recent meta-analysis suggests that a combination of teaching activities affords the greatest effect with respect to the development of critical thinking (Abrami et al., 2015). These teaching activities are described by Abrami and colleagues as ‘authentic instruction’, ‘dialogue’ and ‘mentoring’. These findings are reflective of the present work, where practical inquiry based learning, discussions and research projects were commonly described as opportunities to develop critical thinking. Chemistry educators wishing to develop critical thinking in students are advised to build their practice on the activities described by students and teaching staff within this research, emphasising authentic problem solving and Socratic dialogue (Abrami et al., 2015).

Future work

As described earlier, there are several limitations to this study. To further understand the meaning behind terms such as ‘deep thinking’ or ‘out-of-the-box’, focus group interviews would prove useful. A larger sample size, particularly with respect to third year students within Monash University, teaching staff and employers, could improve the quality of the data. The expression of certain themes may become more or less prominent in a larger sample, which would refine the definition of critical thinking described by employers in particular. Likewise, administering this questionnaire to students in other faculties, or at other institutions in various countries, would add robustness to the findings, as the majority of participants were Australian. Those interested in conducting the questionnaire are encouraged to contact the authors via email.


Conclusions

When looking at the results of this study there are several clear differences between students, teaching staff and employers. These differences may arise from several factors such as education, maturity, experience, environment, or possibly a combination of all of these. This may be a reflection of the constructivist notion that an individual creates meaning of constructs such as critical thinking as a result of, and through, interacting with their environment. This is exemplified by the emphasis on problem solving by students and employers when defining critical thinking, whereas teaching staff more commonly associate critiquing and evaluation with critical thinking. Notably, employers appeared to have a more thorough definition of critical thinking, which may be due to the broader contexts found in the workplace.

When asked to define critical thinking via an open ended questionnaire, students, teaching staff and employers all described the themes of analysis, critique, objectivity and problem solving. Teaching staff and employers commonly expressed themes around evaluation, goal orientation and use of logic. Employers also believed creativity, larger scale contexts, taking a systematic approach and the identification of opportunities and problems are important aspects of critical thinking. This would suggest there is only a limited shared definition of critical thinking between students, teaching staff and employers, one which centres on analysis and problem solving.

In the same open ended questionnaire, students and teaching staff described where they believed student critical thinking was developed. Overwhelmingly, students described practical environments and inquiry based learning activities as developing critical thinking. Teaching staff expressed themes around the application and critiquing of knowledge and, to some extent, practical environments and research projects. Again there appeared to be limited overlap between the perceptions of students and teaching staff, and the need for more immersive student experiences, such as inquiry-based learning and work integrated learning (Edwards et al., 2015), is apparent for the development of transferable skills such as critical thinking.

If the workplace expects tertiary institutes to provide chemistry graduates for the workforce, a shared definition of critical thinking is imperative. However, there appears to be only a limited shared understanding of what critical thinking skills entail. If there are so many facets to critical thinking, how can universities accommodate the development of them all? Initiatives such as work integrated learning (Edwards et al., 2015) aim to give students experience in commercial environments; perhaps in combination with inquiry-based pedagogies, a shared understanding of critical thinking and how to develop it can emerge.


Acknowledgements

The authors would like to acknowledge the undergraduate and teaching staff participants from Monash University, and the academics and employers who took the time to complete the questionnaire online. This research was made possible through Australian Postgraduate Award funding and with the guidance of the Monash University Human Research Ethics Committee.


References

  1. Abrami P. C., Bernard R. M., Borokhovski E., Waddington D. I., Wade C. A. and Persson T., (2015), Strategies for teaching students to think critically: a meta-analysis, Rev. Educ. Res., 85(2), 275–314.
  2. Australian National University, (2015), Chemistry major, retrieved from http://programsandcourses.anu.edu.au/2016/major/CHEM-MAJ.
  3. Bailin S., (2002), Critical thinking and science education, Sci. Educ., 11, 361–375.
  4. Barnett R., (1997), Higher education: a critical business, Buckingham: Open University Press.
  5. Beachboard M. R. and Beachboard J. C., (2010), Critical-thinking pedagogy and student perceptions of university contributions to their academic development, Informing Science: the International Journal of an Emerging Transdiscipline, 13, 53–71.
  6. Biggs J., (2012), What the student does: teaching for enhanced learning, Higher Educ. Res. Dev., 31(1), 39–55.
  7. Bryman A. and Burgess R. G., (1994), Reflections on qualitative data analysis, in Bryman A. and Burgess R. G. (ed.), Analyzing qualitative data, London: Sage.
  8. Cloonan C. A. and Hutchinson J. S., (2011), A chemistry concept reasoning test, Chem. Educ. Res. Pract., 12, 205–209.
  9. Cowden C. D. and Santiago M. F., (2016), Interdisciplinary explorations: promoting critical thinking via problem-based learning in an advanced biochemistry class, J. Chem. Educ., 93, 464–469.
  10. Davies M., (2013), Critical thinking and the disciplines reconsidered, Higher Educ. Res. Dev., 32(4), 529–544.
  11. Desai M. S., Berger B. D. and Higgs R., (2016), Critical thinking skills for business school graduates as demanded by employers: a strategic perspective and recommendations, Academy of Educational Leadership Journal, 20(1), 10–31.
  12. DeWit D. G., (2006), Predicting inorganic reaction products: a critical thinking exercise in general chemistry, J. Chem. Educ., 83, 1625–1628.
  13. Dressel P. L. and Mayhew L. B., (1954), General education: explorations in evaluation, Washington, D.C.: American Council on Education.
  14. Duro E., Elander J., Maratos F. A., Stupple E. J. N. and Aubeeluck A., (2013), In search of critical thinking in psychology: an exploration of student and lecturer understandings in higher education, Psychology Learning & Teaching, 12, 275–281.
  15. Edmunds S. and Brown G., (2013), Section 6: undertaking pedagogic research using qualitative methods, in Groves M. and Overton T. (ed.), Getting started in pedagogic research within the STEM disciplines, Edgbaston, Birmingham, UK: University of Birmingham STEM Education Centre, pp. 21–26.
  16. Edwards D., Perkins K., Pearce J. and Hong J., (2015), Work integrated learning in STEM in Australian universities, Australian Council for Educational Research.
  17. Ennis R. H., (1989), Critical thinking and subject specificity: clarification and needed research, Educ. Res., 18(3), 4–10.
  18. Ennis R. H., (1990), The extent to which critical thinking is subject-specific: further clarification, Educ. Res., 19(4), 13–16.
  19. Facione P. A., (1990), Critical thinking: a statement of expert consensus for purposes of educational assessment and instruction, executive summary "The Delphi Report", Millbrae, CA.
  20. Flynn A. B., (2011), Developing problem-solving skills through retrosynthetic analysis and clickers in organic chemistry, J. Chem. Educ., 88, 1496–1500.
  21. Nabi G. R. and Bagley D., (1999), Graduates' perceptions of transferable personal skills and future career preparation in the UK, Education + Training, 41(4), 184–193.
  22. Glaser R., (1984), Education and thinking: the role of knowledge, Am. Psychol., 39(2), 93–104.
  23. Graduate Careers Australia, (2015), Chemistry – bachelor graduates (all), retrieved from http://www.graduatecareers.com.au/Research/GradJobsDollars/BachelorAll/Chemistry/index.htm.
  24. Gupta T., Burke K. A., Mehta A. and Greenbowe T. J., (2015), Impact of guided-inquiry-based instruction with a writing and reflection emphasis on chemistry students' critical thinking abilities, J. Chem. Educ., 92(1), 32–38.
  25. Halpern D. F., (1993), Assessing the effectiveness of critical thinking instruction, J. Gen. Educ., 50(4), 238–254.
  26. Halpern D. F., (1996), Thought and knowledge: An introduction to critical thinking, Mahwah, N.J.: L. Erlbaum Associates.
  27. Halpern D. F., (1998), Teaching critical thinking for transfer across domains. Dispositions, skills, structure training, and metacognitive monitoring, Am. Psychol., 53, 449–455.
  28. Henderson D. E., (2010), A chemical instrumentation game for teaching critical thinking and information literacy in instrumental analysis courses, J. Chem. Educ., 87, 412–415.
  29. Inhelder B. and Piaget J., (1958), The growth of logical thinking from childhood to adolescence: an essay on the construction of formal operational structures, London: Routledge & Kegan Paul.
  30. Jackson D., (2010a), An international profile of industry-relevant competencies and skill gaps in modern graduates, International Journal of Management Education, 8(3), 29–58.
  31. Jackson N., (2010b), Developing creativity for professional capability through lifewide education, in Jackson N. (ed.), Learning to be Professional through a Higher Education, retrieved from http://learningtobeprofessional.pbworks.com/f/JACKSON+A4.pdf.
  32. Jones S., Yates B. and Kelder J.-A., (2011), Learning and teaching academic standards project, science: learning and teaching academic standards statement September 2011, retrieved from http://www.acds-tlcc.edu.au/wp-content/uploads/sites/14/2015/02/altc_standards_SCIENCE_240811_v3_final.pdf.
  33. Klein G. C. and Carney J. M., (2014), Comprehensive approach to the development of communication and critical thinking: bookend courses for third- and fourth-year chemistry majors, J. Chem. Educ., 91, 1649–1654.
  34. Leggett M., Kinnear A., Boyce M. and Bennett I., (2004), Student and staff perceptions of the importance of generic skills in science, Higher Educ. Res. Dev., 23, 295–312.
  35. Lehman D. R. and Nisbett R. E., (1990), A longitudinal study of the effects of undergraduate training on reasoning, Dev. Psychol., 26, 952–960.
  36. Lehman D. R., Lempert R. O. and Nisbett R. E., (1988), The effects of graduate training on reasoning: formal discipline and thinking about everyday-life events, Am. Psychol., 43, 431–442.
  37. Lemanski T. and Overton T., (2011), An introduction to qualitative research, retrieved from https://hydra.hull.ac.uk/assets/hull:4506/content.
  38. Loes C. N., Salisbury M. H. and Pascarella E. T., (2015), Student perceptions of effective instruction and the development of critical thinking: a replication and extension, Higher Education: The International Journal of Higher Education Research, 69, 823–838.
  39. Lowden K., Hall S., Elliot D. and Lewin J., (2011), Employers' perceptions of the employability skills of new graduates: research commissioned by the edge foundation, retrieved from http://www.educationandemployers.org/wp-content/uploads/2014/06/employability_skills_as_pdf_-_final_online_version.pdf.
  40. Mann L., (1979), On the trail of process: a historical perspective on cognitive processes and their training, New York: Grune & Stratton.
  41. Martineau E. and Boisvert L., (2011), Using Wikipedia to develop students' critical analysis skills in the undergraduate chemistry curriculum, J. Chem. Educ., 88, 769–771.
  42. Matthews M. R., (1993), Constructivism and science education: some epistemological problems, J. Sci. Educ. Technol., 2(1), 359–370.
  43. McMillan J., (1987), Enhancing college students' critical thinking: a review of studies, J. Assoc. Inst. Res., 26(1), 3–29.
  44. McPeck J. E., (1981), Critical thinking and education, Oxford: Martin Robertson.
  45. Monash University, (2015), Undergraduate - area of study. Chemistry, retrieved from http://www.monash.edu.au/pubs/2015handbooks/aos/chemistry/.
  46. Moore T., (2013), Critical thinking: seven definitions in search of a concept, Stud. Higher Educ., 38(4), 506–522.
  47. Moore T. J., (2011), Critical thinking and disciplinary thinking: a continuing debate, Higher Educ. Res. Dev., 30(3), 261–274.
  48. Nisbett R. E., Fong G. T., Lehman D. R. and Cheng P. W., (1987), Teaching reasoning, Science, 238, 625–631.
  49. Oliver-Hoyo M. T., (2003), Designing a written assignment to promote the use of critical thinking skills in an introductory chemistry course, J. Chem. Educ., 80, 899–903.
  50. Pithers R. T. and Soden R., (2000), Critical thinking in education: a review, Educ. Res., 42(3), 237–249.
  51. Prinsley R. and Baranyai K., (2015), STEM skills in the workforce: what do employers want? retrieved from http://www.chiefscientist.gov.au/wp-content/uploads/OPS09_02Mar2015_Web.pdf.
  52. Randles C. A. and Overton T. L., (2015), Expert vs. novice: approaches used by chemists when solving open-ended problems, Chem. Educ. Res. Pract., 16(4), 811–823.
  53. Rayner G. M., Charlton-Robb K.-M., Thompson C. D. and Hughes T., (2013), Interdisciplinary collaboration to integrate inquiry-oriented learning in undergraduate science practicals, Int. J. Innovation Sci. Math. Educ., 21(5), 1–11.
  54. Sarkar M., Overton T., Thompson C. and Rayner G., (2016), Graduate employability: views of recent science graduates and employers, Int. J. Innovation Sci. Math. Educ., 24(3), 31–48.
  55. Stephenson N. S. and Sadler-Mcknight N. P., (2016), Developing critical thinking skills using the science writing heuristic in the chemistry laboratory, Chem. Educ. Res. Pract., 17(1), 72–79.
  56. Tapper J., (2004), Student perceptions of how critical thinking is embedded in a degree program, Higher Educ. Res. Dev., 23(2), 199–222.
  57. The University of Adelaide, (2015), University of Adelaide graduate attributes, retrieved from http://www.adelaide.edu.au/learning/strategy/gradattributes/.
  58. The University of Melbourne, (2015), Handbook – chemistry, retrieved from https://handbook.unimelb.edu.au/view/2015/!R01-AA-MAJ%2B1007.
  59. Thorndike E. L. and Woodworth R. S., (1901a), The influence of improvement in one mental function upon the efficiency of other functions. (I), Psychol. Rev., 8(3), 247–261.
  60. Thorndike E. L. and Woodworth R. S., (1901b), The influence of improvement in one mental function upon the efficiency of other functions. II. The estimation of magnitudes, Psychol. Rev., 8(4), 384–395.
  61. Thorndike E. L. and Woodworth R. S., (1901c), The influence of improvement in one mental function upon the efficiency of other functions: functions involving attention, observation and discrimination, Psychol. Rev., 8(6), 553–564.
  62. Tiruneh D. T., Verburgh A. and Elen J., (2014), Effectiveness of critical thinking instruction in higher education: a systematic review of intervention studies, Higher Educ. Stud., 4(1), 1–17.
  63. Toledo S. and Dubas J. M., (2016), Encouraging higher-order thinking in general chemistry by scaffolding student learning using Marzano's taxonomy, J. Chem. Educ., 93(1), 64–69.
  64. Wason P. C., (1966), New horizons, in Foss B. (ed.), Psychology, Harmondsworth, England: Penguin.
  65. Wilson K. L., Lizzio A. and Ramsden P., (1997), The development, validation and application of the course experience questionnaire, Stud. Higher Educ., 22(1), 33–53.
  66. Zielinski T. J., (2004), Critical thinking in chemistry using symbolic mathematics documents, J. Chem. Educ., 81(10), 1533.

This journal is © The Royal Society of Chemistry 2017