S. M. Danczak*, C. D. Thompson and T. L. Overton
School of Chemistry, Monash University, Victoria 3800, Australia. E-mail: Stephen.danczak@monash.edu
First published on 13th February 2017
Good critical thinking is important to the development of students and is a valued skill in commercial markets and wider society. There has been much discussion regarding the definition of critical thinking and how it is best taught in higher education. This discussion has generally occurred between philosophers, cognitive psychologists and education researchers. This study examined the perceptions of critical thinking held by 470 chemistry students from an Australian university, 106 chemistry teaching staff and 43 employers of chemistry graduates. An open-ended questionnaire was administered to these groups and analysed qualitatively, and the analysis was subsequently quantified. When asked to define critical thinking, respondents identified themes such as ‘analysis’, ‘critique’, ‘objectivity’, ‘problem solving’, ‘evaluate’ and ‘identification of opportunities and problems’. Student respondents described the smallest number of themes whereas employers described the largest. When asked where critical thinking was developed during the study of chemistry, students overwhelmingly described practical environments and themes around inquiry-based learning. When teaching staff were asked this question they commonly identified critiques, research, projects and, to some extent, practical environments. This research highlights that there is only limited shared understanding of the definition of critical thinking and of where it is developed in the study of chemistry. The findings of this article will be of interest to higher education teaching practitioners in science and chemistry, those interested in the development of graduate attributes and higher order cognitive skills (HOCS), and those interested in student and employer perspectives.
As the need for innovation, and for anticipating and leading change, continues to grow, employers recognise the importance of critical thinking and critical reflection (Desai et al., 2016). It has become an expectation that graduates are able to demonstrate a range of transferable skills such as critical thinking (Lowden et al., 2011). In a survey of 400 US employers, 92% of respondents rated critical thinking as ‘important’ or ‘very important’ in an undergraduate degree, and it was the fifth most applied skill in the workplace (Jackson, 2010a).
A recent study commissioned by the Office of the Chief Scientist of Australia surveyed 1065 employers representing a range of industries (Prinsley and Baranyai, 2015). Over 80% of respondents indicated critical thinking as ‘important’ or ‘very important’ as a skill or attribute in the workplace. Critical thinking was considered the second most important skill or attribute, behind active learning. In 2012, Graduate Careers Australia found that of the 45% of chemistry graduates available for full-time or part-time employment, only 66% had obtained employment in a chemistry-related field (Graduate Careers Australia, 2015). These findings suggest that skills which may be transferable to a range of employment settings, such as critical thinking, are worth developing at the tertiary level.
The report concluded that a person who exhibits good critical thinking is in possession of a series of cognitive skills and dispositions. The consensus of the Delphi experts was that a good critical thinker is proficient in the skills of interpretation, analysis, evaluation, inference, explanation and self-regulation (Facione, 1990). Furthermore, the report stated that a good critical thinker demonstrates a series of dispositions which are required for the individual to utilise the aforementioned skills. According to the report, a ‘good critical thinker, is habitually disposed to engage in, and to encourage others to engage in, critical judgement’ (Facione, 1990, p. 12). These dispositions were later categorised into inquisitiveness, open-mindedness, systematicity, analyticity, truth seeking, critical thinking self-confidence and maturity (Facione, 1990).
Cognitive psychology and education research take a more evidence-based approach to defining critical thinking and the skills and dispositions that it encompasses. The term critical thinking itself is often used to describe a set of cognitive skills, strategies or behaviours that increase the likelihood of a desired outcome (Halpern, 1996; Tiruneh et al., 2014). Dressel and Mayhew (1954) suggested it is educationally useful to define critical thinking as the sum of specific behaviours which could be observed in student acts. These critical thinking abilities are identifying central issues, recognising underlying assumptions, evaluating evidence or authority, and drawing warranted conclusions.
Psychologists typically explored and defined critical thinking via a series of reasoning schemas: conditional reasoning, statistical reasoning, methodological reasoning and verbal reasoning (Nisbett et al., 1987; Lehman and Nisbett, 1990). Halpern (1993) refined the cognitive psychologists' definition of critical thinking as the thinking required to solve problems, formulate inferences, calculate likelihoods and make decisions. Halpern listed a series of skills and dispositions required for good critical thought. Those skills are verbal reasoning, argument analysis, thinking as hypothesis testing, understanding and applying likelihood, uncertainty and probability, decision making, and problem solving (Halpern, 1998). The dispositions Halpern described are a willingness to engage and persist with complex tasks, habitually planning and resisting impulsive actions, flexibility or open-mindedness, a willingness to self-correct and abandon non-productive strategies, and an awareness of the social context required for thoughts to become actions (Halpern, 1998). Glaser (1984) further elaborated on the awareness of context to suggest that critical thinking requires proficiency in metacognition.
In science education there is often an emphasis on critical thinking as a skill set (Bailin, 2002). There are concerns that, from a pedagogical perspective, many of the skills or processes commonly ascribed to critical thinking are difficult to observe and therefore difficult to assess. Consequently, Bailin suggests that the concept of critical thinking should explicitly focus on adherence to criteria and standards that reflect ‘good’ critical thinking (Bailin, 2002, p. 368).
Recent literature has lent evidence to the notion that there are several useful and equally valuable definitions of critical thinking (Moore, 2013). The findings of this work identified themes such as ‘critical thinking: as judgement; as scepticism; as originality; as sensitive reading; or as rationality.’ The emphasis placed on these themes depended on the teaching practitioner's context.
In later years cognitive psychology lent evidence to the argument that critical thinking could be developed within a specific discipline and that those reasoning skills were, at least to some degree, transferable to situations encountered in daily life (Lehman et al., 1988; Lehman and Nisbett, 1990). This led to a more pragmatic view that the best critical thinking occurs within one's area of expertise, termed domain specificity (Ennis, 1990), although critical thinking can still be effectively developed with or without content-specific knowledge (McMillan, 1987; Ennis, 1989). Nevertheless, the extent to which the development of critical thinking depends on content-specific knowledge continues to be debated (Moore, 2011; Davies, 2013).
Attempts to teach critical thinking are common in the chemistry education literature. These range from writing exercises (Oliver-Hoyo, 2003; Martineau and Boisvert, 2011; Stephenson and Sadler-Mcknight, 2016), inquiry-based projects (Gupta et al., 2015), flipped lectures (Flynn, 2011) and open-ended practicals (Klein and Carney, 2014) to gamification (Henderson, 2010) and work-integrated learning (WIL) (Edwards et al., 2015). While this literature captures that critical thinking is being developed, it seldom discusses the perceptions of the students.
This study aimed to identify the perceptions of critical thinking of chemistry students, teaching staff and employers. The study investigated how each of these groups define critical thinking and where students and teaching staff believed critical thinking was developed during the study of chemistry.
A similar questionnaire was administered in hard copy to the teaching associates (TAs) and academics within the School of Chemistry at Monash University and via an online format to a different cohort from a range of institutions. The questionnaire consisted of items asking participants to identify teaching activities undertaken within the previous year, and at which year levels they taught these activities. They were asked open-ended questions which aligned with the student questionnaire: ‘What does the term “Critical Thinking” mean to you?’ (Q1) and ‘Can you provide an example of when you have provided students with the opportunity to develop their critical thinking while studying chemistry?’ (Q2b).
Employers were contacted directly via email and provided with a link to an online questionnaire. The questionnaire consisted of four questions: one open-ended question, ‘What does the term “Critical Thinking” mean to you?’ (Q1), and three demographic questions regarding the country in which the participant's organisation was based, the sector of their business and the highest qualification the participant held.
The second year cohort consisted of 359 students from Synthetic Chemistry I, a course focused on organic and inorganic synthetic techniques from practical and theoretical perspectives. This course is a core unit for any student pursuing a chemistry major. Participants were provided with the questionnaire at the end of a practical session during the first two weeks of semester one. The practical activity conducted within this time was known to typically take students only three of the four hours allocated to the practical session. As the activity was a compulsory part of the course, and given it was an essential part of the chemistry major, this cohort could be considered a representative random sample of second year chemistry major students.
Finally, the third year cohort was drawn from 84 students studying Advanced Inorganic Chemistry. This course builds on the theoretical knowledge and practical skills developed in Synthetic Chemistry I, focusing specifically on inorganic chemistry. Typically, students completing a chemistry major undertook this unit, but alternative courses were available. Participants were provided with the questionnaire during practical sessions in the first four weeks of semester and encouraged to complete it during the session. Since the activities in these sessions were very demanding and time was generally scarce for these students, the sampling was regarded as a convenience sample. Furthermore, as not all chemistry majors may have undertaken Advanced Inorganic Chemistry, the data obtained from this cohort may be non-representative.
The senior TAs and academics cohort consisted of academic staff and TAs with several years of teaching experience. These academics and senior TAs typically taught chemistry courses other than Chemistry I or Advanced Chemistry I. Twelve individuals were approached during semester one of 2015 and were advised to return the questionnaire via unlabelled internal mail.
Finally, an online academic cohort consisted of around 300 members of a chemistry education email discussion group, predominantly from the UK and Europe. These participants received a link to an online version of the questionnaire sent via a third party.
All TAs and academic staff were advised that their participation was voluntary and that they could opt out by not completing the questionnaire, in accordance with MUHREC regulations. All senior TAs, academics and online academics were previously known to highly value the scholarship of teaching, thus increasing the likelihood of their participation. Consequently, this would be considered a non-representative convenience sample of experienced teaching staff.
The data was first analysed qualitatively, and that analysis was then quantified. The qualitative analysis was conducted with no prior assumptions regarding the number of ways in which individuals may think about critical thinking; quantification then identified whether there were any common ways in which individuals experienced critical thinking. The nature of these commonalities was not assumed; however, a retrospective comparison with the literature informed the inferences drawn from the data.
The questionnaire data for each cohort was imported into Nvivo as seven separate ‘sources’: first year students (A), second year students (B), third year students (C), TAs (D), senior TAs and academics (E), online academics (F) and employers (G). These cohorts were then merged into three major groups. Students, consisting of A, B and C, teaching staff consisting of D, E and F and employers (G).
Six chemistry education researchers working within the Chemistry Education Research Group (CERG) at Monash University were provided with a random selection of 10% of all responses to Q1 and Q2a/Q2b. They were asked to identify key words suggesting emergent themes in each question, and from these emergent themes ‘codes’ were generated by the primary researcher for participants' responses (Bryman and Burgess, 1994). After this first review, the responses were studied in greater detail to determine whether there were any hidden themes which the initial analysis had failed to identify. A third review of the emergent themes within each question was conducted and, using a redundancy approach, similar themes were combined. This resulted in 21 unique themes for Q1 and 19 unique themes for Q2a/Q2b to be used in coding all responses.
The data from the emergent themes of each question was then analysed quantitatively. To determine the number of participants within each group describing a specific theme, the total number of responses within each theme per group was determined using Nvivo's ‘Matrix Coding’ function. This data was exported to Microsoft Excel and the number of participants describing a specific theme within each group was then expressed as a percentage. This percentage was determined using the number of responses for a theme within a group divided by the total number of participants who answered a given question from that group. These percentages were then presented graphically.
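The percentage calculation described above can be sketched in a few lines. This is a minimal illustration only: the theme counts and denominator below are hypothetical placeholders, whereas in the study the counts came from Nvivo's ‘Matrix Coding’ export for each theme and group.

```python
# Hypothetical theme counts for one group (real counts came from
# Nvivo's 'Matrix Coding' export).
theme_counts = {"analysis": 118, "problem solving": 109}

# Hypothetical number of participants in the group who answered the question.
respondents = 470

# Percentage of participants in the group describing each theme:
# (responses coded to the theme) / (participants who answered) * 100.
percentages = {
    theme: 100 * count / respondents
    for theme, count in theme_counts.items()
}
```

These percentages are what were subsequently presented graphically for each group.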
Table 1 shows the gender distribution and median age of students who chose to provide this data. As can be seen, there is a slightly larger population of male students, by 12 percentage points. The median age of students is 19 years, which is typical of first and second year Australian undergraduate university students.
| Student cohort | Gender: male | Gender: female | Median student age |
|---|---|---|---|
| 1st year students | 53% (n = 151) | 47% (n = 132) | 18 (n = 216) |
| 2nd year students | 59% (n = 107) | 41% (n = 73) | 19 (n = 129) |
| 3rd year students | 57% (n = 13) | 43% (n = 10) | 20 (n = 18) |
| All undergraduates | 56% (n = 271) | 44% (n = 215) | 19 (n = 363) |
Table 2 shows the teaching activities and year levels taught by the various cohorts within the teaching staff group. Respondents were able to select multiple teaching activities and year levels taught. The TA cohort typically taught first year laboratory sessions whereas senior TAs and academics all taught at various year levels via laboratory, tutorial and lecture activities.
Total participants per cohort: TAs (n = 40), senior TAs/academics (n = 12), online academics (n = 54).

| Teaching staff cohort | TAs | Senior TAs/academics | Online academics |
|---|---|---|---|
| Teaching activity: laboratory | n = 30 | n = 9 | n = 46 |
| Teaching activity: tutorial | n = 2 | n = 8 | n = 44 |
| Teaching activity: lectures | n = 0 | n = 11 | n = 50 |
| Year levels taught: no experience | n = 12 | n = 0 | n = 0 |
| Year levels taught: 1st year | n = 30 | n = 6 | n = 48 |
| Year levels taught: 2nd year | n = 8 | n = 10 | n = 44 |
| Year levels taught: 3rd year | n = 1 | n = 10 | n = 44 |
| Year levels taught: Hons/M/PhD | n = 0 | n = 3 | n = 48 |
Table 3 provides the demographic data for employers. The respondents' main offices were predominantly found in Australia and the respondents themselves generally held a tertiary level qualification, with 40% of respondents holding a PhD. The most common sector in which respondents worked was chemical, pharmaceutical or petrochemicals (16%). There was also a reasonable representation of respondents from development, innovation or manufacturing (12%), life sciences (14%) and government (12%).
| Country | % (n) | Sector | % (n) | Qualification | % (n) |
|---|---|---|---|---|---|
| Australia | 72% (n = 31) | Chemical^b | 16% (n = 7) | PhD | 40% (n = 17) |
| UK | 26% (n = 11) | Development^c | 12% (n = 5) | Masters | 21% (n = 9) |
| Belgium | 2% (n = 1) | Science^d | 14% (n = 6) | Grad. dip. | 5% (n = 2) |
| | | Government | 12% (n = 5) | Post-grad cert. | 2% (n = 1) |
| | | Health^e | 9% (n = 4) | Bachelors | 30% (n = 13) |
| | | Environment^f | 7% (n = 2) | High school | 2% (n = 1) |
| | | FMCG^g | 7% (n = 2) | | |
| | | Mining | 5% (n = 2) | | |
| | | Consulting | 5% (n = 2) | | |
| | | Education | 5% (n = 2) | | |
| | | Chemical and Development^a | 5% (n = 2) | | |
| | | Chemical and FMCG^a | 2% (n = 1) | | |
| | | Government and Environment^a | 2% (n = 1) | | |
| | | Other | 5% (n = 2) | | |

^a These employers identified multiple sectors and were thus coded according to both themes. ^b Chemical, pharmaceuticals or petrochemicals. ^c Development, innovation or manufacturing. ^d Science or life-science. ^e Health, medical or pathology. ^f Environment or conservation. ^g Fast-moving consumer goods.
The 21 themes generated in response to the question: ‘What does the term “Critical Thinking” mean to you?’ (Q1) can be found in Table 4 along with a definition and brief quote to illustrate the meaning attributed to these themes. The quantitative analysis found in Fig. 1 describes the frequency with which each of these themes was expressed by students, teaching staff and employers.
| Theme | Definition | Example |
|---|---|---|
| Analysis | Information, data or evidence analysed or broken down. | “Ability to unpack complex situations…” |
| Application of knowledge | What is known or learnt is applied in some way. | “Evaluate…from first principles and personal knowledge…” |
| Arriving at an outcome | The end product of critical thinking, e.g. a conclusion, argument or course of action. | “…form a valid, informed opinion.”; “…an appropriate solution.” |
| Context (macro) | Implication of an outcome with much greater boundaries, at an organisational or societal level. | “Ethical and economical solution.”; “Outside aspects and factors…” |
| Creative | ‘Creative thinking’ or discussed innovation. | “…imaginative generation of ideas.” |
| Critique | Identify assumptions, reasoning, arguments or presumed facts and determine credibility, validity and reliability. | “…question the concepts…”; “…challenging the evidence…” |
| Decision making | Used in ‘making a decision’ or, for example, ‘arriving at a decision’. | “…make an informed decision…” |
| Evaluate | Attributing value to a stimulus. Appraising, determining value or identifying meaning. | “Reflecting on the meaning…”; “…filtering of that info…” |
| Identification of opportunities and problems | Appropriate questioning to understand a problem. Identification of potential issues or opportunities. | “identify where intervention will have the most impact” |
| Interpretation of information | Engaging with a stimulus and understanding that information. | “…interpreting the data…”; “…understand concepts…” |
| Lateral thinking | Use of the terms ‘lateral thought’ and ‘out of the box’. | “…(Thinking) in an abstract manner.” |
| Logical approach | Application of a logical, reasonable or rational thought process. | “…reasoned judgements…”; “…finding a rational truth…” |
| Objectivity | Taking an unbiased approach. Sceptical or open-minded. | “…consider various points of view…” |
| Problem solving | A problem and/or something that needs to be resolved. | “…work through a problem…” |
| Productivity | Thinking which in some way has a constructive use, e.g. is efficient. | “…where intervention will have the most impact…” |
| Reflection | Metacognitive processes of ‘why am I thinking what I'm thinking?’ | “Thinking about your thinking” |
| Research | Collection of (experimental) data, evidence or information. | “…gathering information or data…” |
| Systematic approach | How thoughts are organised. Order of operations. | “…arrange it (information) in a way that it informs outcomes.” |
| Testing | Exploring and testing knowledge, evidence, claims or arguments. | “…draw conclusions based on hypothesis testing.” |
| Under pressure | Time constraint or when stakes are high. | “…under pressure situations…” |
| Understanding the local context | An action or opinion is required and has some sort of impact. | “…what must be done in a situation…” |
It is important to note that a single response may be coded to multiple themes or, in some instances, none at all. Table 5 provides a breakdown of how many responses contain a given number of themes. For example, 87 responses from the first year cohort contain only a single theme, whereas 11 responses from employers contain three themes. The mean number of themes per response, or coding density, was determined for each cohort and each group. Students described a mean of 1.73 themes per response, teaching staff a mean of 2.75 themes per response and employers a mean of 3.98 themes per response.
Number of responses which described the given number of themes:

| Cohort/group | 0^a | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | Mean |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1st year | 30 | 87 | 90 | 37 | 19 | 2 | 1 | 1 | | | 1.79 |
| 2nd year | 32 | 55 | 51 | 28 | 12 | 1 | | | | | 1.64 |
| 3rd year | 5 | 6 | 7 | 3 | 3 | | | | | | 1.71 |
| Students | 67 | 148 | 148 | 68 | 34 | 3 | 1 | 1 | | | 1.73 |
| TAs | 5 | 10 | 15 | 7 | 3 | | | | | | 1.83 |
| Sen TAs/academics | 1 | | 2 | 4 | 5 | | | | | | 3.00 |
| Online academics | | 4 | 12 | 13 | 17 | 6 | 2 | | | | 3.28 |
| Teaching staff | 6 | 11 | 26 | 21 | 24 | 6 | 2 | | | | 2.75 |
| Employers | | 1 | 8 | 11 | 8 | 7 | 4 | 3 | | 1 | 3.98 |

^a Responses coded to zero themes either were considered not to make sense or could not be given meaning without further investigation (see Discussion).
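The coding density reported above can be recomputed directly from a cohort's distribution of themes per response; zero-coded responses are included in the denominator. A short sketch for the first year cohort, using its counts from Table 5:

```python
# First year distribution from Table 5:
# key = number of themes coded to a response,
# value = number of responses with that many themes.
first_year = {0: 30, 1: 87, 2: 90, 3: 37, 4: 19, 5: 2, 6: 1, 7: 1}

# Total themes coded across all responses, and total responses
# (including those coded to zero themes).
total_themes = sum(k * v for k, v in first_year.items())
total_responses = sum(first_year.values())

# Coding density: mean number of themes per response.
mean_density = total_themes / total_responses
```

The same calculation applied to each cohort reproduces the means in the final column of Table 5.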
In response to the questions ‘Can you provide an example of when you have had the opportunity to develop your critical thinking while studying chemistry?’ (Q2a) and ‘Can you provide an example of when you have provided students with the opportunity to develop their critical thinking while studying chemistry?’ (Q2b), 19 themes were generated. Table 6 contains these themes, their definitions and brief excerpts to convey the meaning attributed to these themes. The quantitative analysis found in Fig. 2 describes the frequency with which each of these themes was expressed in student and teaching staff responses.
Theme | Definition | Example |
---|---|---|
Algorithmic problem solving | All the data is provided and the solutions are known. | “Solving chemical equations.” |
Application of knowledge | Use of knowledge, usually developed within a course. | “…fundamental knowledge…” |
Assessing knowledge | Formative or summative feedback assessments. | “Weekly tests & semester exams.” |
Creating an argument | Generate hypothesis, opinion, argument or conclusion. | “…justify their (students) choices…” |
Critiquing | Decide quality of experimental data, method or argument. | “…assess aspects of experimental design.” |
Developing knowledge | Developing specific content knowledge. | “…perform lab task & understand the theory…” |
Discussion method | Engaged in dialogue with students, TAs or academics. | “…what would happen if…?” |
Engaging with experimental data | Engaging with data generated in the lab or from research. | “…did not achieve expected results…” |
Experimental design | Developing an experimental method in a laboratory setting. | “…create their (students) own experiments.” |
Inquiry based learning | Often described as an ‘IDEA prac’. | “…we had to come up with a method for a prac” |
Leading questions | Thought processes guided by open-ended questions. | “…prompt them (students) with questions …” |
Lecture environment | Activity taking place in a lecture or as part of a lecture. | “Reading materials before and after lectures…” |
Open-ended problem solving | Not all the data is provided; ill-defined or unknown solution. | “…direct answers may not be able to be found…” |
Practical environment | Activities taking place in/or as result of a laboratory. | “…making paracetamol…” |
Project work | Project which occurred over an extended time period. | “…problem to solve for the semester…” |
Research | Use of the term ‘research’ or an indication of research. | “…research projects…” |
Safety | Safe procedures in a laboratory environment. | “…handling dangerous chemicals.” |
Testing | Experimentation or test of a hypothesis. | “Not just assuming our hypothesis is right.” |
Tutorial environment | ‘Tutorials’, ‘tutes’ or ‘tutorial questions’. | “…through discussions in tutorial.” |
Writing | Lab reports, essays or literature reviews. | “Writing lab reports with discussion (section).” |
Once again, a single response could be coded to multiple themes or none at all. Table 7 shows how many responses contained a given number of themes. For example, 108 first year responses were coded to a single theme compared to only two senior TA/academic responses. Students described an average of 1.32 themes per response and teaching staff described an average of 2.25 themes per response.
Number of responses which described the given number of themes:

| Cohort/group | 0^a | 1 | 2 | 3 | 4 | 5 | 6 | Mean |
|---|---|---|---|---|---|---|---|---|
| 1st year | 53 | 108 | 42 | 10 | 1 | | | 1.06 |
| 2nd year | 7 | 70 | 89 | 7 | 1 | | | 1.57 |
| 3rd year | 0 | 6 | 4 | 2 | 3 | | | 2.13 |
| Students | 60 | 184 | 135 | 19 | 5 | | | 1.32 |
| TAs | 3 | 9 | 11 | 3 | | | | 1.54 |
| Sen TAs/acad. | 0 | 2 | 5 | 2 | 2 | | | 2.36 |
| Online acad. | 0 | 14 | 11 | 16 | 6 | 4 | 1 | 2.58 |
| Teaching staff | 3 | 25 | 27 | 21 | 8 | 4 | 1 | 2.25 |

^a Responses were coded to zero themes when they either did not make sense, could not be given meaning without further investigation, or stated that the respondent had not studied or taught chemistry before (see Discussion).
A similar pattern of coding density can be observed when comparing TAs (D) with senior TAs and academics (E), online academics (F) and employers (G). It would appear that those participants who were approached directly or online made a concerted effort to respond to the questions, as can be seen in Tables 5 and 7, where at least three themes were typically described by cohorts E, F and G. Again, it is worth considering the experience that cohort D had with critical thinking. The majority of this cohort were on semester-long contracts and only had teaching experience in a first year laboratory environment (Table 2). It is possible these participants did not exercise their critical thinking skills as frequently as academics, who routinely engage in activities, such as peer reviewing journal submissions, that exercise these skills. This aligns with the constructivist notion that an individual creates their meaning of a given construct from their environment (Lemanski and Overton, 2011) and, in this research, from how the participants believe that construct is applied in their daily lives.
With respect to demographic data, there was a slightly larger representation of students identifying as male compared to female. This was observed in all student cohorts; however, it is important to note that there was a slightly larger number of female students enrolled in chemistry at Monash University compared to male students. As can be seen from Table 1, the median age for students was nineteen years. This value was skewed slightly by the large number of respondents from the first and second year cohorts.
Larger samples of first and second year students and first year TAs were obtained due to the environments in which the questionnaire was conducted (namely compulsory laboratory sessions). Aside from the slightly larger number of male student respondents, there can be some confidence that the data obtained represent a random sample of the respective cohorts and that the findings may be generalisable.
Obtaining data from senior TAs, academics and employers was far more difficult and consequently the data collected was more reflective of non-representative convenience sampling. Therefore, the findings herein may have limited generalisability with respect to senior TAs, academics and employers.
The theme ‘analysis’ was frequently expressed by all groups (students, teaching staff and employers). At least 20% of all responses identified analysis as part of the meaning of critical thinking. In the case of the student group it was, in fact, the most common theme, with just over 25% of respondents using it to define critical thinking. The term analysis or analysing was commonly used to describe interaction with some sort of intellectual stimulus, whether an idea, data or a problem. Many responses referred to ‘analysing something’, suggesting a breadth of critical thinking.
Students strongly identified with three other themes: ‘critique’, ‘objectivity’ and ‘problem solving’. Problem solving was the second most commonly expressed theme by student respondents, with just over 23% of responses describing it. The link between critical thinking and problem solving appears to be a common association made by students (Tapper, 2004). Critique and objectivity were each identified in approximately 17% of responses. The relatively smaller number of themes described by students is not altogether surprising, as other qualitative studies have shown students often have difficulty conceptualising critical thinking (Duro et al., 2013).
Teaching staff most commonly described the themes ‘critique’ (40%) and ‘evaluate’ (42%) when defining critical thinking. In other recent studies a similar emphasis on interpreting information via analysis and evaluation was also observed (Duro et al., 2013; Desai et al., 2016). Teaching staff were much more goal orientated than students, with 28% of responses describing ‘arriving at an outcome’. Outcomes were very task orientated, akin to Barnett's (1997) ‘critical being’: either developing a plan relating to experimental design or arriving at a conclusion as a result of experimental data. For example:
“The ability to examine evidence, come to a conclusion based on that evidence…”
Teaching staff also commonly described the themes ‘application of knowledge’, ‘logical approach’, ‘objectivity’ and ‘problem solving’, each in approximately 20% of responses. It is worth noting that students and teaching staff expressed the theme of ‘objectivity’ with similar frequencies (18% and 19%, respectively). Of all three groups, teaching staff used the theme of problem solving the least when defining critical thinking (18%). While only 14% of teaching staff respondents described the theme of ‘interpreting information’, this proportion was higher than in the student (11%) and employer (9%) groups.
As can be seen from Table 5, employers typically described the largest number of themes in their responses. ‘Problem solving’ was the most common theme, expressed by over 44% of employers. Employers were goal orientated, much like teaching staff, commonly describing the themes of ‘application of knowledge’ (19%), ‘objectivity’ (30%), ‘logical approach’ (21%), ‘evaluate’ (30%) and ‘arriving at an outcome’ (33%). Arriving at an outcome contained a wide breadth of examples in employer responses. However, there was some focus on using evidence to inform a conclusion which would lead to a course of action for the organisation to take:
“…a necessary approach to solving or answering problems, developing a product or process.”
Employers expressed four themes unique to their group: ‘context (macro)’ (12%), ‘creative’ (19%), ‘systematic approach’ (21%) and ‘identification of opportunities and problems’ (35%). The latter focused on the use of critical thinking as a method of uncovering what is not immediately apparent:
“To consider the problem to expose route cause(s) in a rationale and logical manner and apply lateral thinking to seek solutions to the problem.”
The above response also includes the following in its definition of critical thinking:
“The ability of a person to identify a problem that does not have a readily available or off the shelf solution.”
This is an excellent example of responses identifying creativity in conjunction with the theme of problem identification. The general sentiment of employers was that critical thinking is important to innovation within the organisation, suggestive of what Jackson (2010b) refers to as ‘Pro-c creativity’, the creativity associated with a professional environment.
Furthermore, employers were unique in describing critical thinking with the theme of ‘context (macro)’. This theme reflects employers' recognition that critical thinking is applied on a much broader social scale. For example:
“…understand the implications from an organisational perspective.”
“…collaborating the thoughts and views of others to gain a clearer insight of the real challenge.”
Employers acknowledged that the results of critical thinking can have an impact in commercial and societal contexts. While students and teaching staff have a somewhat more internalised definition of critical thinking, employers appear to have a more social application of critical thinking, as seen in some of the literature (Desai et al., 2016).
One of the most interesting features of this data was that the terms ‘judgement’ and ‘inference’, found in the Delphi definition of critical thinking (Facione, 1990), were seldom used by respondents. In fact, the following were the only two student responses to use the term ‘judgement’:
“Not taking things at face value and giving topics considerable thought and analysis before coming to a conclusion/judgement on it.” – First year respondent
“Analysis of a problem to make a judgement.” – Second year respondent
It is worth noting that a similarly small minority of respondents used the term ‘opinion’ in their definition of critical thinking:
“Ability to objectively analyse, process and form an opinion of a particular subject.”
And a slightly larger number of respondents used the term ‘conclusion’:
“A skill to understand a thing more clearly and make conclusion.”
When the Delphi report describes core critical thinking skills, the terms ‘judgement’ and ‘opinion’ are used somewhat synonymously. Similarly, ‘drawing conclusions’ is explicitly stated as a sub-skill of ‘inference’ (Facione, 1990, p. 10). This suggests that a larger number of respondents using ‘opinion’ or ‘conclusion’ may in fact be referencing the terms ‘judgement’ or ‘inference’. However, without further probing of what respondents mean by ‘conclusion’ or ‘opinion’, this is not a certainty.
There is also very little emphasis on self-regulation or the metacognitive processes typically associated with ‘good’ critical thinking (Glaser, 1984; Bailin, 2002). Perhaps this is implied when respondents described the theme of ‘objectivity’:
“Thinking about situations with an open view point and analysing what you're doing.”
What is very clear from this data is the emphasis on problem solving in the definition of critical thinking. This was a very prominent feature of the data from students and employers, and slightly less common among teaching staff. With respect to the students, this may be due to the perception that scientific facts are unquestionable and to the algorithmic problem solving pedagogies commonly employed in science education (Zielinski, 2004; DeWit, 2006; Cloonan and Hutchinson, 2011). With respect to employers, it might be due to the fact that employers are typically adept at reflecting on open-ended problems and identifying any parameters or approximations required (Randles and Overton, 2015). This experience with open-ended problems may also explain the description of the theme of ‘identification of problems and opportunities’, which was unique to employers.
Interestingly, the Delphi report does not consider problem solving an element of critical thinking. Instead it proposes that problem solving and critical thinking are ‘closely related forms of higher-order thinking’ (Facione, 1990, p. 5). Similarly, Halpern suggests that certain behaviours are associated with critical thinking or problem solving but that these higher order cognitive skills are not mutually exclusive (Halpern, 1996, pp. 317–363). This cognitive psychology view better reflects the data that emerged from respondents in this study, responses which might otherwise be considered misconceptions with respect to critical thinking.
Regardless of this interpretation, it would be interesting to ask students, teachers and professionals from other disciplines to define critical thinking. It is quite possible that an emphasis on judgement would occur in the humanities, commerce or the arts, with perhaps less use of the theme of problem solving. For example, when a group of business academics were asked to describe which critical thinking skills were important to graduates entering the workforce within their discipline, 47% of responses described problem solving and 34% described analysis (Desai et al., 2016).
The other interesting feature of this data is the set of differences between groups and what these may be attributed to. For example, teaching staff emphasised the themes of ‘critique’ and ‘evaluate’. A common aspect of an academic's role is involvement in peer review and academic writing, so it is not surprising that these themes arose so frequently. Likewise, the frequency with which employers described themes around identification, innovation and context reflects a competitive commercial environment. Given the respondents' association between critical thinking and problem solving, these perceptions around evaluation and identifying problems could also reflect behaviours typical of expert open-ended problem solvers (Randles and Overton, 2015). Both employers and teaching staff have a goal-orientated definition of critical thinking, which may be a product of maturity and/or their exposure to professional environments. Again this may be an example of constructivism (Lemanski and Overton, 2011).
As can be seen in Table 8, all groups used themes around analysis, critiquing, objectivity and problem solving to define critical thinking. In addition teaching staff and employers use themes relating to the application of knowledge, arriving at an outcome, evaluation and using a logical approach. Employers further expand on their definition to include themes regarding creativity, considering the broader context, taking a systematic approach and identifying opportunities and problems. These themes regarding the definition of critical thinking can be synthesised thus:
Theme | Students (%) | Teaching staff (%) | Employers (%)
---|---|---|---
Analysis | >20 | >20 | >30
Application of knowledge | | >20 | >10
Arriving at an outcome | | >20 | >30
Context (macro) | | | >10
Creative | | | >10
Critique | >10 | >40 | >20
Evaluation | | >40 | >20
Identification of opportunities and problems | | | >30
Logical approach | | >10 | >20
Objectivity | >10 | >10 | >20
Problem solving | >20 | >10 | >40
Systematic approach | | | >20

Values are expressed as increments of >10% for ease of readability and to highlight similarities and differences between groups.
To analyse and critique objectively when solving a problem. – Students
To analyse, critique and evaluate through the logical and objective application of knowledge to arrive at an outcome when solving a problem. – Teaching staff
To analyse, critique and evaluate problems and opportunities through the logical, systematic, objective and creative application of knowledge so as to arrive at an outcome and recognise the large scale context in which these problems and opportunities occur. – Employers
While there are some similarities between the definitions of critical thinking, it would be inaccurate to suggest that there is a shared definition. Furthermore, the depth to which critical thinking was defined appears to reflect a constructivist phenomenon. Employers most closely reflect definitions found in the literature (Facione, 1990; Halpern, 1996; Tiruneh et al., 2014). Employers appear to have a broader definition of critical thinking, and this may be related to the fact that employers work in very broad contexts and draw on a range of experiences, going beyond chemistry to deal with issues such as budgets, policies and human resources.
With respect to the teaching staff, the wording of the question they received must be considered to put the responses in context: ‘Can you provide an example of when you have provided students with the opportunity to develop their critical thinking while studying chemistry?’ (Q2b) This wording elicited responses drawn from the respondents' recent teaching activities, which may differ from where the respondent believes students develop their critical thinking most. For example, many TAs from cohort A only have practical experience to draw on, whereas cohorts B and C also have lecture and/or tutorial activities to base their response on (Table 2). Conversely, some respondents from cohorts B and C only had lecture or tutorial experience to draw on.
When asked to provide an example of where they believed they developed their critical thinking while studying chemistry, 45% of students identified an activity relating to a practical environment. The second most common theme was ‘inquiry-based learning’ (17%). Most interestingly, 36% of second year students and 14% of third year students specifically mentioned ‘IDEA pracs’. These practicals were guided inquiry activities the students performed as part of their first year laboratory program (Rayner et al., 2013). That students identified these activities, in some cases two years later, demonstrates the effectiveness of inquiry-based learning in developing transferable skills such as critical thinking.
It is important to recognise that students did not identify activities that make the teaching of critical thinking explicit. Students in other studies identified courses around scientific communication as opportunities where critical thinking was explicitly taught (Tapper, 2004). Beyond these courses, much like for the students in the current study, the development of critical thinking became more implicit and students became dependent on feedback from writing activities (Tapper, 2004; Duro et al., 2013). It is clear from the literature that, without a deliberate effort to make critical thinking goals explicit in discipline-specific courses, students find critical thinking difficult to conceptualise and perceive it as an intuitive skill that develops over time (Tapper, 2004; Beachboard and Beachboard, 2010; Duro et al., 2013; Loes et al., 2015).
Teaching staff also identified practical environments (26%) as where they developed students' critical thinking. However, four additional themes were also prominent in their responses: ‘application of knowledge’ (21%), ‘critique’ (33%), ‘project work’ (21%) and ‘research’ (19%). These themes reflect activities described in recent literature designed to elicit higher order cognitive skills (Cowden and Santiago, 2016; Stephenson and Sadler-McKnight, 2016; Toledo and Dubas, 2016). Critique activities ranged from critiquing experimental design to writing literature reviews:
“I may provide students with some experimental evidence and they need to evaluate whether these are consistent with specific mechanisms.”
“Choosing and researching a topic to conduct a literature review on. Writing a review to include critical appraisal of the information covered.”
“Research paper-based assessments in which students are asked to locate and extract information, analyse data and critically assess aspects of experimental design.”
“…paper analysis which requires use of many variables in understanding change factors and outcomes in reaction.”
The theme of ‘application of knowledge’ most often described activities taking place predominantly in a lecture environment and, in some instances, in a practical environment. The themes of ‘project work’ and ‘research’ often described activities in practical environments, with many of these responses focusing on final year research projects:
“Mainly this comes from the crucial role of the research project, generally in the final year of study when the student has had the opportunity to build up their knowledge base across a broad range of chemistry.”
The above statement suggests that critical thinking can only be achieved with a solid foundation of discipline-specific knowledge. While it holds true that an individual is a better critical thinker within their own discipline (McPeck, 1981; Moore, 2011), it is not true that a large body of knowledge is a necessary prerequisite to develop critical thinking (Ennis, 1989; Davies, 2013).
According to this data, students and teaching staff have some limited agreement that critical thinking is developed in a practical environment. However, that is where the similarities end. Although teaching staff believe that they develop critical thinking through the application of knowledge, this is not apparent to the students.
Teaching staff commonly acknowledge that students develop their critical thinking in active environments, in accordance with the literature (Biggs, 2012). However, the research projects the respondents commonly describe are often elective subjects or offered as vacation internships, places which are limited in number and will only become scarcer as student numbers continue to grow. It would be useful to determine whether teaching staff believe project work is an opportunity to measure student critical thinking or whether it is better measured via other activities (if at all), and to compare this to the literature (Desai et al., 2016).
A recent meta-analysis suggests that a combination of teaching activities affords the greatest effect with respect to the development of critical thinking (Abrami et al., 2015). Abrami and colleagues describe these teaching activities as ‘authentic instruction’, ‘dialogue’ and ‘mentoring’. These findings reflect the present work, in which practical inquiry-based learning, discussions and research projects were commonly described as opportunities to develop critical thinking. Chemistry educators wishing to develop critical thinking in students would be well advised to build the activities described by students and teaching staff within this research into their practice, emphasising authentic problem solving and Socratic dialogue (Abrami et al., 2015).
When asked to define critical thinking via an open-ended questionnaire, students, teaching staff and employers all described the themes of analysis, critique, objectivity and problem solving. Teaching staff and employers commonly expressed themes around evaluation, goal orientation and the use of logic. Employers also believed creativity, larger scale contexts, taking a systematic approach and the identification of opportunities and problems to be important aspects of critical thinking. This suggests there is only a limited shared definition of critical thinking between students, teaching staff and employers, centred on analysis and problem solving.
In the same open-ended questionnaire, students and teaching staff described where they believed student critical thinking was developed. Students overwhelmingly described practical environments and inquiry-based learning activities as developing critical thinking. Teaching staff expressed themes around the application and critiquing of knowledge and, to some extent, practical environments and research projects. Again there appeared to be limited overlap between the perceptions of students and teaching staff, and the need for more immersive student experiences, such as inquiry-based learning and work-integrated learning (Edwards et al., 2015), is apparent in the development of transferable skills such as critical thinking.
If the workplace expects tertiary institutes to provide chemistry graduates for the workforce, a shared definition of critical thinking is imperative. However, there appears to be only a limited shared understanding as to what critical thinking skills entail. If there are so many facets to critical thinking, how can universities accommodate the development of them all? Initiatives such as work-integrated learning (Edwards et al., 2015) aim to give students experience in commercial environments, and perhaps in combination with inquiry-based pedagogies a shared understanding of critical thinking, and of how to develop it, can emerge.
This journal is © The Royal Society of Chemistry 2017