Researching moving targets: studying learning progressions and teaching sequences

Keith S. Taber
Faculty of Education, University of Cambridge, UK. E-mail: kst24@cam.ac.uk

Received 15th March 2017, Accepted 15th March 2017
The 2018 theme issue of Chemistry Education Research and Practice will focus on ‘Learning progressions and teaching sequences in chemistry education’. The call for papers may be found at http://www.rsc.org/globalassets/05-journals-books-databases/our-journals/chemistry-education-research-and-practice/cerp-call-for-papers-2018.pdf. This is a fascinating and challenging area for researchers, as well as one of central importance to education. Two of the particular challenges of research in education relate to the nature and complexity of its core foci: learning and teaching. These factors loom large in research aimed at exploring learning over time, and in developing teaching schemes to support desired learning progression. Learning and teaching are both processes, so what is being investigated in such research is shifting even as we undertake our investigations.

Learning is not observable – so how can we tell when it has occurred?

Learning is the core intended outcome of chemistry education. Yet learning and related concepts such as knowledge, understanding, and thinking, are often (ontologically) vague, and (epistemologically) elusive (Taber, 2013). These concepts are vague in the sense that we all know what we mean by thinking or learning in everyday and professional life, but it is actually quite difficult to pin down a satisfactory operational definition to support research into them. Even if we tend to think of learning in terms of cognitive processes and conceptual structures, in practice our research necessarily uses behaviourist methodology. So although I would claim that my own research has revealed aspects of student thinking and underlying conceptual structures, I cannot show anyone the thinking or conceptual structures I write about, as the research evidence is necessarily indirect. This is something recognised by the behaviourists (Watson, 1967), who felt research should focus on what could be seen and measured, rather than theoretical entities (like alternative conceptions and conceptual frameworks). Chemists are perhaps in a good position to reject such a constraint, as chemistry itself usually involves explaining observable phenomena in terms of a theoretical system comprised of non-observable entities. As chemistry education researchers looking at learning, our data are based on the observation of behaviour (what students say, draw, gesticulate, etc.), from which we infer and model aspects of what we imagine is going on ‘in their minds’ (Taber, 2013).

Arguably, learning can be described as providing potential for new behaviour: “learning is considered to be a process through which a change in the potential for behaviour is brought about” (Taber, 2009, p. 10). If a learner has at time A no potential for behaviour X (which might be explaining the reactivity of double bonds, for example), but then later at time B this potential has been acquired, then some learning has taken place. This does not mean, however, that not observing this behaviour at one time, and then observing it later, is automatically strong evidence of learning. The learner has to have the opportunity and motivation to demonstrate the behaviour, and these are necessary rather than sufficient conditions. Clearly most of the time, our behaviour only reflects a tiny fraction of what we have learnt over the years. Moreover, students often have potential for offering multiple behaviours in particular contexts – such that our observations sample a limited number of the potential behaviours. A question such as ‘what is this?’ might legitimately invite different responses – say, ethene, an alkene, an unsaturated compound, a fuel, a hydrocarbon, a double bond, sp2 hybridisation, a molecule, a symmetric arrangement, a planar structure, a formula, a symbolic representation, a diagram, a focus for a research interview…

To give an example from my own research, one student I worked with (‘Tajinder’) offered three quite distinct ways of understanding the nature of chemical bonding. This behaviour was interpreted as representing manifold conceptions of bonding (Taber, 2000). That is, what was observed (speech acts that were classified as evidence of the use of alternative forms of explanation) was used to draw inferences about mental states (having different ideas, drawing upon different elements of a repertoire of conceptions assumed to be represented somehow in the brain). If Tajinder had simply been asked about bonding in a particular molecule near the start of his course, and again near the end, it is possible that he would have offered quite different forms of explanation (perhaps leading to an inference that learning had taken place as he had abandoned one idea and adopted another). However, the data collected suggest it is equally possible that he would have offered similar forms of explanation on both occasions, based on the same underlying explanatory principle (potentially leading to an inference that learning had not taken place and that a tenacious alternative conception had gone unchallenged). In-depth collection of data over an extended period showed that neither inference would be adequate, as the learning process was more complex. There was learning, as shown in the change in the profile of responses given at different points in his studies – but it could not be summarised as a simple matter of conceptual substitution (Taber, 2001).

Research as an intervention in learning

One of the inherent tensions in work in the area of learning progressions concerns the difference between descriptive and prescriptive accounts (Alonzo and Gotwals, 2012). Learning progressions can be understood either as the usual progression of learning within a particular curriculum context (descriptive) or as a model to guide teachers on what is viable as target knowledge at different grade levels when working towards more scientific models (prescriptive). The latter notion is based upon research showing simple shifts from ignorance or alternative conceptions to scientific understanding are often non-viable, and progress depends upon using intermediate conceptions (Driver, 1989) or models as ‘stepping stones’ (Watts and Bentley, 1991). This raises issues of what might be considered acceptable as an intermediate conception in a particular learning progression. For example, designers of a learning progression focusing on the cycling of carbon in the ecosystem might well suggest that the notion that “light energy becomes chemical bond energy” (Jin and Anderson, 2012, p. 169) is “the scientific account that successfully traces energy in photosynthesis” (p. 170), but most chemists would suspect that this statement is likely to support the adoption or retention of the common alternative conception that bonds are somehow stores of energy.

If looking to develop a descriptive account, one might imagine that a purely observational study exploring the ‘natural’ progression of student thinking about chemical ideas would be indicated. Indeed, one of the key distinctions made in describing educational research is between naturalistic studies (which seek to explore how things are, without influencing what is to be studied) and interventionist studies (which deliberately seek to change the state of affairs, and evaluate the intervention). Naturalism seeks to observe ‘given’ situations (Kemmis, 1980), but this can be a frustrating restriction. The developmental psychologist (or genetic epistemologist, as he framed his work) Jean Piaget is well known for introducing the clinical interview, which has indirectly acted as a model for many research interviews in areas such as exploring learner thinking in science topics. Yet Piaget (1959/2002), having published as a biologist while still a student, initially set out to do a natural observation study. Several frustrating weeks of following a child around school, waiting for the target student to do or say something revealing, led Piaget to develop a more direct method: sitting the child down and asking some well-sequenced questions about the topic of interest.

However, as soon as the researcher probes the learner's thinking, she inevitably intervenes in the natural course of that thinking, and the thinking that is ‘revealed’ – that is actually inferred from the learner's behaviour offering observable representations (Taber, 2013) – is cued by the specific probes and questions the researcher presents. Piaget (1929/1973) himself recognised that some answers children gave to his questions were romanced: that is, the child made up a feasible response to a novel question. Romanced responses may clearly be of interest, as they draw upon the cognitive resources the interviewee has available (Hammer, 2004), but they do not reveal existing stable patterns of thought. The difficulty of knowing whether what is cued in a research interview is, or is not, a core and stable feature of a participant's thinking about a topic is probably responsible for some of the debates about the status and significance of what was actually being reported in accounts of students’ misconceptions or conceptual frameworks (Taber, 2009). Of course, stable patterns of thought have their origin in something more tentative and provisional: once a learner has constructed a romanced response in situ within an interview context, as a means to answer some previously unconsidered question, that novel idea may then come to be adopted as a more regular part of the learner's thinking. Research activities, such as answering diagnostic tests or being interviewed, are learning opportunities – and so students may learn from them.

Such learning is not necessarily a bad thing in itself: I recall an interview where a student in effect ‘invented’ the idea of van der Waals forces. The interview questions probing what the student did know led to her proposing an interaction she had not yet been formally taught about. Interview questions, like teachers' questions, can lead to a kind of Socratic dialogue that allows the participant to construct new knowledge structures by reorganisation and juxtaposition of existing knowledge. This reflects a deliberate teaching technique used to scaffold learning (Wood, 1988), where the teacher provides an activity to highlight and reorientate aspects of existing knowledge that provide the foundations for new learning, so acting as a scaffolding ‘plank’ (a platform for new knowledge) (Taber, 2002). The student clearly did not suggest this novel (for her) idea would be called van der Waals forces, but it seems likely that when she was later taught about this concept her earlier insight will have provided an anchoring point for, and been reinforced by, the new learning.

In research contexts we are attempting to explore rather than teach, and often it is methodologically unsound to give feedback (the researcher will often say something like ‘I want to know what you are thinking, so there are no right or wrong answers’), especially when the research is longitudinal and we are interested in seeing how the learner's ideas develop over time. However, when students concoct original, non-canonical ideas that go unchallenged in research interviews, the interview may act as a learning intervention, supporting the learning of alternative conceptions. This raises the question of when researchers have an ethical responsibility to intervene and feed back to the participant that ideas revealed in research interviews should be revisited and perhaps reconsidered. This issue certainly arose when I was interviewing one of my own students (‘Annie’) and found she had developed an alternative framework of ideas deriving from a misconstruing of what charge symbols denote in chemistry (Taber, 1995).

Longitudinal and microgenetic research

The issue of research distorting the natural course of learning is especially problematic when undertaking research to see how learning leads to changes in knowledge, thinking and understanding over time. If we are only interested in gross measures then cross-sectional studies may suffice – although these are challenging given the complexity of learning. If we have a sample of, say, 14 year-olds in some educational context and a ‘comparable’ sample of, say, 16 year-olds in the same context, then we can compare them, and so perhaps make inferences about the learning occurring in that context over the two years from 14 to 16 years of age. But comparability is not straightforward. A sample within a school may, for example, be subject to changes in school leadership and organisation which influence learning, such that the learning environment experienced by the 14 year-olds is quite different from that which the 16 year-olds experienced two years earlier. Differences between the two samples then do not simply represent changes taking place over the two years.

A national sample of sufficient size and representativeness may allow such local changes to tend to average out – as long as there are not systemic changes that undermine the study. It would be difficult to undertake such work in England, for example, where curriculum, school organisation, examination specifications, assessment modes, and the like, tend to be in such flux that making any kind of comparison over time is unlikely to ever simply reflect learning between two age levels.

This is not a problem in longitudinal studies, as the same individuals are investigated at different points in time. Whether learners who are co-operative and generous enough to offer the gift of data regularly are representative of wider cohorts is, however, a question that must be borne in mind. Moreover, as suggested above, whether their thinking can be considered typical of students under normal teaching conditions, once they have been through regular episodes of having their ideas probed and questioned, is even more dubious. I was blessed in my doctoral research to have one participant (Tajinder, see above) who was prepared to be interviewed at length numerous times over an extended period of time. This was in part because he himself recognised that the research sessions were learning opportunities and could help him in his studies (Taber and Student, 2003). The detailed data allowed me to identify shifts over time that were subtle – not a switch from one idea to another, but a slow evolution in the profile of explanations offered to describe aspects of chemical bonding and related concepts. Even if Tajinder's generosity and commitment to the research did not make him an outlier, the many hours of one-to-one conversation about his understanding of chemistry inevitably undermine any claim that his progression in the subject was typical.

This issue becomes especially pertinent in research designed to be microgenetic. Microgenetic studies (Brock and Taber, 2017) deliberately implement a high frequency of observations (and often in practice, observations of activity that need to be considered interventions) at a point where a learner is suspected of being ready to show development. This can help show whether learning occurs smoothly, through discontinuity, or in a somewhat jittery way with plenty of backsliding. It is difficult to explore such issues without the high intensity of observations – but at the cost of investigating a somewhat artificial situation. A microgenetic study of a toddler learning to walk could be naturalistic – a microgenetic study of a student's progress in balancing chemical equations is probably going to rely upon setting up a series of opportunities for frequent observation.

Individual differences are known to be very significant in science learning, so we should not assume that we can take any learner as typical. It is possible to study learning pathways by observing whole classes over time, but even here resource limitations are unlikely to allow both the depth needed to explore individual thinking and breadth across a range of students. The careful selection of cases can seek to avoid obvious atypicality, even if this has to be moderated by pragmatic considerations – such as working with students who are willing to be interviewed, or students who contribute enough in class that they can be regularly observed using their ideas (Petri and Niedderer, 1998).

Investigating teaching sequences

A more naturalistic context might be possible in studies of the effectiveness of teaching sequences, although even here studies are likely to have to be selective in what they focus on (Duit et al., 1998). Here the intervention in learning is the teaching, not the research, and data may be collected relating to normal classroom activities within the usual curriculum and timetable context.

Teaching is a complex undertaking, and if individual students can be idiosyncratic, so can lesson sequences taught by particular teachers with specific classes. Some methodologies acknowledge this. Design research assumes an iterative process, where what is learnt from one iteration informs the next version – undertaken with a different class (Ruthven et al., 2009). Lesson study often involves teachers from different institutions observing and iteratively developing (and taking turns in teaching) versions of the same lesson in different classrooms (Allen et al., 2004), although the focus is usually on a single lesson. Evaluation of teaching effectiveness meets some of the problems alluded to above: given the contextual, complex, and on-going nature of learning, it may be difficult to produce evidence that will seem convincing to outsiders. Every student, class, school, etc., is different, so simple testing of learning gains in a quantitative sense offers a simplistic basis for comparison across learning contexts. Large scale studies that can test teaching sequences at scale are seldom viable, and in any case may disguise the contextual factors that sometimes lead to different approaches working to different degrees in different classes (Guthrie, 1977). In-depth case studies presented with ‘thick’ description (Geertz, 1973) may offer better narratives and reader generalisation, but lack statistical generalisability.

Challenges for the research community

One apparent message from this editorial then is that research into learning progressions and the development of effective teaching sequences is difficult, challenging work, where we are unlikely to readily obtain definitive results. Yet that is no reason not to take up the challenge. Learning is the core objective of educational work, and teaching is how we do that work; and understanding more about the complexity of these processes should be a central aim of research. It is possible to see a continuity in much research in science education over the past half-century. This tradition has characteristics of a research programme gradually building towards enabling the kinds of studies that can characterise learning over time and support the development of research-informed teaching sequences (Taber, 2009). Work in chemistry education that looks at the nature of progression in learning our subject is now well underway (Sevian and Talanquer, 2014). The 2018 theme issue offers an opportunity to take stock of work in chemistry education focused on these central issues. I am very much looking forward to seeing how the chemistry education community is responding to these challenging but fundamental research foci.

References

  1. Allen D., Donham R. and Tanner K., (2004), Approaches to Biology Teaching and Learning: Lesson Study—Building Communities of Learning Among Educators, Cell Biol. Educ., 3(1), 1–7.
  2. Alonzo A. C. and Gotwals A. W., (2012), Learning Progressions in Science: Current Challenges and Future Directions, Rotterdam: Sense Publishers.
  3. Brock R. and Taber K. S., (2017), The application of the microgenetic method to studies of learning in science education: characteristics of published studies, methodological issues and recommendations for future research, Stud. Sci. Educ., 53(1), 45–73, DOI: 10.1080/03057267.2016.1262046.
  4. Driver R., (1989), Students’ conceptions and the learning of science, Int. J. Sci. Educ., 11(special issue), 481–490.
  5. Duit R., Roth W.-M., Komorek M. and Wilbers J., (1998), Conceptual change cum discourse analysis to understand cognition in a unit on chaotic systems: towards an integrative perspective on learning in science, Int. J. Sci. Educ., 20(9), 1059–1073.
  6. Geertz C., (1973), The Interpretation of Cultures: Selected Essays, New York: Basic Books.
  7. Guthrie J. T., (1977), Research Views: Follow through: A Compensatory Education Experiment, Read Teach., 31(2), 240–244.
  8. Hammer D., (2004), The variability of student reasoning, Lecture 3: manifold cognitive resources, in Redish E. F. and Vicentini M. (ed.), Research on Physics Education, Bologna/Amsterdam: Italian Physical Society/IOS Press, pp. 321–340.
  9. Jin H. and Anderson C. W., (2012), Developing assessments for a learning progression on carbon-transforming processes in socio-ecological systems, in Alonzo A. C. and Gotwals A. W. (ed.), Learning Progressions in Science: Current Challenges and Future Directions, Rotterdam, The Netherlands: Sense, pp. 151–181.
  10. Kemmis S., (1980), The Imagination of the Case and the Invention of the Study, in Simons H. (ed.), Towards a Science of the Singular: Essays about Case Study in Educational Research and Evaluation, Norwich: Centre for Applied Research in Education, University of East Anglia, pp. 96–142.
  11. Petri J. and Niedderer H., (1998), A learning pathway in high-school level quantum atomic physics, Int. J. Sci. Educ., 20(9), 1075–1088.
  12. Piaget J., (1929/1973), The Child's Conception of The World, (Tomlinson J. and Tomlinson A., Trans.), St. Albans: Granada.
  13. Piaget J., (1959/2002), The Language and Thought of the Child, 3rd edn, London: Routledge.
  14. Ruthven K., Laborde C., Leach J. and Tiberghien A., (2009), Design Tools in Didactical Research: Instrumenting the Epistemological and Cognitive Aspects of the Design of Teaching Sequences, Educ. Res., 38(5), 329–342, DOI: 10.3102/0013189X09338513.
  15. Sevian H. and Talanquer V., (2014), Rethinking chemistry: a learning progression on chemical thinking, Chem. Educ. Res. Pract., 15(1), 10–23, DOI: 10.1039/c3rp00111c.
  16. Taber K. S., (1995), Development of Student Understanding: A Case Study of Stability and Lability in Cognitive Structure, Research in Science & Technological Education, 13(1), 87–97.
  17. Taber K. S., (2000), Multiple frameworks? Evidence of manifold conceptions in individual cognitive structure, Int. J. Sci. Educ., 22(4), 399–417.
  18. Taber K. S., (2001), Shifting sands: a case study of conceptual development as competition between alternative conceptions, Int. J. Sci. Educ., 23(7), 731–753.
  19. Taber K. S., (2002), Chemical Misconceptions – Prevention, Diagnosis and Cure: Theoretical background, London: Royal Society of Chemistry, vol. 1.
  20. Taber K. S., (2009), Progressing Science Education: Constructing the scientific research programme into the contingent nature of learning science, Dordrecht: Springer.
  21. Taber K. S., (2013), Modelling Learners and Learning in Science Education: Developing representations of concepts, conceptual structure and conceptual change to inform teaching and research, Dordrecht: Springer.
  22. Taber K. S. and Student T. A., (2003), How was it for you? the dialogue between researcher and colearner, Westminster Studies in Education, 26(1), 33–44.
  23. Watson J. B., (1967), What is Behaviourism? in Dyal J. A. (ed.), Readings in Psychology: Understanding human behavior, 2nd edn, New York: McGraw-Hill Book Company, pp. 7–9.
  24. Watts M. and Bentley D., (1991), Constructivism in the curriculum. Can we close the gap between the strong theoretical version and the weak version of theory-in-practice? The Curriculum Journal, 2(2), 171–182, DOI: 10.1080/0958517910020206.
  25. Wood D., (1988), How Children Think and Learn: the social contexts of cognitive development, Oxford: Blackwell.

This journal is © The Royal Society of Chemistry 2017