Steve Gutsell* and Paul Russell
Unilever Safety and Environmental Assurance Centre, Colworth Science Park, Sharnbrook, Bedfordshire MK44 1LQ, UK. E-mail: steve.gutsell@unilever.com; Fax: +44 (0)1234 222632; Tel: +44 (0)1234 264849
First published on 29th May 2013
The Adverse Outcome Pathway (AOP) conceptual framework has been presented as a logical sequence of events or processes within biological systems which can be used to understand adverse effects and refine current risk assessment practice. This approach shifts the risk assessment focus from traditional apical endpoints to the development of a mechanistic understanding of a chemical's effect at a molecular and cellular level. In order to obtain this level of detail, chemistry in all its disciplines has a key role to play. Measurement techniques will be important in understanding chemical characterisation, free concentration and exposure at the site of interest. Such measurements will be vital in developing structure-based toxicological alerts and informing predictive models. This paper explores the areas where chemistry will be influential in the development of AOPs.
Dr Steve Gutsell, Ph.D., has been with Unilever's Safety and Environmental Assurance Centre for over 9 years. He has a background in Organic Chemistry and expertise in Computational Chemistry, specifically the use of predictive methods such as (Quantitative) Structure–Activity Relationships ((Q)SAR), read across and other techniques to predict both toxicological and ecotoxicological endpoints from chemical structure. He has published several papers in this area and presented at numerous international scientific and regulatory meetings. Recent areas of interest include how pathways-based approaches can be used to create novel risk assessments for consumer products.
Dr Paul Russell, Ph.D. CChem MRSC, has over 15 years industrial experience in analytical chemistry, working in the pharmaceutical industry and contract research before joining Unilever's Safety and Environmental Assurance Centre in 2004. He is a technical specialist in liquid chromatography and mass spectrometry and has published a number of peer reviewed articles in this area and regularly presents at international conferences. Dr Russell currently has a specific focus on the development of mechanistic chemistry based approaches to support pathways based toxicological risk assessments for new materials. He is also Secretary of the Separation Science Group of the Royal Society of Chemistry.
The science that underpins the safety risk assessment of consumer goods is currently undergoing one of the largest paradigm shifts in recent history.2,3 To a large extent this shift is being catalysed by the need to assess increasing numbers of chemicals with fewer resources. Increasing public and political concerns regarding the use of animal studies for assuring the safety of new chemicals have also driven extensive research into non-animal methodologies.4 Whilst progress has been made for several toxicological endpoints there still remain many gaps in the knowledge. There is also a desire to reduce the uncertainty inherent in many current risk assessment practices e.g. from extrapolation of effects interspecies, from interindividual variation, and from one route of exposure to another. Many of these uncertainties are addressed through the application of factors (uncertainty/safety/assessment) to the risk assessment equations. However, the extent to which these factors accurately account for such uncertainties is, at best, variable.5
Numerous approaches have been put forward to address individual elements of this overall challenge.6 However, until recently it was extremely difficult to see how these individual pieces of the puzzle might fit together to achieve the overall aims. The pathways concept attempts to do just that. Whilst different terminology and levels of detail have developed in different areas depending on the scope of the approach7 (see Fig. 1), the basic premise is the same: by understanding more about the chemical and biological mechanisms involved at different levels of biological organisation and how they are related it should be possible to predict outcomes at higher levels of organisation from information obtained at lower levels.
Fig. 1 Scope of pathways approaches (adapted from Crofton 2010).
To illustrate the pivotal role that chemistry as a multi-faceted discipline has to play in these pathways-based approaches, this paper will focus on the main areas of application in an Adverse Outcome Pathways (AOP) based approach to refining risk assessment.
Fig. 2 (A) General AOP framework, (B) narcosis AOP for aquatic toxicity, (C) skin sensitisation AOP in mammals. (A) and (B) adapted from Ankley et al. 2010 (ref. 6). (C) Adapted from Fig. 3 ‘Flow diagram of the pathways associated with skin sensitization’ (p. 27) from OECD (2012) Series on Testing and Assessment No. 168 ENV/JM/MONO(2012)10/PART1 The Adverse Outcome Pathway for Skin Sensitisation Initiated by Covalent Binding to Proteins http://search.oecd.org/officialdocuments/displaydocumentpdf/?cote=env/jm/mono(2012)10/part1&doclanguage=en.
The initial interaction, or Molecular Initiating Event (MIE), can provide a link to subsequent outcomes/effects at different levels of biological organisation and other key dimensions such as gender or life stage. It is readily accepted that a single MIE can lead to multiple adverse outcomes and vice versa. Prior to the occurrence of an MIE it is critical to understand the chemical source and conditions of the exposure scenario in order to fully understand the interaction occurring at the MIE itself. This includes an understanding of the possible speciation, metabolism and/or degradation of a chemical prior to the MIE which may render it more or less active.
The AOP approach allows commonalities to be leveraged across the human health and environmental domains, breaking complex toxicology down into focussed biological and chemical processes which allow the influence of a chemical and its derivatives to be more readily understood. This includes building an appreciation of the influence of kinetics at each stage of a pathway to enable qualitative modelling to become quantitative.13 Consideration of this dose–response, or response–response, relationship at each organisational level will be required to determine whether the biological self-protection processes are overcome, resulting in an adverse rather than adaptive response mechanism.14 The degree to which this complexity at the intervening levels of organisation is required to be understood will vary according to the needs of the assessment in question. For example, it could be that an understanding of the effects of a chemical at an organelle level are sufficient to inform an early screening risk assessment.
Chemistry in all its disciplines (i.e. experimental, predictive, theoretical and computational) has long been a key enabler in safety risk assessment. The ability to link an adverse effect to a measured dose of a defined molecular structure requires the use of chemical techniques, typically analytical experimental methods, to measure the input and output of such a study.
In order to meet the demands of a pathways-based approach to risk assessment, an in-depth understanding of many currently poorly understood biological processes will be required. This challenge is made easier through the inherently structured approach that the AOP framework presents. Applying existing chemistry-based technologies in novel ways, and maximising the use of established knowledge, will allow a bespoke, fit-for-purpose, integrated approach to testing and prediction to be developed.
Advances in computer processing power are allowing the calculation of quantum chemical parameters, even from ab initio techniques, on standard computing equipment. This has greatly increased the accessibility of 3-dimensional descriptors necessary to model protein–ligand interactions and Quantum Mechanical (QM) descriptors related to electrophilic/nucleophilic reactivity. On the negative side, the number of descriptors that can be calculated in seconds is now vast and can result in QSAR models that are very difficult to interpret. In addition, the use of complex descriptors and statistical methods with little thought as to the mechanistic interpretation of a model has greatly damaged the reputation of QSAR techniques. The need to improve the transparency and interpretation of QSAR models has been widely accepted,20 but there has still been a proliferation of models published which do not add to the understanding of the mechanisms involved in toxicological processes.21 The result is that for human health endpoints QSAR methods have largely been restricted to use as early screens or as a means of prioritising chemicals for subsequent testing.
A chemical's electrophilic reactivity is thought to be responsible for a number of different potential adverse effects. Whilst the use of computational approaches to understand reactivity and develop predictive models has become more prevalent,22 the need for high quality experimental measurements of reactivity has been largely overlooked. To allow the shift to an AOP-based approach to risk assessment and the development of predictive models, the interdependence between experimental measurements and predictive modelling (at each level of organisation) requires increased emphasis. Clearly this area offers many opportunities for chemists and (eco)toxicologists to work together to develop high quality data sets of chemical properties to act as training sets for predictive models.
Often simpler models that take the form of structure–activity relationships are more accepted as inputs to risk assessment as they are transparent and easy to implement. There are many examples of collections of these SARs that have been combined into either commercial or free software tools.23,24 A further application of SAR-type rules is in read across. This approach is widely applied as a means to fill gaps in hazard data packages for chemicals with closely related analogues which do possess data. There is much debate around the best approaches to read across and its limitations. Again the structured nature of the AOP approach can be used to provide increased evidence of the validity of a read across proposal. It will also allow informed decisions to be made about the need for further testing at higher levels of biological organisation.
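A minimal sketch of nearest-analogue read across is shown below. The fragment sets, analogue names and endpoint values are hypothetical; in practice, the choice of similarity measure and analogue justification would be supported by the AOP evidence described above.

```python
def tanimoto(a: set, b: set) -> float:
    """Tanimoto (Jaccard) similarity between two structural-fragment sets."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def read_across(target_frags: set, analogues: dict):
    """Fill a data gap for a target chemical by adopting the measured
    endpoint value of its most similar analogue.
    `analogues` maps name -> (fragment set, measured endpoint value)."""
    name, (frags, value) = max(
        analogues.items(),
        key=lambda kv: tanimoto(target_frags, kv[1][0]))
    return name, tanimoto(target_frags, frags), value

# Illustrative (hypothetical) analogue data set
analogues = {
    "analogue A": ({"phenol", "chloro"}, 2.1),
    "analogue B": ({"phenol", "nitro", "chloro"}, 2.6),
}
name, similarity, value = read_across({"phenol", "nitro"}, analogues)
```

Reporting the similarity score alongside the adopted value makes the limitations of the proposal explicit, supporting the informed decisions about further testing that the text calls for.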
An effective characterisation of a chemical should give detailed information about its physical and chemical composition along with identification and quantification of any impurities present. A thorough knowledge of a chemical under investigation is the foundation for good experimental design. It also helps with interpretation of the results, which is imperative for the success of any in vivo or in vitro study. However, this is an issue that is often not addressed sufficiently, or even overlooked completely. Inputs into predictive models such as (Q)SARs tend to be single chemical structures, yet the chemicals used in the experiments that generate the parameters feeding into these models, either knowingly or unknowingly, rarely have such high purity.
A comprehensive characterisation study may include chromatographic techniques (HPLC, GC), spectroscopic techniques (NMR, FTIR, MS) along with traditional wet chemistry measurements (pH, pKa, log P), but will more likely be a combination of a number of these. Low sensitivity techniques such as Nuclear Magnetic Resonance (NMR) and Fourier Transform Infra-Red (FTIR) spectroscopy may be employed early on in the research process to support screening activities. More sensitive techniques (i.e. Mass Spectrometry (MS)), although equally capable at the screening stage, can be used to unravel more complex problems such as identification of low level impurities or metabolites. Biologically based in vitro assays, like in vivo assays, involve multiple stresses on a dosed chemical (e.g. metabolism, degradation, binding, pH effects) and even for a well characterised chemical the outcome is often complex to interpret. It is certainly not possible to fully understand the results of an experiment without sound knowledge of the chemical being presented, and analytical chemistry has a key role to play here.
Chemical purities often need to be determined to low levels and the role of any salts or counter-ions understood to predict or interpret results from in vitro assays. Impurities that are detected and unavoidable should likewise be characterised and NMR spectroscopy is particularly effective here as it has both qualitative and quantitative capabilities, although this may then lead to subsequent MS studies if sensitivity is ultimately found to be the limiting factor. Generally with modern equipment the lower sensitivity techniques are adequate for detecting impurities at levels suitable to support in vitro assays. According to the International Conference on Harmonisation (ICH) guidelines for pharmaceuticals,26 identification of impurities below 0.1% (for a 2 g day−1 dose) is not necessary unless there is evidence of their toxicity. The level at which impurities can be detected will depend largely on the impurity in question and the detection system, but when attempting to gain mechanistic understanding from in vitro assays a simple awareness of the potential effect of low level impurities can be an advantage when interpreting results.27 Hence a pragmatic evidence-based approach should be taken to the characterisation and risk assessment of low level impurities guided by experimental methods and tools such as the TTC.28
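The threshold logic above can be captured in a trivial helper. This is a deliberate simplification using only the single figure cited in the text (0.1% for a 2 g day−1 dose); the actual ICH guidance sets dose-dependent reporting, identification and qualification thresholds, so the function name and defaults here are illustrative assumptions.

```python
def needs_identification(impurity_pct: float,
                         evidence_of_toxicity: bool = False,
                         threshold_pct: float = 0.1) -> bool:
    """Flag whether an impurity warrants identification, using the 0.1%
    identification threshold cited above for a 2 g/day dose (simplified:
    real ICH thresholds vary with maximum daily dose)."""
    return evidence_of_toxicity or impurity_pct > threshold_pct
```

The `evidence_of_toxicity` override reflects the guideline's caveat that even sub-threshold impurities must be identified where toxicity is suspected.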
Biological interactions are often driven by a chemical's three-dimensional conformation, and hence knowledge of structural isomers is important, although these can be challenging to detect. Chemical stability should be well understood, not only in its natural physical state, but also in solution, formulation and within the relevant biomimetic system.
Time spent understanding the chemical of interest will facilitate better understanding of any experimental output through the reduction of uncertainty. In any multi-parameter experimental assay it is critical to identify potential sources of variability and ensure any controlled variables such as the input chemical are well understood. This in turn leads to the development of robust predictive models by improving the quality of the data used to build them.
Fig. 3 Wet/dry cycle for development of in silico models.
The predictivity and applicability domain of such models will clearly improve as the size of the training sets increases. Pragmatic decisions will need to be made as to when a model is fit for purpose. This will obviously depend on the risk assessment decision in question.
As the AOP conceptual framework is predicated on the need to understand the links between effects at different levels of organisation, it follows that it should be possible to relate events at the MIE level to those further downstream on the pathway. Indeed this has been the fundamental concept behind the development of many (Q)SARs. By understanding the quantitative (or qualitative) interconnections between the levels of organisation it will become apparent how far downstream MIE information alone will allow predictions to be made. This should prevent the production of models that attempt to make predictions without a mechanistically plausible connection to an appropriate endpoint.
Once again the importance of a thorough understanding of the actual amount of chemical eliciting a response at the MIE should not be underestimated as this will underpin any predictions of events further down the pathway.
It is evident that in vitro assays are seen as a critical tool in advancing our knowledge of AOPs so long as their outputs are well understood in terms of their toxicological significance.34 Traditionally the nominal dose introduced to the system has been used to describe chemical exposure within in vitro assays without consideration of how much chemical is available to act on a cell, or a target within a cell. This introduces significant error when extrapolating from in vitro toxicity data to in vivo scenarios where the exposure at the target site is expected to be inherently different. Risk assessment approaches can be refined through quantitatively understanding the free chemical available to act within the in vitro experiment.35 This has been identified as a key priority by the U.S. National Research Council in their strategic vision for toxicity testing in the 21st century.3 Fig. 436 illustrates the factors affecting free concentration, which include binding to components in the cell media such as serum proteins37–40 and binding to the glass or plastic in solid supports and labware.41–43 Chemicals are often dosed into in vitro assays in mixed solvent systems to aid their solubility, but subsequent introduction into a wholly aqueous cellular system can cause precipitation, again reducing the amount of chemical freely available. In addition, non-target binding can occur within the cell itself (i.e. to membranes and organelles) and cell metabolism or degradation can lead to a reduction in the applied dose of a chemical.
Fig. 4 In vitro to in vivo extrapolation. Understanding the interactions and an awareness of the dynamic equilibria occurring within in vitro assays is essential when using data to extrapolate to in vivo situations (adapted from Kramer et al. 2012). Emphasis should be placed on understanding the free concentration available to reach the target and have an effect.
These potential in vitro losses will apply to both human and environmental toxicology assays. The chemical specific equilibria that exist between non-targeted binding and losses make the determination of actual target concentrations particularly challenging, with any analytical sampling procedures potentially adding further complexity. Small volume extraction techniques such as solid phase micro extraction44,45 can be employed to provide highly specific, matrix compatible extractions at focussed target sites within an assay with minimal disruption to the in vitro assay equilibria.46
In order to reproducibly and accurately predict in vivo effects from in vitro data, a key enabler for AOP-based toxicological risk assessment, these non-target interactions are important parameters to consider. A sequence of in vitro experiments should be designed to determine the free concentration, whether by direct analysis or by developing a quantitative understanding of some or all of the non-targeted interactions described in Fig. 4. Physical chemical parameters (measured and predicted) of the chemical can potentially be used to predict many of these interactions. Through an improved understanding of free concentration the validity of the many predictive models that will be developed from in vitro assay data will be improved.
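The non-target interactions of Fig. 4 can be approximated, under the strong assumption of equilibrium partitioning, by a simple mass balance: the nominal dose distributes between the free aqueous phase, serum protein and labware plastic in proportion to their partitioning capacities. The coefficient values below are invented for illustration; in practice they would be measured or predicted from physical chemical parameters as the text suggests.

```python
def free_concentration(nominal_umol: float,
                       v_medium_l: float,
                       k_protein_l_per_g: float, protein_g: float,
                       k_plastic_l_per_cm2: float, plastic_cm2: float) -> float:
    """Equilibrium mass balance over an in vitro well: the nominal dose
    partitions between free aqueous phase, serum protein and plastic.
    Each sorbing phase contributes an 'effective volume' (capacity x amount).
    Returns the free aqueous concentration in umol/L."""
    effective_volume_l = (v_medium_l
                          + k_protein_l_per_g * protein_g
                          + k_plastic_l_per_cm2 * plastic_cm2)
    return nominal_umol / effective_volume_l

# With no binding phases, free concentration equals the nominal concentration;
# adding protein and plastic sinks reduces it (illustrative parameter values).
c_nominal = free_concentration(1.0, 0.001, 0.0, 0.0, 0.0, 0.0)
c_free = free_concentration(1.0, 0.001, 0.05, 0.004, 0.0001, 2.0)
```

This sketch ignores precipitation, metabolism and intracellular non-target binding, which the text notes can further deplete the available dose; a fuller model would add terms for each loss process.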
The inputs to PBPK models can be categorised as chemical independent (e.g. organ weights, blood flow etc.) and chemical dependent (e.g. plasma protein binding, various partition coefficients and metabolic clearance rates). A clear barrier to the use of this kind of model in environmental risk assessment is the range of different species that need to be considered. However, many elements of PBPK modelling are similar to those used in environmental fate models that also attempt to predict the kinetics of a chemical as it partitions to various environmental compartments and possibly biodegrades.
Whilst many QSAR models exist to provide some of the chemical dependent inputs to PBPK models for those chemicals without experimental values, the applicability domain of such models is somewhat restricted. In addition, methods to predict metabolism rates using either in silico or in vitro models are currently limited.48 It is also fair to say that many of the QSARs themselves rely on predicted descriptors as input (e.g. log P).49 In this situation it is clear that errors may be introduced into the final PBPK predictions at several stages. Acquisition of accurate measured data wherever possible as input to the PBPK modelling process will clearly remove some of the early sources of potential error, or at least assist in highlighting the source of error, and result in more reliable outputs.
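At its core a PBPK model is a set of coupled kinetic compartments; the minimal sketch below integrates a single compartment (IV bolus, first-order elimination) by forward Euler to show how the chemical dependent inputs discussed above (here volume of distribution and clearance, with illustrative values) propagate into concentration–time predictions. A real PBPK model would link many such compartments via blood flows and tissue partition coefficients.

```python
def simulate_plasma(dose_mg: float, v_d_l: float, cl_l_per_h: float,
                    t_end_h: float = 24.0, dt_h: float = 0.01):
    """One-compartment kinetic sketch: dC/dt = -(CL/Vd) * C,
    integrated by forward Euler from an IV bolus initial condition.
    Returns (times in h, plasma concentrations in mg/L)."""
    k_el = cl_l_per_h / v_d_l      # first-order elimination rate constant (1/h)
    c = dose_mg / v_d_l            # initial plasma concentration (mg/L)
    times, conc = [0.0], [c]
    t = 0.0
    while t < t_end_h:
        c += -k_el * c * dt_h      # Euler step of first-order decay
        t += dt_h
        times.append(t)
        conc.append(c)
    return times, conc

# Illustrative parameters: 100 mg dose, Vd = 10 L, CL = 2 L/h
times, conc = simulate_plasma(100.0, 10.0, 2.0)
```

Errors in a measured or QSAR-predicted clearance feed directly into `k_el`, which is the point the text makes about error propagation through the PBPK workflow.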
One of the major barriers to the use of PBPK modelling to understand relevant exposure from home and personal care products (amongst others) is the lack of a reliable method to predict skin penetration.50 As with other topics discussed previously there is a need for further quality data to be generated to facilitate the development of predictive models.51 This issue is further complicated by the knowledge that the nature of the formulation/vehicle in which a chemical is applied can have a large impact on the penetration. The current state of in silico modelling for topical exposure is very basic. This is largely due to the lack of data obtained under consistent comparable conditions and data designed to elucidate the effect of vehicle on penetration. Clearly this is an area that warrants further experimental work.52 In particular the influence of the physical–chemical properties of formulations on penetration or absorption should be investigated.
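One widely cited empirical starting point for the skin penetration problem is the Potts–Guy relationship, which estimates the permeability coefficient from log P and molecular weight. It is reproduced below as a sketch; note that it was fitted to aqueous-vehicle data, so it cannot capture the formulation/vehicle effects highlighted above, which is precisely why further experimental work is needed.

```python
def potts_guy_log_kp(log_p: float, mw: float) -> float:
    """Potts-Guy empirical estimate of the skin permeability coefficient
    kp (cm/h) from aqueous solution:
        log kp = -2.7 + 0.71*logP - 0.0061*MW
    Valid only within the physicochemical space of the original data set,
    and blind to vehicle/formulation effects."""
    return -2.7 + 0.71 * log_p - 0.0061 * mw
```

The model's two terms encode the competing effects the text implies: penetration increases with lipophilicity but decreases with molecular size.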
Experimental quantification of MIEs will lead to the development of QSARs for both the MIE itself and downstream effects. This quantitative understanding of an MIE together with an understanding of the events at higher levels of biological organisation will enhance the mechanistic interpretation of QSARs used to predict biological “endpoints” by limiting predictions to only those effects that are plausibly linked to the MIE. This should increase the credibility of QSAR approaches if they are presented as part of an integrated pathways-based risk assessment.
Generating data from multiple assays at different levels of biological organisation will result in a plethora of data that will require interpretation. Developments in the field of systems biology are being facilitated by advances in measurement science that enable the monitoring of biological processes at a molecular level in real time.26 In addition such analytical technologies, when applied correctly within chemical characterisation studies, can inform exposure assessment with high sensitivity. The development of an understanding of metabolism, degradation, speciation and fundamental physical–chemical properties are all challenges that can be addressed using the latest analytical chemistry approaches.
It is recognised that in vitro experimental results will be key to building mechanistic understanding of AOPs. However, when extrapolating in vitro results to in vivo risk assessment scenarios care should be taken to fully understand assay variables and the free concentration in vitro. In turn this improved chemical specific input to PBPK models will facilitate conversion of consumer exposure to free concentrations at target sites. The next step in the ambition would be to obtain a quantitative dose response at these target sites which will be key to the success of a truly refined AOP-based risk assessment.
The AOP conceptual framework is already proving useful in providing biological plausibility to observed correlations between chemical structure and biological effects. As more detail of more pathways becomes available this utility will also increase. Quantitative modelling of an entire pathway is a long term ambition. This will only be achievable after first understanding the pathway in a qualitative fashion. It will then be necessary to quantify the relevant interactions through the development of suitable assays, the results of which can be fed into holistic predictive models.
To address the myriad of challenges associated with the proposed changes to risk assessment, it will be necessary to bring the combined expertise and techniques of several chemistry-based disciplines together with those from other fields. Each of these disciplines has a role to play, but the exact nature of this role is only just becoming clear and may well be very different to that of the present situation. Rather than attempting to address all the complex questions of an AOP approach at once, an integrated chemistry approach can guide research and prioritise activities in a pragmatic, fit for purpose manner. This would allow the ultimate aims of replacing the use of animal studies, improving efficiency and reducing the uncertainty in current risk assessment practices to be achieved.
This journal is © The Royal Society of Chemistry 2013