Amanda S. Barnard*
Commonwealth Scientific and Industrial Research Organisation (CSIRO) Materials Science and Engineering & Future Manufacturing Flagship Clayton, Victoria, Australia. E-mail: amanda.barnard@csiro.au
First published on 13th August 2009
For the move from nanoscience to nanotechnology to be sustainable, it is important that the issues surrounding possible ‘nano-hazards’ be addressed before commercialization. The global push for more environmentally friendly, biodegradable products means that the introduction of the nanoparticles contained within these products into the ecosystem is inevitable. When this happens, it is desirable to know how their hazardous properties may be affected, and what the potential hazards are. In this article, a number of strategies will be discussed, combining the desirable aspects of theory, simulation, experiment and observation, and leading to predictions for incorporation into preventative frameworks. Particular attention will be given to the role of theory and computation, and to how they intersect with work in complementary fields.
Amanda S. Barnard | Amanda Barnard is a Queen Elizabeth II Fellow and leader of the Virtual Nanoscience Laboratory at CSIRO in Australia. She received her BSc (Applied Physics) in 2001 and her PhD (Physics) in 2003 from RMIT University, before going on to a two-year position as a Distinguished Postdoctoral Fellow in the Center for Nanoscale Materials at Argonne National Laboratory, followed by three years as a Fellow at the University of Oxford. Using thermodynamic theory and first principles computer simulations, her current research includes predicting the environmental stability of nanoparticles and their interactions with natural ecosystems.
The next generation of ‘in-demand’ smart products must be efficient, safe and (in more and more cases) environmentally friendly. Above all else, they must also be reliable and perform their function in a predictable way. Many of the nanoparticles engineered for the tasks above have no natural analogue, and are often artificially modified to induce specific functionality. This introduces a certain degree of unpredictability, since we are presented with a range of untested materials that are (literally) unique on an atomic scale, with no historical data to guide our assumptions regarding the possible hazards. It is arrogant and irresponsible to say on one hand that nanomaterials are novel and provide a range of properties (and possible applications) not observed at other lengthscales, and yet on the other to say that all of the undesirable properties are sufficiently predictable and pose no possible threat. There are cases where we can learn from nature, particularly in the areas of geomorphology and biotechnology, and there are entire areas of science already studying these cases. It is the cases where we cannot rely on these lessons that require specialist attention from the new fields of nanoscience and nanotechnology.
In general, the risks associated with engineered nanomaterials are combinations of hazards (often described in terms of severity) and exposure (described in terms of the likelihood, frequency and duration). Hazards may manifest as undesirable interactions with biological systems such as cytotoxicity, oxidative stress in tissue, pulmonary disorders due to inhalation, inflammation due to accumulation and, at a more fundamental biological level, such phenomena as protein mis-folding and damage to DNA.8–10 Moreover, even if a certain nanoparticle does not appear toxic by itself, the interaction between this nanoparticle and other common compounds in the human body may cause serious problems to cell functions. In addition to this, hazards may also manifest as undesirable interactions with natural ecosystems, including the introduction of air- or water-borne pollution, contamination of soils, detriment to the food chain and reactions leading to increases in salinity or chemical imbalances in natural resources that were not otherwise present. This is by no means an exhaustive list, but is intended to illustrate the diversity encapsulated by this term.
In general, much of the study of toxicity of nanosized particles is based on established research on airborne ultra-fine particles. In the past, ultra-fine particles were known to either occur naturally, or be introduced through human activities or industrial products. This led Günter, Eva and Jan Oberdörster to catalogue the natural and anthropogenic sources of nanoparticles (with average diameter <100 nm) in 2005, highlighting those produced intentionally and unintentionally by mankind.8 The natural sources include gas-to-particle conversions, forest fires, volcanoes, viruses, biogenic sources, and ferritin; and the unintentional sources include internal combustion engines, power plants, incinerators, jet engines, fumes (metal polymers, etc.), heated surfaces (such as in frying, boiling or grilling) and electric motors. The main difference between this field and the emerging field of nanotoxicology is the issue of the nanomaterials being engineered, and the fact that unlike the nanoparticles from the natural and unintentional sources, intentionally produced nanoparticles (made in laboratories) have not been around long enough for us to gather sufficient data to gauge their long-term potential for causing adverse effects.11,12
To demonstrate the importance of this issue, let us consider the life-cycle of a typical engineered nanoparticle. Using either chemical or physical synthesis techniques, experimentalists use parameters such as temperature, pressure and pH to produce the optimal laboratory conditions required to make nanoparticles with carefully controlled size, shape and surface structure. A variety of reactants, precursors and catalysts may be used, and the surfaces of the final products are often coated with atomic and molecular passivants or ligands to stabilize them. These nanoparticles are then stored in controlled environments, until they are integrated into devices or permanently introduced into products that are ultimately used by consumers. When consumers have finished using these products, they will eventually discard them, and, through drainage systems or landfill, the nanoparticles enter the natural environment and ecosystem. There the environment is not ‘controlled’, and may be significantly different from the laboratory, storage or operational environments that scientists and engineers took great care to develop and maintain.
The nanoparticles we engineer have a particular end-use in mind, and in many cases will be incorporated into macroscale products. Considering the current move to recyclable or biodegradable products, it is more realistic to assume that the introduction of these engineered nanoparticles into the natural world is an inevitability, rather than a ‘worst case scenario’ (Fig. 1). Therefore, in parallel to studies aimed at optimizing industrial performance, the toxicology of nanomaterials is also under scrutiny,13–15 along with the issues surrounding workplace safety,16,17 and consideration of possible environmental impacts.11,18–20 The International Association for Impact Assessment defines the development of environmental impact assessments as “the process of identifying, predicting, evaluating and mitigating the biophysical, social, and other relevant effects of development proposals prior to major decisions being taken and commitments made.”21 This is related to (but distinct from) the multi-criteria decision analysis and environmental risk assessment for nanomaterials, as described by the hypothetical case study of Linkov et al.22 The latter method utilizes a weighted decision matrix of criteria and performance scores to provide a systematic analytical approach for integrating risk levels, uncertainty, and valuation. It enables evaluation and ranking of many alternatives and provides a useful way of combining data from an array of different sources. However, currently there is insufficient data pertaining to the stability and reactivity of nanoparticles in the environment to adequately identify, predict or evaluate their impact to an acceptable level.23 This being said, considerable effort is being directed toward rectifying this situation.
Fig. 1 Given our diligent adoption of biodegradable products, the introduction of nanoparticles into the environment is more of a certainty than an unlikely possibility.
Currently attention is focussed on the interaction of nanomaterials with living organisms.12 In 2007, a report24 was prepared outlining the vision for the future of toxicity testing, recommending ways of ensuring more effective evaluation of cellular responses in a suite of toxicity pathway assays using high-throughput tests (based on available technologies). The report addressed issues pertaining to risk assessment (based on results of toxicity testing), dose–response modeling, and the role of computational system biology models in identifying toxicity pathways. The experts contributing to the report agreed on the need for a departure from the traditional high-dose animal-based toxicological tests, and the introduction of a new approach more firmly grounded in human biology.
This is a welcome suggestion, as it has already been shown that some nanomaterials have the potential to damage skin, brain and lung tissue, and to accumulate in the body,10 though consensus in some animal studies is still lacking and many issues need to be addressed before the field can progress. Reports on the toxicity of carbon nanotubes, for example, are delivering both good and bad news. An in vitro study measuring the impact of single-walled carbon nanotubes on mitochondria in human A549 cells reported spurious and inconsistent evidence of a cytotoxicity of approximately 50% using the water-insoluble salt MTT, and no cytotoxicity when using another water-soluble salt known as WST-1. No cytotoxicity was observed when using the salt INT, the dye TMRE and the antibody Annexin-V.25 However, nanotubes are also being compared to asbestos,16 and exposure of the mesothelial lining of the body cavity of mice to long multi-walled carbon nanotubes has recently been shown to result in asbestos-like, length-dependent, pathogenic behavior.26 It has been found that the functionalization of carbon nanotubes can reduce toxicity,27 but stable chemisorption on nanotubes is intrinsically linked to environmental factors such as pressure and temperature.28
Numerous reports are available summarizing the current knowledge on nano-hazards and nanotoxicity, and highlighting areas requiring attention.23,29–41 In addition to this, there has been a considerable increase in the number of reports, surveys, inquiries and articles highlighting possible hazards, coming from private bodies.9,41 Common concerns raised in these documents are the potential hazards associated with ‘dispersed’ or ‘isolated’ nanomaterials, as opposed to those already integrated into products and devices. This is due to the fact that many isolated nanomaterials are smaller than the biological systems with which they interact. The development of a preliminary framework for informing the risk analysis and risk management of nanoparticles was proposed in 2005,42 and in 2006 the International Risk Governance Council proposed a framework for global risk governance of nanotechnology, and recommended some models.43 Most recently, alternative approaches for managing the uncertainty associated with environmental risk have been published, based either on hazard or on exposure, from the perspective of experiences with chemicals.44
Clearly, the integrity of future nanodevices, our ability to anticipate failure of these devices, and our ability to manage the toxicological and environmental impacts when the devices are used in certain environments or are discarded, all require a detailed understanding of the stability of pure and functionalized nanomaterials under a full range of environmental conditions.10 Unfortunately, scientists are producing such a wide variety of engineered nanoparticles at such a rapid pace that the task of systematically measuring the stability of all possible compositions, sizes and shapes in different chemical environments (experimentally) is fast becoming unfeasibly large. It could already be measured on the order of human lifetimes.
Therefore, the motivation of this article is to describe ways that nanoscientists and nanotechnologists with a variety of skills and expertise can engage in this area, and to provide a broad overview of a scientific strategy for predicting hazards associated with nanomaterials. The intention is not to provide a detailed step-wise guide for studying a particular nano-system or nano-hazard, since (at this level) both selecting and applying suitable techniques will depend sensitively on the precise problem at hand. Rather, it is to show how to combine individual investigations in a logical and systematic way, and to highlight how a collaborative approach can yield a more complete understanding in the end. In particular, this article highlights a number of areas in which theory and computation can help develop our understanding of instabilities relevant to nano-hazards, and suggests ways in which simulations can be used in partnership with experimental approaches. Attention is also given to showing how scientific outcomes interface with other fields that are also involved in this area.
The ultimate goal of this scheme is to make a link between the nano-hazards, and Prevention, at the lower left corner of Fig. 2, via the Social Domain (risk assessment and regulation). In general, this domain deals with how the impact of a particular event is evaluated. This is usually done by estimating the probabilities of events occurring (likelihood), multiplying this by a measure of the consequence of the event (or a measure of the severity of the hazard), the frequency and duration (of exposure), and then multiplying by a cost function. Discussions relating to cost are beyond the scope of this paper, but it is useful to point out that ‘cost’ (from this perspective) cannot be uniquely defined. The ultimate cost in monetary terms (aside from social or environmental costs that no monetary value can be placed on), may not be estimable a priori.
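The multiplicative estimate described above can be sketched in a few lines of code. Note that the function and every input value below are hypothetical placeholders for illustration only; they do not come from any real assessment scheme.

```python
# Illustrative sketch of the risk estimate described in the text:
# risk = likelihood x severity x frequency x duration x cost
# All numbers below are invented placeholders.

def risk_score(likelihood, severity, frequency, duration, cost_weight=1.0):
    """Combine the factors multiplicatively, as described in the text."""
    return likelihood * severity * frequency * duration * cost_weight

# Hypothetical example: a low-probability, high-severity exposure scenario.
score = risk_score(likelihood=0.01, severity=8.0, frequency=2.0, duration=1.5)
print(score)  # 0.24 (with the default cost weight of 1.0)
```

Even this toy form makes the point in the text concrete: the same score can arise from a frequent, mild hazard or a rare, severe one, which is why the shape of the underlying distributions matters as much as any single number.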
Fig. 2 Schematic representation of the multi-disciplinary landscape that is navigated by those working on various aspects of (potential) nano-hazards, showing three ‘domains’ of Nature, Scientific Research, and Social Policy and Regulation.
In this context, prevention is achieved by either reducing the probability of occurrence (exposure) or the impact of the event. This can mean preventing exposure entirely or setting exposure limits, prescribing modification to the nanomaterials that will reduce or eliminate the danger, setting appropriate safety guidelines and labeling, designing containment and storage facilities, designing disposal systems, or introducing regulation. The issue of exposure and the calculation of the probability of an event occurring will not be dealt with here. This is a behavioural issue, and is predominantly the domain of social scientists, lawyers, and politicians, who are engaged in activities designed to protect the interests (and members) of the population and the environment. Although scientists may be in a position to advise colleagues in this space, the mechanisms are not in place for us to act on this advice autonomously.
Currently, within this space, the only reliable link between nano-hazards and prevention is termed Data Mining, which is based on experiences of actual events. While observations and data collected from occurrences are irrefutable, this route is highly undesirable since it requires that something dangerous has already happened. One does not need to look far into the past to see where mistakes have been made (such as asbestos and DDT) and recall the detrimental effects these mistakes have had upon human health and the economy. While it is true that we were able to learn from these mistakes and identify prevention methods going forward, clearly the cost was too great and the damage too severe to ever let this type of error happen again. Therefore the route given in the first column must be considered socially unacceptable, and we need to find a better way.
The reactivity of nanoparticles can exhibit a high degree of selectivity that depends sensitively on the material (both composition and solid phase), the size (surface-to-volume ratio), and on the shape, i.e. on the Nanomorphology. In addition to size, shape and phase, this also encompasses aspects such as the topology, symmetry and crystallographic forms presented, the growth direction of quasi-one dimensional nanostructures such as nanorods and nanowires, and the chirality of nanotubes. Many fundamental properties of nanomaterials have already been shown to have a strong dependence on their nanomorphology, such as quantum confinement48 and luminescent properties,49 the g-factors of semiconductor quantum dots50 and nanomagnetism;51,52 in addition to the functionality53 and catalytic properties54 that contribute to the reactivity.
Although the link between nanomorphology and reactivity falls within the domain of nature, there has been some success in engineering the morphology of nanoparticles so as to minimize or maximize certain interactions or reactive properties. The driving forces for variations in nanomorphology are either thermodynamic or kinetic, and include material parameters such as size,55 coatings and surfactants,56 supports and substrates,57 composition (dopants);58 and environmental parameters such as temperature,59 chemical environment,60 and exposure to electric fields61 or light.62 Since so many of the influencing factors have been identified, a detailed understanding of the relationship between each parameter and particle shape may be used to tailor the properties of nanomaterials on an individual and collective basis. This link is often exploited by those tailoring nanomaterials for specific applications, who routinely modify the size, shape and surface chemistry. Hence, we already have a reasonable understanding of how these two areas relate to one another, and how they relate to the local environment.
Both measurements and modeling fall squarely within the domain of scientific research, but since these areas are complementary, a refinement loop is formed as highlighted in Fig. 3. The usefulness of theory and computer simulations lies in their ability to accurately reproduce experiment, and this can only be verified via a structured system of comparison. Analytical theories, computational algorithms and simulation methods must all be constantly compared to real measurements to improve their accuracy and ensure reliability. Similarly, a systematic examination of all possible nanoparticle/environmental combinations using experimental methods has already become unfeasible, and a more targeted application of these resources is more scientifically responsible. It is in this regard that modeling can complement the experimental effort by identifying the most important areas to focus on, and identifying the underlying mechanisms to be tested.65
Fig. 3 Schematic representation of the refinement loop (in the ‘domain of scientific research’) where complementary approaches can combine to produce reliable predictions.
Recent examples of such work include the characterization of the surface reactivity of 3–6 nm ferrihydrite nanoparticles assembled within an iron-storage protein, using molecular orbital/density functional theory (MO/DFT) frequency calculations;66 the characterization of the surface reactivity of gold nanoparticles using extended Hückel theory combined with DFT calculations;67 and a DFT study of the nano-toxicological implications of oxygen adsorption at silver surfaces (including ab initio molecular dynamics).68 In the latter case the formation of superoxide at Ag(100) and Ag(111) surfaces was studied by explicitly calculating adsorption energies and structural parameters, as well as charge transfer. These results showed that O2− preferentially forms at an Ag(100) surface, making this surface more susceptible to reactive oxygen species (ROS) production, and the same approach could be applied to any active species at any metal surface.
Another pertinent example is nanosized titania, which is raising concerns due to the photocatalytic activity of its surfaces, which can produce reactive oxygen species.12 The minority {001} surfaces of the anatase polymorph of this photocatalyst have been shown to be particularly reactive (as opposed to the other, more dominant {101} facets).69 However, the fractional surface area attributed to these reactive facets within a given sample can change in response to its surroundings. Fortunately this has already been characterized in a nanoscale phase diagram.64 Therefore, by simply counting the number of potentially reactive sites on anatase {001}, and accounting for the changes in particle shape with size and temperature,70 it is possible to estimate the ROS production efficiency of anatase nanoparticles of any size, and under realistic conditions.65
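The site-counting idea above can be sketched as a simple back-of-envelope calculation. The facet-area fraction, site density, and spherical-particle approximation below are invented placeholders for illustration, not values taken from the cited phase diagram or papers.

```python
# A minimal sketch of the facet-counting estimate described in the text.
# The {001} area fraction and site density are hypothetical placeholders.
import math

def reactive_site_count(total_area_nm2, f_001, sites_per_nm2):
    """Estimate reactive sites from the fraction of {001} surface area."""
    return total_area_nm2 * f_001 * sites_per_nm2

# Hypothetical 10 nm anatase particle, approximated here as a sphere:
d = 10.0                               # diameter in nm (assumed)
area = math.pi * d ** 2                # surface area of a sphere, nm^2
sites = reactive_site_count(area, f_001=0.06, sites_per_nm2=5.0)
print(round(sites))  # ~94 reactive sites under these assumed inputs
```

In a real application the area fraction f_001 would itself be a function of size and temperature, read from the shape predicted by the phase diagram, rather than a constant as assumed here.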
Another advantage of including modeling in this strategy is that (once experimentally validated) one can rapidly sample a multi-dimensional parameter-space such as that occupied by this problem. Furthermore, since modeling is capable of doing this in a systematic manner, it can also provide much needed Explanation of the underlying mechanisms responsible for nano-hazards, such as the recent computer simulations showing fullerene translocation through lipid membranes.71 It should be noted that nano-hazards and explanation are represented at opposite extremes of the diagram in Fig. 2, serving to highlight the long way we have to go to bring these two concepts together.
Once both the measurements and modeling approaches have been validated, and a suitable partnership has been achieved, tasks may be selectively assigned to each of these paths, so as to maximize both scientific impact and the responsible use of available resources. Clearly, explicit measurements are the more pertinent route to characterizing the most common nanomaterials, and the most probable events (possibly identified by complementary modeling). This will ensure that all known (or unknown) environmental parameters are included and, as mentioned above, a natural dispersion of results is adequately accounted for.
Likewise, in addition to mapping nanomaterials’ interaction landscape, a suitable role for modeling is to probe extreme situations that are inaccessible to experimental measurements, either due to economic or physical limitations, and to test the most statistically improbable events. It may be argued that such results cannot be verified or validated, since experimental data is absent, so there is no assurance that simulations of extreme situations are reliable. However, a model that has been shown to be consistently reliable under a range of ‘testable’ scenarios is ultimately the only option we have in these cases, and one must weigh the potential for uncertainty associated with these simulations against the guarantee of uncertainty associated with doing nothing. Although they are unlikely, and our ability to model them may be imperfect, these events can be (environmentally, economically or sociologically) catastrophic and the omission of these possibilities from a study of this type is irresponsible.
By activating both paths in the refinement loop we can generate a complete scientific picture which provides a fundamental basis for Prediction. Both paths feed into this node, and both are equally important. In general, the combination of approaches leading to this point will yield a distribution of results, possibly centered around a numerical value. The shape of this distribution, and the magnitude of the uncertainties, will be unique for a given problem, but will hold an enormous amount of important information necessary for estimating the probabilities described briefly in section 2.1.
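The idea of pooling results from both paths into a single distribution, whose centre and spread then feed the probability estimates, can be sketched as follows. The sample values are invented for illustration; real inputs would be validated measurements and simulations of the same quantity.

```python
# Sketch of combining results from the measurement and modeling paths
# into one distribution whose centre and spread inform prediction.
# All sample values are hypothetical placeholders.
import statistics

experimental = [2.1, 2.4, 2.2]   # hypothetical measured values
simulated    = [2.0, 2.3, 2.5]   # hypothetical computed values

combined = experimental + simulated
centre = statistics.mean(combined)       # central value of the distribution
spread = statistics.stdev(combined)      # magnitude of the uncertainty
print(f"{centre:.2f} +/- {spread:.2f}")
```

The shape of this pooled distribution (here summarized by just a mean and a standard deviation) is what carries the information needed for the likelihood estimates described earlier.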
A current example of a (presumed) suitable format for predictions in this area is the well-known quantitative structure–activity relationship (QSAR) modeling technique. This is the process by which structure is quantitatively correlated with a well-defined process, such as biological activity or chemical reactivity.72 This technique assumes chemicals (or nanoparticles) with similar molecular structures have similar effects in physical and biological systems, and that the extent of an effect varies in a systematic way with variations in structure. It has been widely applied to the fields of drug discovery and chemical toxicology. Although nanoparticles with similar structures may exhibit wildly dissimilar properties and behave in an unpredictable way, the QSAR approach is now finding use in studies of nanotoxicity around the world. The more general strategy outlined here highlights where information pertaining to structure and activity (or re-activity) may come from, and how this data may be collected. However, in order for a QSAR result to be a suitable prediction for regulatory purposes: a ‘valid’ (relevant and reliable) model must be used, it must be reliably applicable to the nanomaterial of interest, and the endpoint should be relevant for the regulatory purpose. There is also the issue of how nanomaterials are grouped, and the identification of suitable structural and property descriptors. Clearly, although numerous reliable chemical QSARs have been developed over the years, a new generation of models should be tailor-made for the study of nanoparticles.
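At its simplest, the QSAR idea is a regression of an activity endpoint against one or more structural descriptors. The toy sketch below fits a single-descriptor linear relationship by ordinary least squares; the descriptor, activity values, and the linear form itself are invented placeholders (real QSARs use many descriptors and more sophisticated models).

```python
# Toy sketch of the QSAR idea: correlate a structural descriptor with an
# activity endpoint via least squares. All data are invented placeholders.

def fit_line(x, y):
    """Ordinary least-squares fit of y = a*x + b for a single descriptor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical descriptor (e.g. a reactive-facet fraction) vs. activity:
descriptor = [0.1, 0.2, 0.3, 0.4]
activity   = [1.1, 2.0, 3.1, 3.9]
a, b = fit_line(descriptor, activity)
print(a, b)  # slope 9.5, intercept 0.15 for these invented data
```

Once fitted (and properly validated), such a relationship is used in the other direction: predicting the activity of an untested structure from its descriptor, which is exactly the role the text envisages for regulatory-grade nano-QSARs.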
In general, there is still a lack of information on how results from computational physics, computational chemistry and computational materials science can be linked with a QSAR approach, and to what extent the methods are either complementary or competitive. It is possible that, given a suitable QSAR model, computational values (or distributions of values) could be used in combination with experimental values to provide a more complete set of inputs. Quantification of facet-dependent differential reactivity is one possibility, but this has yet to be shown to work in practice. The specificity of results obtained from computational studies may mean that a more dense set of structure–activity relationships is needed, and the increased complexity of the correlations may render the method impractical.
The route from nano-hazards to prevention, and the ultimate goal of making this journey without harm occurring, is more than just a multi-disciplinary problem. It is a multi-field problem. We can imagine the issue as a spherical core wrapped in concentric shells of collaboration, like an onion, with information effectively permeating the shells as we approach the solution outside. Scientific disciplines form the inner shell, where physics, chemistry, biology, materials science and engineering conduct investigations within their own field of expertise and share the knowledge. These types of collaborative efforts are relatively common in the fields of nanoscience and nanotechnology. In the next concentric shell is the collaboration between experiment, computer simulations and theoretical modeling (as described above), which is less common. And finally, in the outer-most shell, there is the collaboration of those engaged in scientific research with others from different fields outside of science, which is generally rare.
However, we find that an efficient strategy that covers all the bases will be as much an exercise in knowledge sharing and personal relationships as it will be in scientific discovery. Given the more specific, quantitative knowledge gained from theory and simulation, we can build predictive models that will enable the design of more appropriate storage systems, that protect nanomaterials from the environment as much as they protect the environment (and us) from them, and allow us to construct algorithms for assessing the likelihood of toxicity in a variety of natural environments.
Footnote
† Although reactivity is the only property explicitly linked to nano-hazards in the scheme shown in Fig. 2, the general strategy outlined here applies for any hazardous property.
This journal is © The Royal Society of Chemistry 2009 |