1. Introduction
Nanomaterials are widely used in various industrial processes and consumer products, e.g., in catalysis, refining, electronics, food, packaging, cosmetics, pharmaceuticals and medical devices, and a further expansion is expected in the forthcoming years [1,2,3,4]. Due to this wide distribution, the impact of nanomaterials on health and the environment becomes more and more relevant [5,6]. The European legislation has responded to this development and defined the term “nanoform” in the Annexes to the REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals) regulation [7]. Specific information on the nanomaterials is now required from companies when registering the respective materials in a dossier. The progress made in the last decade has led to the establishment of a new branch of toxicology, so-called nanotoxicology, which specializes in questions such as how and at what doses nanomaterials interact with biomolecules and the environment, how they are transformed by biological systems and over what period of time. According to Bohnsack et al., nanotoxicology has to begin with a comprehensive understanding of the physical chemistry of the nanomaterial under study [8]. For example, nanoparticles can agglomerate or aggregate, which may influence their deposition in and transport to different body organs; clearance from the body may also be altered if they are agglomerated/aggregated [9]. This process of agglomeration/aggregation is determined by the nanoparticle size and concentration: “the smaller the particles and the higher the concentration, the greater the aggregation/agglomeration rate” [10]. Despite great efforts in the nano-risk research area and a large number of publications on nanotoxicology during the last decade, nanotoxicology still faces many challenges, e.g.:
The influence of different environments, such as air, soils or sediments, freshwater or seawater, on the toxicity of nanomaterials,
Categorization and prioritization of different types of nanomaterials for ecotoxicological risk assessments [11] (grouping and read-across),
The increased complexity of nanomaterials, e.g., core–shell particles [12].
In the context of REACH, eleven physicochemical properties were considered relevant, of which the following six are essential for the registration of nanoforms (priority properties): chemical composition, crystallinity, particle size, particle shape, chemical nature of the surface (“surface chemistry”) and specific surface area (SSA) [7]. The key role of these properties stresses the importance of reliable, reproducible and traceable data. This need is boosted by the consideration of ECHA (the European Chemicals Agency) to potentially use them for grouping and read-across [13]. Obviously, all efforts in this field and, more generally, in nanoinformatics for risk assessment can only be successful if the data used are not only available and complete but also of guaranteed high quality.
The quality of physicochemical data is discussed in the literature in terms of completeness, relevance (meaningfulness and usefulness), reliability (trustworthiness), accessibility/availability and reproducibility [14]. Data completeness in nanotoxicology refers to the availability of a complete physicochemical characterization of the nanomaterial (including raw, derived and meta-data) and an extensive description of the experiments and methods. In a regulatory context, for example, information is considered complete if it allows a regulatory decision. One way to evaluate data completeness is to employ a minimum information checklist (examples of such checklists have already been reported in the literature) [15] and to estimate the compliance with such a checklist. It has to be mentioned here that data completeness may be considered highly context-dependent.
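To make the idea of estimating checklist compliance concrete, the following minimal sketch computes the fraction of required checklist entries that are present in a data record. The field names are hypothetical examples chosen for illustration, not the entries of any published checklist:

```python
# Minimal sketch of a completeness check against a minimum information
# checklist; the field names below are hypothetical, not an established
# standard such as those cited in [15].
REQUIRED_FIELDS = [
    "chemical_composition", "crystallinity", "particle_size",
    "particle_shape", "surface_chemistry", "specific_surface_area",
    "measurement_method", "sample_preparation", "instrument_calibration",
]

def completeness_score(record: dict) -> float:
    """Fraction of required checklist fields that are present and non-empty."""
    present = sum(1 for field in REQUIRED_FIELDS if record.get(field))
    return present / len(REQUIRED_FIELDS)

record = {
    "chemical_composition": "TiO2 with Al-containing coating",
    "particle_size": "21 nm (XRD)",
    "measurement_method": "SEM/EDS, XPS, ToF-SIMS",
}
print(f"Completeness: {completeness_score(record):.0%}")  # prints 33%
```

Such a score is only meaningful relative to the chosen checklist, which again underlines the context dependence noted above.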
Data relevance may be regarded as “fitness for purpose”, which indicates whether the data serve to clarify a particular question or purpose, without necessarily being useful for answering other questions. As such, only relevant data should be taken into account when assessing data completeness [15].
Reliability of the data is another criterion, meaning data trustworthiness or “credibility”. Mainly, data can be considered reliable if they comply with Good Laboratory Practice or other standardized protocols, such as the guidelines of the Organization for Economic Co-operation and Development (OECD) or the rulings of the International Organization for Standardization (ISO) [16]. An important aspect in assigning data as reliable is the evaluation of the intrinsic scientific quality of the data. This aspect is not always addressed in the literature; however, we consider that only an expert team with long-term experience can judge the scientific quality (reliability) of experimental data. The best assessments would be achieved if different expert teams (from different accredited laboratories) evaluated the scientific quality of the data.
For data availability and reproducibility, the FAIR (findability, accessibility, interoperability and reusability) guiding principles for scientific data management define minimum requirements for finding, accessing, (re)using and citing scientific data [17]. In times of generating large quantities of data (“big data”), such a discussion is necessary, particularly on how to open data to other interested parties and avoid unnecessary, time-wasting measurements.
On the other hand, the community has been discussing the “reproducibility crisis” for several years [18]. To overcome this crisis and to accelerate nanotechnology research, scientific publishers and, increasingly, project funding agencies encourage authors to publish data in suitable repositories and to respect the FAIR principles [19,20,21]. It can be hoped that such sustained efforts by the whole scientific community will contribute to solving the “reproducibility crisis” [22]. The consideration of such guidelines is not only necessary for scientific publications but even more so for data used for risk assessment. Therefore, using representative examples of nanoparticles, we demonstrate for the physicochemical properties particle size, particle shape, chemical composition and surface chemistry how data quality can be ensured such that risk assessment can be carried out in a reliable way.
As mentioned before, the first step in nanotoxicology is the complete characterization of the nanomaterial, since all further toxicological interpretation is based on these data. If an extensive and robust characterization is lacking, linking nanomaterial properties to toxicological outcomes becomes impossible. Efforts towards establishing consensual minimum levels of analysis of the physicochemical properties of nanomaterials across the entire nanomaterials community started in 2008–2009 [23]; however, full success has not yet been achieved. This paper demonstrates, on a practical analysis example, the high degree of standardization of surface analysis workflows with methods complementing each other [24].
Figure 1 depicts the links of physicochemical characterization within the framework of risk assessment.
As the main surface analytical methods to characterize nanomaterials, we chose scanning electron microscopy (SEM), energy dispersive X-ray spectroscopy (EDS), X-ray photoelectron spectroscopy (XPS) and time-of-flight secondary ion mass spectrometry (ToF-SIMS).
Electron microscopy in its various variants, i.e., scanning electron microscopy, transmission electron microscopy (TEM) or the combination of the two as scanning transmission electron microscopy (STEM and STEM-in-SEM), constitutes the “workhorse” technique for the direct visualization of nanomaterial surface morphology. Furthermore, electron microscopy is well suited for the traceable measurement of nanoobject size and shape. This valuable capability relies on the calibration of the image magnification, i.e., of the pixel size, with well-known dimensional structures as reference materials [26]. Beyond the great advantage of accurately accessing single nanoparticles, electron microscopy poses challenges mainly related to the representativity of a measurement carried out on a small fraction of the nanoobjects (mostly a few hundred). Particularly for real-world nanomaterials, which mostly have a complex morphology, a broad particle size distribution and a high degree of agglomeration/aggregation, the resulting particle size distribution can be significantly erroneous [27].
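Both points, traceability via pixel-size calibration and the statistical limits of measuring only a small number of particles, can be illustrated with the following sketch. All numerical values (certified pitch, pixel counts, diameters) are invented for illustration and do not correspond to any measurement in this study:

```python
import statistics

# Traceable pixel-size calibration with a certified reference structure:
# a reference pitch of known dimension spans a measured number of pixels.
certified_pitch_nm = 144.0   # certified value of the reference structure (invented)
measured_pitch_px = 48.0     # pitch as measured in the image (invented)
nm_per_px = certified_pitch_nm / measured_pitch_px  # 3.0 nm/px

# Particle diameters measured in pixels (invented data for illustration).
diameters_px = [6.8, 7.1, 7.5, 6.9, 7.3, 7.0, 7.2, 6.7, 7.4, 7.1]
diameters_nm = [d * nm_per_px for d in diameters_px]

mean_d = statistics.mean(diameters_nm)
sd = statistics.stdev(diameters_nm)
n = len(diameters_nm)
sem = sd / n ** 0.5  # standard error of the mean shrinks only with sqrt(n)

print(f"pixel size: {nm_per_px:.2f} nm/px")
print(f"mean diameter: {mean_d:.1f} nm +/- {sem:.1f} nm (n = {n})")
```

The sqrt(n) scaling shows why a few hundred particles may still undersample the tails of a broad, agglomerated distribution, which is precisely the representativity problem described above.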
EDS is a broadly used method, conventionally applied in combination with an SEM, mostly for the quick analysis of the elemental composition of solid samples with a spatial resolution in the micrometer range. Technological developments in recent years have enabled the reduction of the spatial resolution down to a few nanometers [28,29]. Whilst a quick identification of the main elements in the bulk of the nanoobjects is possible by EDS, a quantitative evaluation of the chemical composition of single nanoobjects is not. The relatively quick qualitative information on the presence of elements in the “bulk” of the nanoobjects is very valuable, particularly in combination with the analysis of the morphology of the same nanoobjects by electron microscopy or in combination with other surface chemical analysis techniques such as XPS, ToF-SIMS or Auger electron spectroscopy, with the last technique not included in this study [29,30].
XPS is the most popular surface analysis method and provides information about the chemical composition and the chemical nature of the compounds in the near-surface region [31]. Here, the sample is irradiated with X-rays, usually with an energy of 1253.6 eV (Mg Kα) or 1486.6 eV (Al Kα), and the ejected photo- and Auger electrons are analyzed in an energy spectrometer. The information depth of ca. 10 nm makes this method highly suitable for the chemical analysis of nanoparticles.
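The value of ca. 10 nm follows from the common rule of thumb that about 95% of the detected intensity originates from within three times the inelastic mean free path λ of the photoelectrons (an estimate; λ depends on the material and on the electron kinetic energy):

\[ d_{95\%} \approx 3\,\lambda\cos\theta \]

where θ is the emission angle with respect to the surface normal. For typical oxides and Al Kα excitation, λ is on the order of 2–3 nm, giving d ≈ 6–9 nm at θ = 0°, consistent with the value quoted above.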
ToF-SIMS is an even more surface-sensitive technique than XPS. In ToF-SIMS, the surface is bombarded with charged species (primary ions), resulting in a collision cascade within the near-surface layers. Part of the energy transported in the collision cascade is directed back towards the surface, so that atoms and molecules from the uppermost monolayer can overcome the surface binding energy and leave the surface. About 1% of the sputtered material is charged (secondary ions), and by mass analysis of this ion flux the chemical composition of the uppermost 1–3 nm can be derived. Currently, mainly cluster ion beams (e.g., Bi3+) are used for surface excitation, and ToF mass analyzers allow the simultaneous detection of elements and intact molecules (<10,000 u) with detection limits in the ppb/fmol range. Semiquantitative information can be gained if the chemical composition (matrix) of the materials to be compared is similar. The key properties are summarized in Table 1.
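The restriction to similar matrices can be made explicit with the standard SIMS intensity relation (textbook formalism, quoted here as background rather than as part of the measurement protocol):

\[ I_{s}^{m} = I_{p}\,Y_{m}\,\alpha^{\pm}\,\theta_{m}\,\eta \]

where I_p is the primary ion current, Y_m the sputter yield of species m, α± its ionization probability for positive or negative secondary ions, θ_m its fractional surface concentration and η the instrument transmission. Because α± can vary by orders of magnitude with the chemical environment (the matrix effect), intensities are only comparable between samples of similar matrix.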
As in other areas of science, the “reproducibility crisis” has been discussed in the surface science community, of which XPS and ToF-SIMS are part [32]. Increasing multidisciplinarity, complexity and high competition between researchers, together with limited resources and often wishful thinking, were identified as reasons for this crisis. The first two reasons are certainly valid for the risk assessment of nanomaterials: this subject is a paradigm of a highly multidisciplinary and complex research area, which makes it so challenging. In response to these observations, the American Vacuum Society (AVS) has begun to develop practical guides for different methods together with renowned scientists. The first guide, for XPS, has been published recently [33]. It is highly recommended to consult this guide when working with XPS. In this context, it must be noted that for the methods presented in this paper there are active standardization bodies, e.g., ISO and CEN (European Committee for Standardization) [34]. Therefore, up-to-date and application-oriented standards are available, and they should be considered in daily work. The consideration of such guidelines should be a crucial issue for data used for risk assessment.
In this contribution, we present how the quality of physicochemical data of nanomaterials can be ensured in order to use the data for risk assessment. As materials, we chose Al-coated TiO2 nanoparticles because of the prevalence of titania nanoparticles in consumer and other common products such as sunscreens, paints and catalysts. These materials were provided by the Joint Research Centre (JRC) of the European Commission, Ispra (Italy). The focus of the present study is to describe the challenges that accompany the surface-analytical characterization of core–shell particles.
2. Materials and Methods
Basically, the physicochemical characterization of nanomaterials with respect to their morphology and chemistry follows the same steps independent of the type of material: (i) sample preparation, (ii) measurement, (iii) data analysis and (iv) reporting/archiving. The minimum information requirements with respect to the morphological–chemical characterization of titanium dioxide nanoparticles are described in detail for all the analytical methods selected here: electron microscopy for size and shape, EDS for the qualitative bulk elemental composition, XPS and ToF-SIMS for the surface chemistry. Concretely, the following two materials were selected from the JRC repository, see Table 2 [35].
According to the information in the JRC repository [35], both selected materials are rutile with a primary particle size of 21 nm (from XRD measurements) and an Al coating; one material is hydrophobic (JRCNM62001a) and the other (JRCNM62002a) is hydrophilic. The task is then to apply the surface analytical methods selected here and derive the characterization results.
In the present work, an SEM of type Supra 40 (ZEISS, Oberkochen, Germany) with a Schottky field emitter and an InLens secondary electron detector was used at a beam acceleration voltage of 5 kV.
The EDS analysis in the present study was performed with a QUANTAX 400 EDS system (BRUKER, Berlin, Germany) equipped with an SDD (silicon drift detector) of 10 mm² nominal area. An excitation of 10 keV was applied for the analysis of the titania samples, which were prepared as a thick, dry powder layer on an aluminum stub so that the substrate could not be co-excited. Analysis areas as large as 5 × 5 µm² were selected on sample agglomerates of about 10 µm in size.
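The need for a thick powder layer can be rationalized with the Kanaya–Okayama rule-of-thumb estimate of the electron range. The sketch below evaluates it for TiO2 using approximate mean atomic values for the compound (an approximation, not part of the original protocol):

```python
# Kanaya-Okayama electron range estimate (rule of thumb):
#   R [um] = 0.0276 * A * E^1.67 / (Z^0.889 * rho)
# with E in keV, A in g/mol, rho in g/cm^3. For the compound TiO2 we use
# approximate per-atom mean values, which is an approximation.
def kanaya_okayama_range_um(A: float, Z: float, rho: float, E_keV: float) -> float:
    return 0.0276 * A * E_keV ** 1.67 / (Z ** 0.889 * rho)

# TiO2 (rutile): mean A ~ 26.6 g/mol, mean Z ~ 12.7, rho ~ 4.23 g/cm^3
print(f"{kanaya_okayama_range_um(26.6, 12.7, 4.23, 10.0):.2f} um")  # ~0.85 um
```

An interaction volume of roughly 1 µm in depth thus stays well inside a ~10 µm agglomerate, so the aluminum stub does not contribute to the spectra.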
XPS measurements were performed on an Axis Ultra DLD (KRATOS, Manchester, UK) with monochromatic Al Kα radiation (E = 1486.6 eV). The electron emission angle was 0° and the source-to-analyzer angle was 60°. The binding energy scale of the instrument was calibrated following a Kratos analytical procedure, which uses ISO 15472 binding energy data [36]. The instrument was set to the hybrid lens mode and the slot mode with an analysis area of approximately 300 × 700 µm². Furthermore, charge neutralization with a flood gun was used. All spectra were recorded in the fixed analyzer transmission (FAT) mode. The samples were measured as powders prepared on a special stainless-steel sample holder.
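Although not part of the measurement protocol itself, the subsequent quantification of such spectra conventionally converts peak areas into atomic concentrations via relative sensitivity factors. The peak areas and sensitivity factors in the following sketch are invented placeholders; real values depend on the instrument and transitions used:

```python
# Standard XPS quantification with relative sensitivity factors (RSF):
#   x_i = (I_i / S_i) / sum_j (I_j / S_j)
# Peak areas and RSFs below are invented placeholders for illustration.
peaks = {
    "Ti 2p": {"area": 15000.0, "rsf": 2.00},
    "O 1s":  {"area": 22000.0, "rsf": 0.78},
    "Al 2p": {"area": 1200.0,  "rsf": 0.23},
}

normalized = {name: p["area"] / p["rsf"] for name, p in peaks.items()}
total = sum(normalized.values())
for name, value in normalized.items():
    print(f"{name}: {100 * value / total:.1f} at.%")
```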
ToF-SIMS analysis was performed on a TOF.SIMS5 instrument (IONTOF, Münster, Germany) in the spectrometry mode, with Bi3+ primary ions at an acceleration voltage of 25 keV. The primary ion flux was kept below the static limit (10¹² ions/cm²), and the secondary ions were analyzed in both the positive and the negative mode. The samples were prepared by fixing the TiO2 powder onto a double-sided adhesive substrate (3M removable repositionable tape 665), which was then mounted on the sample holder and introduced into the instrument. Special care was taken to avoid the release of sample material grains into the equipment. This particular adhesive was chosen because it gives significantly lower siloxane signals, which result from the coating on the liner.
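Compliance with the static limit can be verified from the primary ion current, the analysis time and the raster area. The numerical values in the following sketch are illustrative, not the actual parameters of this study:

```python
# Check of the primary ion dose against the static limit (~1e12 ions/cm^2).
# Current, time and raster area below are illustrative values only.
E_CHARGE = 1.602e-19  # elementary charge in C

def ion_dose_per_cm2(current_A: float, time_s: float, area_cm2: float) -> float:
    """Primary ion dose: number of ions delivered per unit area."""
    return current_A * time_s / (E_CHARGE * area_cm2)

# e.g., 0.1 pA pulsed Bi3+ current, 100 s acquisition, 100 x 100 um^2 raster
dose = ion_dose_per_cm2(0.1e-12, 100.0, (100e-4) ** 2)
print(f"dose: {dose:.2e} ions/cm^2 -> static: {dose < 1e12}")  # ~6.2e11, True
```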