1. Introduction
The Chernobyl accident contaminated most of Europe with fission-product radioisotopes, including short-lived, high-activity Iodine and Tellurium, and also fuel particles containing uranium and other intermediate half-life isotopes, including the 30-year half-life Caesium-137 [1]. In the UK, whole-body monitoring showed the persistence of Caesium-137 in the population [2], and grassland surveys enabled the radiological modeling of equivalent dose. In general, the exposures in Europe were examined in some detail and doses to the population were well characterized [1]. For all of the countries of Europe except Belarus, the first-year average committed effective doses were below 1 mSv, ranging from 0.02 mSv for the whole of the UK through 0.07 mSv for the whole of Germany to 0.2 mSv for Greece; for Belarus the figure was 2 mSv. At these levels, the risk model of the International Commission on Radiological Protection (ICRP) predicts no measurable health effects. The absorbed doses were less than a quarter of the mean natural background dose, and if dose has any universal radiological meaning, the exposures must be considered safe. Nevertheless, there were reported increases in infant leukemia in the in utero exposed cohort in Scotland [3], Belarus [4], Greece [5], Germany [6] and in Wales and Scotland combined [7,8].
Busby and Scott Cato [7,8] examined the likely absorbed doses to the children and applied the current radiation risk models of the ICRP, those also employed by all radiological protection legislation, to show that the risk factors currently being used for the protection of members of the public were in error by upwards of 100-fold. Such an error might begin to illuminate other apparently inexplicable associations between childhood leukemia and exposure near nuclear sites, notably the ongoing child leukemia cluster near the UK Sellafield reprocessing plant in Cumbria [9] and the results of the recent KiKK study in Germany [10]. Infant leukemia is believed to be a consequence of a gene mutation in utero [5]. The importance of the infant leukemia results is that the in utero doses were well characterized and that, since the cohort is so well described, there is really no other explanation for the finding apart from exposure to ionizing radiation. Thus the existence of the effect may be taken as prima facie evidence of the failure of the ICRP model and may be used to determine accurate risk factors for this kind of internal exposure.
The seriousness of this question led in the UK to the formation of the Committee Examining Radiation Risk from Internal Emitters (CERRIE), whose remit was to examine the assertion that, for internal exposures from fission-product radioisotopes, the true risk factors for cancer and leukemia were much greater than those currently employed in radiation protection legislation. It was argued that the ICRP model is largely based on studies of historical external radiation exposures, principally that of the Japanese A-bomb survivors, and may not be safe for assessing internal chronic exposures. This question was addressed in 2003 by the new European Committee on Radiation Risk (ECRR) [11] and also in 2006 by the French IRSN [12]. The application of the ICRP model (which is based on adult exposures) to fetal exposures has also been questioned recently [13,14].
As part of its remit to examine the issue, CERRIE applied to the Oxford-based Childhood Cancer Research Group (CCRG) in order to follow up the 2000 Busby and Cato analysis [7,8] by examining the UK by contamination area and period. Data limitations had forced Busby and Scott Cato to employ slightly different periods from those used by Petridou et al. [5] and Kaletsch et al. [6], and CERRIE decided to obtain data for the same periods. The first question was whether there was an effect in the high and intermediate exposure areas of the UK when the time periods used by Petridou et al. [5] were used to define the exposure cohorts. Exposure in the UK depended upon rainfall at the time, and areas were agreed on the basis of measurements made by the UK National Radiological Protection Board (NRPB) and supplied to CERRIE. Results of the CERRIE analysis were difficult to interpret since the committee failed to agree on the significance of the data, and there were two reports. The main report presented a statistically significant excess risk in Greece and Germany and a non-statistically significant excess risk in the UK and in Belarus, but was disinclined to conclude that the effect was real [15]. A minority of the committee argued that the effect had occurred in other countries as well as the UK and should therefore be taken as evidence raising questions over the adequacy of the ICRP risk model for radiation safety [16].
4. Discussion
In the UK data, supplied by CCRG and based upon the 1996 Petridou et al. [5] birth cohort criteria, there was an increase in infant leukemia in the exposed cohort, both in the high and intermediate exposure groups combined and in the total population. Unlike the increases in Scotland and Wales [7,8], the UK increase fell short of statistical significance at the p = 0.05 level using a two-tailed test, though it would have been statistically significant using a directional test (which is justified since the prior hypothesis is directional: no one would argue that exposure to radiation would have reduced the risk of infant leukemia). This result (Table 4) differs from the earlier finding of Busby and Cato for Wales and Scotland [7,8], which found a statistically significant excess risk (RR = 3.9; p = 0.0002), because CERRIE employed different areas and also a slightly different period. Most of the UK was unexposed, and so the exposed population was diluted with unexposed individuals, reducing the relative risk and therefore also the statistical significance.
Combining the UK increases with those in Greece and Germany (where the doses were greater) gave a 43% increase in infant leukemia in the combined cohort of 2.2 million births, in children exposed to a mean population-weighted dose of 0.067 mSv. The mean dose was obtained by population-weighting the fetal doses determined for each country: the UK doses were supplied by the NRPB to the CERRIE committee, the German doses were obtained from the German study [6], in which they were measured by German radiological protection personnel, and the Greek doses were taken from UN data [1]. It should be emphasized that the internal dose here is unknown: the dose calculations are based mainly upon external dose, chiefly gamma shine from Caesium-137 deposition. However, it is just this (mainly external) dose that is employed in radiological modeling of health effects, and so for the purpose of what follows this is the dose that is relevant.
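For clarity, the population weighting referred to above is simply the birth-weighted average of the national fetal doses. Writing D_i for the fetal dose and N_i for the number of births in country i (generic symbols, not the notation of any of the cited reports),
\[
\bar{D} = \frac{\sum_i N_i D_i}{\sum_i N_i},
\]
which, applied to the UK, German and Greek cohorts described above, yields the quoted mean of 0.067 mSv.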
In calculating the dissonance between the predictions of the ICRP model and the observed number of cases found in Scotland and Wales, Busby and Cato [7,8] used the ICRP risk factor of 0.0125 per sievert (employed by the UK government COMARE committee in 1996 to examine the Sellafield child leukemias) [17]. However, in discussions within CERRIE it was pointed out that the obstetric data of Stewart et al. [18] were a firmer basis on which to compare the risks from internal fetal exposure with those from external exposure. Stewart et al. found a 40% increase in childhood cancer at ages 0–14 after an X-ray dose of 10 mSv [19].
If we assume that a 10 mSv X-ray dose causes a 40% increase in childhood cancer, it is clear from Table 5 that a mean dose of 0.067 mSv from Chernobyl fallout has caused a mean increase in infant leukemia of 43%. The corresponding error in the application of the obstetric external risk factor to the infant leukemias is thus 43/40 × 10/0.067 ≈ 160; there were therefore 160 times more infant leukemias in this combined population than would be predicted by the use of the obstetric X-ray data. And this is only in children aged 0–1, so it is a minimum value, as we have yet to see what other cancers or leukemias emerge in this group as they age between 1 and 14 years. If the ICRP cancer risk coefficient is employed, as it was in the COMARE analysis of the Sellafield child leukemias, the difference between the observed and predicted numbers of infant leukemias would be far greater, in excess of 1000-fold.
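Written out in full, the arithmetic behind the 160-fold figure is the ratio of the excess risk per unit dose observed here to that of the obstetric X-ray series:
\[
\frac{43\%\,/\,0.067\ \mathrm{mSv}}{40\%\,/\,10\ \mathrm{mSv}} = \frac{43}{40} \times \frac{10}{0.067} \approx 1.08 \times 149 \approx 160 .
\]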
Because the number of exposed children is so large, it can be safely concluded that there was a real increase in infant leukemia in those who were exposed in utero to the fallout from Chernobyl, although we cannot say for certain that the effect was not due to parental pre-conception irradiation, since our exposed groups (defined by Petridou et al.) were born up to the end of 1987.
A number of researchers have dismissed radiation exposure as the cause of the increases in infant leukemia following the Chernobyl fallout, on the basis that the dose-response relationship does not increase monotonically, e.g., [6,8]. This argument needs to be addressed.
In the data available from the several countries there was also a biological gradient in the rates over a certain range. Figure 1 shows the increases in infant leukemia with dose in the European countries which have been studied. The German study presented results for three dose areas and showed that the dose response was biphasic, i.e., the greatest effect was not at the highest reported dose level. This was also true for the UK data when subdivided into the high, intermediate and low dose areas; in both countries the highest effect was in the intermediate dose area. Infant leukemia increases were also reported in Belarus [4], and the effect there was quite modest although the doses were higher than in Greece. The data suggest that over the range 0–2 mSv the overall dose response is biphasic (Figure 1).
This biphasic behaviour is not remarkable for an in utero cause with endpoints measured in the living child, since above a certain dose some defense system may become overwhelmed and fetal death may intervene. Increasing the dose of any fetal poison will generally result in fetal damage and ultimately in death of the fetus; therefore the highest doses will not necessarily produce the greatest effect if the outcome is measured after birth. Alternatively, biphasic radiation dose-response relationships have been reported in the literature by Burlakova, who believes they represent a consequence of induced repair efficiency and the overwhelming of defense responses [20]. In addition, dismissal of causality because of the absence of a monotonic increase in effect with external dose may be insecure, since it is not clear that the dose levels reported correlate with internal exposures of the specific type that cause the illnesses: agricultural produce from high exposure areas may end up anywhere in the country, or even in another country. In the main, the exposures used for these studies are based upon external radiation measurements or ground deposition of Caesium-137. If the exposures were to milk from cattle fed in the winter of 1986/87 with grass contaminated with radionuclides, this milk might end up anywhere in the country, not necessarily where the main deposition was; indeed, dairy cattle are unlikely to be feeding in areas where the rainfall is high, e.g., mountains. In support of this conclusion, it is clear from the whole-body monitoring results in the South of England, where Cs-137 precipitation was almost absent, that winter cattle feed was contaminated with radionuclides and that the radiation in the food travelled south from the affected areas. There was a clear second peak in Cs-137 in the Spring of 1987, when the produce from winter-fed cattle appeared in the food supply [5,6,10].
Given the extremely low mean dose involved in the combined exposure area of the UK, Greece and Germany (<70 μSv), the increase in infant leukemia was not predicted by the ICRP model. This defines an error of at minimum 160-fold in the use of a risk coefficient derived from the obstetric X-ray data, and an even greater error in the predictive radiation risk model of the ICRP. The ICRP model has been criticized for lack of scientific method and for failures to predict or explain a number of observations in children [11–13,16]. In particular, it has been argued that the use of acute external irradiation data to inform the model for health risks from internal chronic irradiation involved a misuse of scientific method, employing deductive rather than inductive reasoning [9,12,13]. If these criticisms are valid, then clearly it is not possible to employ risk factors drawn from the Japanese A-bomb external high-dose acute exposure series to inform risk estimates for chronic low-dose internal irradiation. By the same argument, it is not valid to employ the risk factors obtained from the external obstetric X-ray data to inform risk models for internal irradiation. It is necessary to employ studies of children exposed to internal chronic radiation from fission-product isotopes if we wish to develop models to predict or explain these same exposures.
The nuclear site child leukemia clusters, e.g., Sellafield, Dounreay and La Hague, and others listed by ECRR2003 [11], have been extensively studied and confirmed as being real and not due to chance. Recently, a very large German government-funded study also revealed a significant excess leukemia risk in children living within 5 km of nuclear sites between 1988 and 2005 [10]. These children will have been exposed to fission-product and uranium releases from the sites, i.e., to internal exposures. At all of these nuclear sites the difference between the yield of childhood leukemia predicted by the ICRP model and the observed numbers is in excess of 300-fold. The infant leukemias reported here for the European cohorts resulted from doses lower than those experienced by the nuclear site children, yet for these infants there is no alternative explanation apart from internal exposure to largely the same fission-product isotopes. Further research on infant leukemia in this cohort in other countries of Europe might usefully be pursued.