1. Introduction
The peach industry in the southeastern United States (SEUS) has been a part of the regional iconography since at least the mid-1920s, and was historically an important part of the agricultural economy [
1]. While California’s current peach production dwarfs that of Georgia and South Carolina [
2], the industry in the SEUS continues to contribute millions to regional, state, and local economies [
3], and peaches remain important to regional identity [
1]. In 2017, approximately 80% of Georgia’s peach crop and 90% of South Carolina’s peach crop were damaged due to warm winter temperatures. The warm conditions resulted in insufficient winter chill accumulation in some areas, while other parts of the SEUS were impacted when an early bloom, driven by unseasonably warm temperatures, was followed by a mid-March freeze. In Georgia, an estimated 70% of the total 2017 peach losses were attributed to inadequate chill, with a further 10–15% of the losses attributed to the spring freeze [
4]. The combined impacts of anomalously low chill accumulation and spring freeze yielded substantial economic damage across the region [
5]. Given the role of the peach industry in both the economy and culture of the SEUS, the 2017 crop failure garnered much public interest, including questions of whether such warm winters and their impacts on perennial agriculture may become more commonplace in the coming decades.
Like other fruit trees, peaches undergo a series of physiological changes during the fall that allow for the onset of dormancy, when growth and development are slowed or stopped and the plant is better able to tolerate cold temperatures. Many perennial crops must be exposed to a certain amount of cold temperatures, or chill, during this period of dormancy to continue their development in the spring [
6]. Peach cultivation is governed by a number of climatic factors such as cold hardiness, frost tolerance, and sufficient heat accumulation. Peach cultivars are frequently selected based on climatological chill accumulation [
7] as insufficient chill accumulation can reduce flower quality, inhibit pollination and fruit development, and lower fruit quality and yield [
6,
8,
9], with subsequent economic impacts to both growers and consumers [
10].
Observational studies have shown warming in both the mean and extreme cold winter temperatures over the past half century across the US [
11,
12,
13], much of which is consistent with anthropogenic forcing [
14] and is expected to continue under climate change [
15,
16]. Exceptions to these observed warming trends are primarily found in the warming hole across parts of the SEUS, where winter temperatures cooled and spring onset trended later over the latter half of the 20th century [
17,
18]. The warming hole is likely a consequence of internal variability of the climate system that has buffered the influence of anthropogenic forcing to date, but is not expected to persist into the coming decades [
17]. While it is acknowledged that chill accumulation is only one of many thermal metrics that might directly impact crop suitability in a changing climate [19], chill accumulation has declined in some regions [20] and is projected to decline further [
21]. Likewise, increases in winter temperatures are projected to reduce chill accumulation below the thresholds needed for peach cultivars in many peach-growing portions of the US [
20,
22].
In view of recent crop impacts due to warm winters, we examine chill accumulation across the SEUS in the context of ongoing climate change, with a focus on implications for peach cultivation. First, a first-order estimate is provided of the contribution of anthropogenic climate change to observed low-chill winters in the SEUS and to years with insufficient chill in prime peach-growing areas of Georgia and South Carolina during 1981–2017. Second, using a suite of downscaled climate projections, we investigate changes in chill accumulation, the frequency of low-chill winters, and the risk of winters with insufficient chill for common peach cultivars in the coming decades. Collectively, this study presents methodologies that may be applied to agro-climate metrics for conducting climate change risk and impact analyses for perennial crop systems globally, and provides a risk assessment of insufficient chill for peaches (and of general chill accumulation for other perennials) in the SEUS, presenting information useful for climate-informed decision making.
2. Materials and Methods
Two primary datasets were used in this study (available at
https://data.nkn.uidaho.edu/). First, the observed daily maximum and minimum temperature (Tmax, Tmin) at a ~4-km spatial resolution for the period 1981–2017 for the SEUS [25°–35.2° N, 78.5°–88.5° W (see
Figure 1a)] were acquired from the gridded surface meteorological dataset (gridMET) of [
23]. Previous validation of gridMET showed high correlation and low bias of temperature when compared to meteorological station observations across the US [
23], and comparisons in chill accumulation between gridMET and data from 50 SEUS meteorological stations from 1980–2017 showed strong spatial correlation (
r = 0.99), with a mean absolute error of 50 chill hours and a median bias of -17 chill hours (analysis not shown). Second, the projections of daily Tmax and Tmin from 20 global climate models (GCMs) that participated in the fifth phase of the Coupled Model Intercomparison Project (CMIP5) were statistically downscaled using the multivariate adaptive constructed analogs (MACA) method [
24]. MACA uses gridMET as training data, thereby ensuring compatibility in contemporary climate statistics between the downscaled GCM experiments and the gridded observations. The analysis of climate projections was constrained to simulations for the early (2010–2039) and mid- (2040–2069) 21st century periods, given the limited utility of end-of-century projections for developing meaningful management strategies. Further, we focused on future experiments run under the Representative Concentration Pathway 4.5 (RCP 4.5) to provide a conservative estimate of projected changes in chill accumulation. Projections using RCP 8.5 would likely show qualitatively similar changes, but with larger magnitudes, particularly for the mid-21st century, when multi-model mean changes in winter mean temperatures show an additional 0.6 °C of warming above RCP 4.5; however, the variability among models exceeds the difference between RCP 4.5 and RCP 8.5 for the time horizons highlighted herein.
A first-order estimate of the influence of anthropogenic climate change on observed 1981–2017 chill accumulation is provided using a large ensemble of CMIP5 simulations and a pattern scaling approach that allows for comparisons between rates of local and global change [25]. The differences in monthly Tmax and Tmin, as simulated by 23 different GCMs at their native spatial resolution, were taken between two 30-year periods, 1850–1879 and 2070–2099. The pattern scaling approach allows modeled rates of regional change for an individual variable and month to be expressed relative to modeled rates of change in the global mean annual temperature. This approach assumes a linear relationship between the variables, which is reasonable for climate change timescales [
25]. The pattern scaling was calculated separately for each model, as well as for the 23-model median. For each model, the anthropogenic climate change signal was defined for monthly Tmax and Tmin by multiplying the monthly varying pattern scaling function by an 11-year moving average of the change in the modeled global mean annual temperature relative to each model’s 1850–1879 baseline. It is acknowledged that this is one of several first-order approaches for approximating the modeled influence of anthropogenic climate change over the historical record [
26,
27].
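As a minimal sketch of this pattern-scaling step, the following Python code computes the monthly signal for a single GCM; the array names, shapes, and helper function are illustrative assumptions rather than code from the original analysis, and the same calculation would be repeated for Tmax and Tmin and for each of the 23 models and their median.

```python
import numpy as np

def pattern_scaled_signal(tmon_base, tmon_future, tglob, years,
                          base=(1850, 1879), future=(2070, 2099)):
    """Monthly anthropogenic temperature signal for one GCM via pattern scaling.

    tmon_base, tmon_future : (12, nlat, nlon) monthly-mean Tmax or Tmin
        averaged over the 1850-1879 and 2070-2099 periods, respectively.
    tglob : (nyears,) modeled global mean annual temperature.
    years : (nyears,) calendar years corresponding to tglob.
    Returns an (nyears, 12, nlat, nlon) array of monthly temperature changes
    relative to the 1850-1879 baseline.
    """
    in_base = (years >= base[0]) & (years <= base[1])
    in_future = (years >= future[0]) & (years <= future[1])

    # Local monthly change per degree of global mean annual warming.
    dT_global = tglob[in_future].mean() - tglob[in_base].mean()
    pattern = (tmon_future - tmon_base) / dT_global          # (12, nlat, nlon)

    # 11-year moving average of global warming relative to the baseline.
    global_anom = tglob - tglob[in_base].mean()
    smoothed = np.convolve(global_anom, np.ones(11) / 11.0, mode="same")

    # Signal = pattern x smoothed global warming, for every year and month.
    return smoothed[:, None, None, None] * pattern[None, :, :, :]
```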
Following [
26], a time series of daily Tmax and Tmin for 1981–2017 for the SEUS was created that preserves the observed interannual climate variability, but removes the influence of modeled anthropogenic climate change by subtracting the pattern-scaled estimate of the modeled monthly temperature change (relative to the 1850–1879 baseline) from the observed temperatures. These counterfactual scenarios do not attempt to discern the sources of change in the observed data. Rather, they provide an approach for estimating the proximal effects of modeled anthropogenic climate change in the context of real-world observations.
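A sketch of how such a counterfactual series might be assembled is shown below, assuming the pattern-scaled monthly signal from the previous step has already been regridded to the gridMET grid; all variable names are illustrative.

```python
import numpy as np

def counterfactual_series(t_obs, obs_years, obs_months, signal, signal_years):
    """Observed daily Tmax or Tmin with the modeled anthropogenic signal removed.

    t_obs : (ndays, nlat, nlon) observed daily temperatures on the gridMET grid.
    obs_years, obs_months : (ndays,) year and month of each daily field.
    signal : (nyears, 12, nlat, nlon) pattern-scaled monthly change relative to
        1850-1879, regridded to the observation grid.
    signal_years : (nyears,) years corresponding to the first axis of signal.
    """
    t_cf = t_obs.copy()
    year_index = {int(y): i for i, y in enumerate(signal_years)}
    for d in range(t_obs.shape[0]):
        # Subtract that year's and month's modeled climate-change signal.
        t_cf[d] -= signal[year_index[int(obs_years[d])], obs_months[d] - 1]
    return t_cf
```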
The peach location data were obtained from the 2016 United States Department of Agriculture—National Agricultural Statistics Service Cropland Data Layer (CDL, available at
https://www.nass.usda.gov/Research_and_Science/Cropland/Release/index.php) for the SEUS states of Alabama, Georgia, South Carolina, and Florida [
28]. Approximately 94 km² were classified as peach in the 2016 Southern CDL, with nearly all of the orchards located in Georgia (~34.5 km²) and South Carolina (~58 km²) (
Figure 1a). The 30-m resolution CDL data were aggregated to the common 4-km resolution of the climate data for analyzing chill accumulation over peach-growing locations, by summing the number of 30-m peach cells within each 4-km grid cell. Peach-growing locations were classified as those 4-km grid cells with >0.01% peach density. Finally, to provide locally-relevant results in addition to the regional analysis, our peach cultivar-specific analysis focused on peach locations within a 4-county area of central Georgia and a 3-county area in the Piedmont region of South Carolina that are responsible for ~75% and ~50% of each state’s peach production, respectively.
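The aggregation from the 30-m CDL to the 4-km climate grid can be sketched as follows, assuming the CDL raster has already been read into a NumPy array; the CDL class code and the block size used here are assumptions for illustration only.

```python
import numpy as np

PEACH_CLASS = 67       # CDL class code for peaches (assumed for illustration)
CELLS_PER_SIDE = 133   # approx. number of 30-m cells spanning a 4-km grid cell

def peach_density_4km(cdl_30m):
    """Fraction of 30-m CDL cells classified as peach within each ~4-km block."""
    ny, nx = cdl_30m.shape
    ny4, nx4 = ny // CELLS_PER_SIDE, nx // CELLS_PER_SIDE
    peach = (cdl_30m[:ny4 * CELLS_PER_SIDE, :nx4 * CELLS_PER_SIDE] == PEACH_CLASS)
    # Sum 30-m peach cells within each 4-km block, then convert to a fraction.
    counts = peach.reshape(ny4, CELLS_PER_SIDE, nx4, CELLS_PER_SIDE).sum(axis=(1, 3))
    return counts / float(CELLS_PER_SIDE ** 2)

# Grid cells with >0.01% peach density are treated as peach-growing locations:
# peach_mask = peach_density_4km(cdl_array) > 0.0001
```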
Estimates of chill accumulation derived from chilling models are used for selecting appropriate crop species and cultivars, and for tracking plant phenology for farm management practices [
29,
30]. While there are multiple modeling approaches for calculating chill, the Weinberger Chilling Hours Model [
31] was utilized as chill requirements for SEUS peaches are most commonly reported in chilling hours. The chill thresholds for peach cultivars examined in this study were quantified using the Weinberger model in central Georgia. Further, this model is commonly used to track winter chill accumulation across the SEUS as part of the online tools available through regional university consortiums and university extension programs (e.g.,
http://agroclimate.org/;
http://weather.uga.edu/), and as such, using this chill model allows for the most direct translation of this work to end users.
The Chilling Hours Model sums the number of hours per day with temperatures <7.2 °C; hourly data were temporally disaggregated from daily Tmax and Tmin using a modified sine curve model [
32]. Annual chill accumulation was considered from 1 October to 15 February, as is standard in the SEUS peach industry [
33]. Peach chill requirements were obtained from the University of Georgia [
34] for three cultivars grown in the SEUS. Gulfprince and Juneprince peaches require 400 and 650 chill hours, respectively, and are hereafter referred to as low- and moderate-chill cultivars. The Elberta peach cultivar (hereafter referred to as high-chill) requires 850 chill hours and is a cultivar standard to which the phenology of other peach cultivars is compared [
34]. It is noted that not all of these cultivars are grown across all peach-growing locations of Georgia and South Carolina. Gulfprince is a cultivar grown primarily in southern Georgia, while central Georgia principally grows peaches with chill requirements ≥600 chill hours (Dario Chavez, University of Georgia Extension Specialist, personal communication). However, these three cultivars have been included as representative of the range of chill requirements across SEUS-grown peaches. By including the low-chill cultivar in our analyses of selected South Carolina and Georgia peach-growing counties, we show the capacity for these counties to continue producing peaches under future climate conditions should future chill accumulation limit the productivity of the currently-grown moderate- and high-chill cultivars.
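A simplified sketch of the chill-hour calculation is given below; note that it substitutes a basic sine-curve disaggregation for the modified sine curve model of [32], so it illustrates the approach rather than reproducing it exactly.

```python
import numpy as np

CHILL_THRESHOLD_C = 7.2  # Chilling Hours Model temperature threshold

def hourly_temperatures(tmin, tmax):
    """Simple sine-curve disaggregation of daily Tmin/Tmax into 24 hourly values.

    A simplified stand-in for the modified sine curve model cited in the text;
    it places the minimum near 03:00 and the maximum near 15:00 local time.
    """
    hours = np.arange(24)
    mean = 0.5 * (tmax + tmin)
    amplitude = 0.5 * (tmax - tmin)
    return mean + amplitude * np.sin(2.0 * np.pi * (hours - 9) / 24.0)

def chilling_hours(tmin_daily, tmax_daily):
    """Accumulated hours below 7.2 degC over a 1 October-15 February window.

    tmin_daily, tmax_daily : 1-D arrays of daily Tmin/Tmax (degC) already
    subset to the chill accumulation season.
    """
    total = 0
    for tn, tx in zip(tmin_daily, tmax_daily):
        total += int(np.sum(hourly_temperatures(tn, tx) < CHILL_THRESHOLD_C))
    return total
```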
Chill accumulation was calculated over the 1981–2017 period with the observational data, and for the counterfactual scenarios using the 1981–2017 observations after removing the influence of anthropogenic climate change. The 1981–2017 data were further used to quantify changes in the frequency of low-chill winters, defined as those in the bottom decile (10th percentile). This both provides additional context for the peach-focused analysis herein and may be of broader interest to the SEUS fruit and nut industry, which relies on understanding exposure to low-chill winters as it pertains to the economics of orchard operations [
29]. The observed and counterfactual scenarios for 1981–2017 were used to quantify the degree to which modeled climate change influenced the average chill accumulation, the probability of experiencing a low-chill winter, and the risk of insufficient chill accumulation for the three peach cultivars across the key Georgia and South Carolina peach-growing regions. Chill accumulation was also calculated for the 2010–2069 period for each of the 20 downscaled climate datasets. A similar set of tests was applied to the projections, including changes in average chill accumulation and the probability of experiencing a low-chill winter across the SEUS. Finally, the probability of insufficient chill was estimated under early and mid-21st century conditions for the key peach cultivars and regions in order to highlight the potential risk to peach cultivation. Given our focus on changes to chill accumulation with respect to perennial fruit cultivation, areas with <100 chill hours over the 1981–2017 observed period were masked out.
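A minimal sketch of these frequency calculations, applicable to the observed, counterfactual, or projected chill series, is given below; the function and variable names are illustrative, and the cultivar thresholds are those reported in the text.

```python
import numpy as np

# Cultivar chill requirements (chilling hours) as reported in the text.
CULTIVAR_REQUIREMENTS = {"low": 400, "moderate": 650, "high": 850}

def low_chill_threshold(chill_obs):
    """Bottom-decile (10th percentile) threshold from the observed 1981-2017 series."""
    return np.percentile(chill_obs, 10)

def chill_risk(chill_series, threshold, requirements=CULTIVAR_REQUIREMENTS):
    """Fraction of winters classified as low-chill, and with insufficient chill
    for each cultivar, in a given observed, counterfactual, or projected series."""
    chill = np.asarray(chill_series, dtype=float)
    risk = {"low_chill": float(np.mean(chill < threshold))}
    for name, req in requirements.items():
        risk[f"insufficient_{name}"] = float(np.mean(chill < req))
    return risk

# Example usage (illustrative):
# threshold = low_chill_threshold(chill_observed_1981_2017)
# chill_risk(chill_projected_2040_2069, threshold)
```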
3. Results
The average chill accumulation for the observed period 1981–2017 across the SEUS ranged from less than 100 h in southern Florida, to more than 2000 h in the Blue Ridge mountains of northeastern Georgia (
Figure 1b). The majority (>65%) of the region—from northern Florida to northern Alabama, Georgia, and South Carolina—averaged 500–1500 chill hours, including approximately 1100 h in the central Georgia peach-growing region and 1350 h in the Piedmont peach-growing region of South Carolina. With the exception of southern Florida, the 2017 chill accumulation was substantially lower than the 1981–2010 normal. The accumulated chill in 2017 showed an SEUS average anomaly of approximately 330 h below normal. The Georgia peach regions showed an anomaly of ~430 h below normal, and South Carolina peach regions showed an anomaly of ~360 h below normal (
Figure 1c).
The observed average chill accumulation over 1981–2017 was less than that modeled in the absence of anthropogenic climate change, consistent with the expectations from modeled warming (
Figure 2a). A distinct geographic pattern of the reduced chill hours due to climate change was evident across the SEUS, with nominal differences in southern Florida and reductions of more than 120 h in northern Alabama, Georgia and South Carolina. The peach-growing regions showed average reductions of ~115–120 chill hours in Georgia and South Carolina. Notably, these reductions are averages over the 37-year period as the modeled estimate was larger in more recent years. Complementary to average reductions in chill hours, the percent of years experiencing low winter chill was substantially higher across the SEUS over the 1981–2017 period than it would have been in the absence of climate change (
Figure 2b). These trends were found across models: the 23-model range for climate change-driven declines in chill was ~68–140 h, while the probability of a low-chill winter in the counterfactual scenarios ranged from 1.6% to 4.6% of years (compared with the 10% of years that, by definition, were low-chill in the observed record).
The reductions in chill accumulation and increases in the occurrence of low-chill winters may be inconsequential for agriculture unless there are direct impacts to plant physiology or indirect crop impacts (e.g., pathogens, pests). For the three peach cultivars, we show that the Georgia peach-growing region had five winters from 1981–2017 that did not accumulate sufficient chill for the high-chill cultivar (
Figure 3a). No winters in the South Carolina peach-growing regions had insufficient chill for the cultivars considered from 1981–2017 (
Figure 3b). By contrast, the counterfactual scenarios all showed greater chill accumulation and a reduced occurrence of winters with insufficient chill for high-chill cultivars in Georgia. Notably, the chill accumulation in 2017 would still have been the lowest in the 37-year period in Georgia without climate change, suggesting that it was primarily driven by natural variability. However, the estimated 2017 chill accumulation excluding the modeled first-order influence of climate change for the peach-growing area of Georgia ranged from ~760 to ~920 chill hours across the 23 models, with a median of ~825 h, well above both the 650 chill hours required for the moderate-chill cultivar and the ~660 chill hours observed that winter.
Projected chill accumulation across the SEUS declined relative to contemporary 1981–2017 averages for both the early and mid-21st century periods, with multi-model mean SEUS declines of ~100 h and ~185 h, respectively (
Figure 4a,b). The geographic patterns of reductions in chill hours were similar to those shown for the influence of modeled climate change for the 1981–2017 period. Over Georgia (South Carolina) peach-growing regions, the average declines in chill were calculated as ~110 (~135) hours by the early 21st century and ~210 (~250) hours by the mid-21st century. In addition to declines in the average chill accumulation, the probability of experiencing a year with low winter chill accumulation increased. Averaged across the SEUS and across all models, approximately 20% of years by the early 21st century and 40% of years by the mid-21st century experienced low winter chill, with the greatest increases across western and northern Alabama, northern and central Georgia, and northern and central South Carolina (
Figure 4c,d). By the early and mid-21st century, Georgia (South Carolina) peach regions saw ~15% (30%) and 32% (52%) of years with low winter chill, respectively.
With respect to peach cultivar-specific chill requirements, 23% (4%) of years showed insufficient chill for the high- (moderate-) chill cultivar in prime peach-growing counties in Georgia by the early 21st century, rising to 43% (11%) of years by the mid-21st century (
Figure 5a). The peach-growing regions in South Carolina, which did not see chill accumulations below the established thresholds from 1981–2017, had 5% (0.25%) of years with insufficient chill for the high-chill (moderate-chill) cultivar by the early 21st century, and 12% (1.5%) of years by the mid-21st century (
Figure 5b). Notably, there was substantial inter-model variability in the risk of winters with insufficient chill. For example, the percent of winters with insufficient chill for the moderate-chill cultivar in Georgia ranged from 0–16% for the early 21st century, and 0–30% for the mid-21st century. By contrast, chill accumulation was sufficient for the low-chill peach cultivar under both future time periods in both states’ peach-growing regions.
4. Discussion
Recent studies have shown that some extreme events around the globe would not have been possible without the influence of human-induced warming [
35,
36,
37]. Temperatures in the SEUS during the October–February chill accumulation periods of 2015–2016 and 2016–2017 were the 2nd and 3rd warmest since 1895, with the 1931–1932 winter being the warmest [
38], suggesting that such warm winters are possible within the bounds of natural variability and can occur without significant contributions from anthropogenic climate change. While we do not undertake a detailed attribution analysis, our modeling exercise suggests that recent insufficient chill accumulation in the SEUS peach regions, such as in 2017, would likely not have occurred under the same synoptic conditions in the absence of climate change. Further, our results showing an increased probability of low-chill winters due to climate change add to the growing body of literature defining the contribution of anthropogenic climate change to observed adverse climate impacts [
27,
39,
40].
Although insufficient chill accumulation is not a principal cause of loss for federally-insured crops in the SEUS [
41,
42], previous work has postulated that projected declines in chill may reduce suitability for perennial crop production [
19,
43]. Similarly, the projected future declines in chill accumulation across the SEUS complement previous work showing increases in the average and coldest winter minimum temperatures [
16], and declines in chill accumulation in regions around the globe [
21]. While this warming may offer range expansion for cold-intolerant crops, the related reduction in the winter chill accumulation in subtropical climates like the SEUS is projected to have negative impacts on warm-region fruit and nut crops, particularly those with moderate and high chill requirements [
20,
21]. However, the degree to which these declines may impact crop yield is unclear as uncertainties remain regarding the chill requirements that are physiologically needed for production, and the overall effect of marginal chill accumulation on crop yield and quality [
44,
45]. For example, while a common commercial peach cultivar grown in central Georgia has a stated chill requirement of 850 h, Georgia peach specialists have suggested that only 800 h are needed for a suitable crop [
4]. Consequently, we underscore that this work is not predictive of yield impacts related to reduced chill accumulation.
Compounding the uncertainty in crop chill requirements is the accuracy of the chilling model itself. While the Chilling Hours Model has been widely used for quantifying crop chill requirements, it may be overly sensitive to warming, potentially overestimating the impact of climate change [
43]. However, while it is acknowledged that previous studies have shown that the Dynamic Model may provide a more accurate representation of chill accumulation [
21], the 20-model mean changes in average chill accumulation consistently show declines across the SEUS and other warm-winter regions, regardless of the chilling model used (
Figure 6). Further, we recognize that familiarity with chill portions (the units of the Dynamic Model) may be lacking among extension agents and fruit industry professionals (Pamela Knox, University of Georgia Agricultural Climatologist, personal communication), and that regionally-defined chill portion thresholds do not yet exist for SEUS peach cultivars (Dario Chavez, personal communication). Finally, we acknowledge the limitations of using temporally disaggregated daily data [
46], and that the microclimates of orchard sites and orchard management practices may augment or abate the projected changes and impacts.
Despite research suggesting that declines in crop suitability due to climate change may not be as severe as shown in our results [
45], it is worth noting that we examined changes in chill accumulation under a conservative, moderate warming scenario. Provided that some degree of reductions in suitability are anticipated for peach crops across the SEUS—as well as for other crops with similar chill requirements—adaptive measures may be warranted to maintain production. These measures may include altering orchard management practices and selective planting. For existing orchards, the application of chemicals such as hydrogen cyanamide may effectively break dormancy in insufficiently-chilled peach crops [
47], overhead irrigation to encourage evaporative cooling may aid chill accumulation, and orchard management practices such as controlling tree vigor may help to lower the chill needed for successful bud break [
48]. For future orchards, site selection with preferential planting in sites with cooler microclimates, such as low-lying cool-air sinks, may provide an opportunity to increase exposure to chilling temperatures. Orchard managers may also consider specific scion and rootstock combinations that may help mitigate the negative impacts of low chill [
49]. Moreover, a transition to crop cultivars with lower chill requirements (e.g., Gulfcrest or other varieties developed for warmer climates) may reduce or eliminate the negative impacts of declining chill accumulation under climate change, as evidenced by the minimal impact of future warming on the low-chill cultivar examined in this study. However, it is noted that orchards planted in cool-air microclimates may be at increased risk of frost damage, and lower-chill cultivars may be more susceptible to early bloom and subsequent frost damage. While quantifying the complex relationships between chill accumulation, bloom, and the relative risks of insufficient chill and spring frost damage is beyond the scope of this work, the interactions between these physiological and climatic conditions highlight the need to consider a broader suite of environmental and economic factors in planning for future orchard management.
As has been suggested for perennial crop adaptation in other regions [
19,
50], the translocation of crops to cooler climates may also provide an adaptive measure for maintaining peach cultivation in the SEUS, particularly for those cultivars with higher chilling requirements. Historically, peach cultivation in Georgia extended into the northern portion of the state, but the favorability of that region declined over time due to frequent freeze damage [
51]. If climate change reduces the freeze risk in northern Georgia, the area may provide a refuge within the state for continued cultivation of high-chill peach cultivars and other similarly at-risk perennials. However, any future translocation would require significant capital and be contingent upon economic viability, which is likely to be predicated on factors such as topography and soils, the costs associated with the purchase of farmland and the packing or processing facilities, competing land use, and market forces.