3.1. Sensitivity Analysis
The sensitivity analyses were performed on the 33 model parameters indicated in Table 3. We measured the sensitivity of the root mean squared error (RMSE) metric, calculated using the sampled soil water salinity (ECsw) at the three depth intervals, to ensure that the sensitivity indices are grounded relative to the observed soil salinity. The sample sizes and corresponding numbers of model evaluations required for the Elementary Effect (EE) and Sobol' methods are listed in Table 4. For the EE method, a sample size of n = 1000 with 10 optimal trajectories was used, resulting in 340 model evaluations. Since the Sobol' analysis was performed on the reduced parameter space determined from the prior EE screening, the number of samples selected ranged from 10 to 12, and iterations were discontinued once the sensitivity results converged. A Latin hypercube sampling design with n = 50 sampling points was used to generate a near-random sample of parameter values, as proposed by Reference [56]. Open-source implementations were used for both methods [48]. Convergence was considered acceptable when estimates fell within the 95% confidence interval.
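The EE screening step can be illustrated with a minimal one-at-a-time sketch in Python; this is a simplified radial variant of the Morris design, and the toy response function, bounds, and trajectory count are illustrative placeholders, not the rootzone model itself:

```python
import numpy as np

def elementary_effects(f, bounds, r=10, delta=0.5, seed=0):
    """Morris-style elementary effects: r one-at-a-time perturbations
    of size delta in normalized (unit-cube) parameter space."""
    rng = np.random.default_rng(seed)
    k = len(bounds)
    lo, hi = np.asarray(bounds, float).T
    ee = np.zeros((r, k))
    for t in range(r):
        x = rng.uniform(0.0, 1.0 - delta, size=k)   # base point (unit space)
        y0 = f(lo + (hi - lo) * x)
        for i in range(k):
            xp = x.copy()
            xp[i] += delta                           # perturb factor i only
            ee[t, i] = (f(lo + (hi - lo) * xp) - y0) / delta
    mu_star = np.abs(ee).mean(axis=0)   # mean |EE|: overall influence
    sigma = ee.std(axis=0, ddof=1)      # spread of EE: interaction/nonlinearity
    return mu_star, sigma

# Toy response: one linear factor, two interacting factors, one inert factor
def toy_model(p):
    return 3.0 * p[0] + p[1] * p[2]

mu_star, sigma = elementary_effects(toy_model, [(0, 1)] * 4)
```

Factor 0 is linear, so its elementary effects are constant (large µ*, σ = 0); factors 1 and 2 interact, so their effects vary between trajectories (σ > 0); factor 3 never enters the response and shows µ* = σ = 0, which is the screening signature of a candidate for fixing.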
Among the 33 rootzone model parameters (Table 3), the EE method revealed that only nine strongly influenced the output soil–water EC (ECsw), as summarized in Table 5. Table 5 lists the parameter sensitivity values µ, µ*, µ*_conf, and σ from the elementary effect analysis. The mean of the distribution of the absolute values of the elementary effects (µ*) is a proxy for a total sensitivity index of the ith input factor, and µ*_conf is the 95% confidence interval of µ*. We chose these nine parameters based on the combination of small µ* values with corresponding large σ values (Figure 5). A high σ value indicates that the elementary effects are strongly affected by the choice of the sample points at which they are computed, and therefore that the factor has non-negligible interactions with other factors, as illustrated in Figure 5 and Figure 6.
Next, we applied Sobol' sensitivity analysis for factor fixing using the nine influential factors from the EE results. As summarized in Table 6, we report the first-order (Si) and total-order (STi) sensitivity indices. Of the nine factors, only two were non-influential by this analysis, that is, they resulted in total-order indices STi ≈ 0. These parameters, non-influential towards average root zone salinity, were the capillary fringe height (Hd) and depth to groundwater (Hwt), so they were fixed at average values for the model calibration analysis. Interestingly, the ST values for Tsrz in layers 2 and 3 indicate that saturated soil moisture content is more influential for these layers than Ks, and Tsrz in layer 2 is the most influential parameter in the model. Given that we used the control site for model calibration, and that irrigation of vegetables at this site is managed with sprinklers for establishment and then drip for the remaining plant development stages, we suspect that plant water uptake is mainly from the bottom layers. However, in our model plant water uptake, even with drip, is assumed to be mainly from the top layer (a 60–30–7–3% uptake pattern). Root water uptake (RWU), although included in the sensitivity analysis, was not influential. Most likely, upward flow of soil water from the second layer provides the water required for uptake in the top layer. Soil moisture is directly related to unsaturated conductivity (a driving force moving soil water from wet to dry soil layers). As such, the saturated water content in the second layer ended up being most influential in our calibration. It is important to note that, with respect to crop rooting depth, the lettuce rootzone depth is likely important for the control site specifically, as lettuce was the main crop grown for the majority of the experiment period. For other sites, it may be important to include variability of the crop rooting depth.
3.2. Calibration and Validation
Model calibration was performed by allowing the parameters k1, k2, ZrL, Tsrz, Ksrz, Tfcrz, and Twprz, identified as most sensitive to model determination of soil–water EC (ECsw), to vary within their ranges, and comparing model output to the measured soil salinity at the control site. Model validation was completed by simulating ECsw for site 3. Including the saturated soil water content and saturated hydraulic conductivity in the calibration was crucial, as infiltration rates were expected to vary with changes in exchangeable sodium in the soil. O'Geen [57] provides a classification of salt-affected soils based on trends of soil water EC, exchangeable sodium percentage (ESP), and SAR. We assessed the potential infiltration problems caused by irrigation water quality following Reference [6] (p. 44) and found that a slight-to-moderate reduction in infiltration rates due to irrigation water salinity was expected for the control site and site 4 (Figure A2). It is, however, interesting to note that blending well water and recycled water alleviated the possible adverse effects of well water on soil infiltration rates.
The Latin hypercube sampling design was again used with N = 50 sampling points, as proposed by Reference [56]. Intervals were sampled without replacement to ensure an even distribution of points with respect to each variable. We executed the model 50 times and computed the RMSE associated with each set of model predictions. The open-source global optimization code DEoptim, written in R, was then used to find a global minimum RMSE [58]. DEoptim implements the differential evolution algorithm for global optimization. The estimated best fits with the least RMSE values are listed in Table 7.
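A minimal Python analogue of this workflow (Latin hypercube samples to seed the search, followed by a bare-bones differential evolution loop of the kind DEoptim implements in R) might look like the following; the quadratic "RMSE surface" and its bounds are placeholders, not the calibrated rootzone model:

```python
import numpy as np

def latin_hypercube(n, bounds, seed=1):
    """n samples, stratified per dimension and shuffled independently."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, float).T
    d = len(bounds)
    strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    u = (strata + rng.uniform(size=(n, d))) / n     # jitter within each stratum
    return lo + (hi - lo) * u

def de_minimize(f, bounds, pop=20, gens=150, F=0.7, CR=0.9, seed=1):
    """Bare-bones differential evolution (rand/1/bin) with greedy selection."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, float).T
    d = len(bounds)
    X = latin_hypercube(pop, bounds, seed=seed)     # LHS initial population
    fit = np.array([f(x) for x in X])
    for _ in range(gens):
        for j in range(pop):
            idx = rng.choice([i for i in range(pop) if i != j], 3, replace=False)
            a, b, c = X[idx]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(d) < CR
            cross[rng.integers(d)] = True           # ensure >= 1 gene crosses
            trial = np.where(cross, mutant, X[j])
            f_trial = f(trial)
            if f_trial <= fit[j]:                   # keep trial only if no worse
                X[j], fit[j] = trial, f_trial
    best = fit.argmin()
    return X[best], fit[best]

# Placeholder "RMSE" surface with a known minimum at theta = (1.5, 0.5)
target = np.array([1.5, 0.5])
def rmse(theta):
    return float(np.sqrt(np.mean((theta - target) ** 2)))

theta_best, rmse_best = de_minimize(rmse, [(0.0, 3.0), (0.0, 1.0)])
```

In the study itself the objective is the RMSE between simulated and measured ECsw over the calibration period rather than this toy surface, but the search structure is the same.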
Summarized in Table 8 are the indices associated with comparisons between ECsw measured in the field and model-predicted values at different soil depths. For all depth intervals, the sums of the first-order and total-order indices are greater than one, indicating that there are interactions among model factors. Moreover, for factors whose total-order indices exceed their first-order values, other factors take part in the interaction, such that throughout the soil profile the root zone hydraulic parameters, rooting depth, and dissolution rate all take part in determining the soil-water EC, with the dissolution rate accounting for the largest fraction of output variance. This observation provides insight into how well water flow and salinity are modeled.
Model realizations for the calibration and validation runs were compared with measured ECsw as shown in Figure 7. Model predictions did not capture large values of observed ECsw, though model performance improved with increasing soil depth. The RMSE index indicated large discrepancies between predicted and measured values, hence the greater values at site 3, whereas the R2 values reflect the combined dispersion relative to the single dispersion of the observed and predicted values. The mean relative error (MRE < 30%) for all layers indicates satisfactory model performance, while the larger NSE values for the top soil layer indicate that the modeling effort is worthwhile in predicting near-surface salinity to depths of 0.3 m.
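The goodness-of-fit indices referred to here follow their conventional definitions and can be computed as in the sketch below; the observation/prediction arrays are hypothetical illustrations, chosen to show why R2 alone can look good while bias degrades RMSE and NSE:

```python
import numpy as np

def fit_stats(obs, pred):
    """Conventional RMSE, Nash-Sutcliffe efficiency (NSE),
    mean relative error (MRE, %), and R^2."""
    obs = np.asarray(obs, float)
    pred = np.asarray(pred, float)
    rmse = np.sqrt(np.mean((pred - obs) ** 2))
    nse = 1.0 - np.sum((obs - pred) ** 2) / np.sum((obs - obs.mean()) ** 2)
    mre = 100.0 * np.mean(np.abs(pred - obs) / obs)   # magnitude only, sign-free
    r2 = np.corrcoef(obs, pred)[0, 1] ** 2
    return {"RMSE": rmse, "NSE": nse, "MRE": mre, "R2": r2}

# A uniformly biased prediction: perfect correlation (R2 = 1) but a
# constant 1-unit offset that RMSE and NSE penalize.
stats = fit_stats([1.0, 2.0, 3.0, 4.0], [2.0, 3.0, 4.0, 5.0])
```

This illustrates the contrast drawn above: R2 measures dispersion around the regression line, while RMSE and NSE are sensitive to systematic offsets between predicted and observed ECsw.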
For all calibrated model runs, regressions of the predicted vs. observed values resulted in a non-zero intercept b of nearly 1 dS/m. Taken together with the low R2 values, we conclude that the model persistently underestimates ECsw, especially observed ECsw values greater than ~2 dS/m. Based on the relatively small MRE values, which indicate the magnitude of the error relative to observed values without considering the error direction, the model captures salinity dynamics for all layers in the root zone. However, the Figure 8 plot of residuals vs. predicted ECsw for the calibration and validation results exhibits heteroscedasticity, that is, the residuals grow as the predicted ECsw values increase. Overall, this latter observation suggests that although the NSE, RMSE, and MRE statistics show that the model has some predictive capacity, it does not capture some processes apparently involved in the soil salinity dynamics.
A possible explanation for the differences between measured and predicted soil salinity is that the model does not account for fertilizer and soil amendment management, or for plant root uptake of solutes and fertilizer. Generally, transformations (e.g., dissolution) in the soil of the different chemicals added during fertigation will increase soil salinity. For example, urea is converted to ammonium that is adsorbed in the soil depending on soil temperature; ammonium is converted to nitrate by nitrification, which depends on soil temperature, soil moisture, pH, and oxygen content; and nitrate is highly mobile, whereas ammonium, potassium, and phosphorus remain relatively immobile in the root zone. Although the salt index (SI), based on equivalent units of sodium nitrate and developed in 1943 to evaluate the salt hazard of fertilizers, cannot alone be used to evaluate the effect of fertilizer applications on soil salinity, it can be used as an indicator of long-term effects on soil salinity. The most commonly used fertilizers in the study region (from California Department of Food and Agriculture annual reports) were nitrogen fertilizers that included urea ammonium nitrate solution (SI = 95), ammonium nitrate (SI = 102), and calcium nitrate (SI = 53); phosphorus fertilizer (SI = 7.8–29); potassium sulfate (SI = 46); gypsum; and lime. Sodium nitrate was arbitrarily set at SI = 100, where the EC of a 0.5 to 40 mass-percentage solution of sodium nitrate is 0.54 to 17.8 dS/m, and for a mixture of materials it is reasonable for horticultural purposes to assume that EC is additive. As such, the lower the index value, the smaller the contribution the fertilizer makes to the level of soluble salts. Thus, fertilizer applications likely add to soil salinities, exacerbating the problem over time.
Although the model underestimates ECsw, it adequately captures the salinity trends of salt leaching during winter months and of increasing salinity with water applications and ET during the crop season. In an effort to evaluate the performance of transient vs. steady-state models, Reference [12] concluded that transient models better predict the dynamics of the chemical–physical–biological interactions in an agricultural system. However, since we account for irrigation water salinity, rainfall salinity, and dissolution of salts in the soil, but exclude additions of fertilizer and soil amendments, our simulated ECsw values can be viewed as a likely lower bound of the soil salinity associated with the irrigation and farm management practices considered in the model description.
Another complexity possibly affecting model prediction is the spatial distribution of salinity under drip irrigation, as noted by Reference [59]. They used the transient Hydrus-2D model to compare results between field experiments having both drip- and sprinkler-irrigated processing tomatoes under shallow water table conditions for a wide range of irrigation water salinities. Both field and model results showed that the soil-wetting patterns occurring under drip irrigation caused localized leaching concentrated near the drip line. In addition, a high-salinity soil volume was found near the soil surface that increased with increasing applied water EC. Overall, localized leaching occurred near the drip line, while soil salinity increased with increasing distance from the emitter and with increasing soil depth. Such localized non-uniformities in leaching are not captured in the one-dimensional model but may have affected field soil sampling in the drip-irrigated fields of our study region. That is, soil samples collected some distance from where dripline emitters were previously operating would likely have greater salinities than would otherwise occur under the uniform leaching and dissolution conditions assumed in the model. Nonetheless, it is important to note that this model is user-friendly and less data intensive, and it can be very useful for setting reference benchmarks of the long-term salinity impacts of using saline water for irrigation.
3.3. Long-Term Salinity with Treated Wastewater Irrigation
We used the calibrated and validated model to simulate long-term (50-year) soil salinity in the fields irrigated with varying fractions of treated wastewater, that is, we applied the model to the control site and sites 2 to 7 (Table 2 and Table 3). The fifty-year simulations used rainfall and ETo data randomly selected from historical records (1983 to 2014) for the years 2013 to 2049, repeating the 13-year cropping patterns and irrigation management. In the model calibration, the groundwater table height (Hwt) and groundwater salinity (ECgw) were found to be non-influential parameters with respect to the measured soil water EC (ECsw). As such, average values of Hwt and ECgw measured at the monitoring well located ~5 km west of the study site were used, that is, 0.95 m and 0.97 dS/m, respectively. For each simulated case, the three output variables of interest were the root zone salinity averaged over the growing season, expressed as ECeS (dS/m); the annual drainage salt output load, Sd (kg/ha/year); and the annual runoff salt output load, Sr (kg/ha/year).
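Salt output loads of this kind can be related to a flow depth and its EC through the common approximation TDS (mg/L) ≈ 640 × EC (dS/m); the conversion below is a generic sketch of that arithmetic, not the model's internal formulation, and the example numbers are illustrative:

```python
def salt_load_kg_ha(depth_mm, ec_ds_m, mg_l_per_ds_m=640.0):
    """Approximate salt load (kg/ha) carried by a water flux.

    1 mm of water over 1 ha = 10,000 L; TDS (mg/L) ~ 640 * EC (dS/m).
    """
    tds_mg_l = mg_l_per_ds_m * ec_ds_m
    litres_per_ha = depth_mm * 10_000.0
    return tds_mg_l * litres_per_ha / 1e6   # mg -> kg

# Example: 100 mm of drainage at EC = 1 dS/m carries ~640 kg/ha of salt
load = salt_load_kg_ha(100.0, 1.0)
```

This makes explicit why the runoff loads reported below dwarf the deep-percolation loads: runoff volumes are far larger even where their EC values are comparable.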
Values of the annual average water-balance terms over the 13-water-year simulation at each site are summarized in Table 9a, and for the 50-water-year simulation in Table 9b. Somewhat greater actual crop water uptake (ET) is achieved in drip- and sprinkler-irrigated fields with similar crop rotations (sites 2 and 3 as compared to 6 and 7). Leaching fractions were generally very low for all sites, at less than 2%. The greatest leaching occurred in the vegetable–strawberry rotation that was first sprinkler and then drip irrigated, while the other irrigation and cropping practices yielded similar leaching volumes, with the exception of no leaching from drip-irrigated artichokes. Similarly, surface runoff is smaller from drip- or sprinkler-irrigated fields than from sprinkler-furrow-irrigated fields.
Simulated average (50-year) annual root zone soil water salinity for all sites is shown in Figure 9. Sites managed with sprinkler or drip irrigation and with higher-salinity water (ECw) had higher estimated annual average ECsw; for example, at sites 3, 4, and 5, ECsw was 2.94 dS/m, 3.15 dS/m, and 3.44 dS/m, respectively. For sites 6 and 7, which used sprinklers for germination and then furrow irrigation on a vegetable and strawberry rotation, the latter site received water with higher salinity than the former (1.37 dS/m compared to 1.1 dS/m), but the resulting average annual soil water salinities differed little, that is, 2.05 versus 2.89 dS/m, respectively. The control site irrigated with well water had the least annual average rootzone soil–water salinity. Furthermore, a soil–water salinity equilibrium of ECsw ≤ 2.0 dS/m was maintained throughout the 50-year horizon for the control site irrigated with well water, and was reached after 12 years of irrigation with blended wastewater for sites 2, 6, and 7, whereas for sites 3, 4, and 5 the soil water EC increased above 2.0 dS/m over the simulation period. Using Mann–Kendall analysis, we found that actual ET had a positive and significant association, whereas irrigation amounts had a negative and significant association, with ECsw (Tau = 0.321, p-value = 0.016 and Tau = −0.268, p-value = 0.046, respectively). The Mann–Kendall Tau values indicate the strength and direction of monotonic trends, with −1 and 1 representing perfectly negative and positive monotonic trends, respectively, while the p-value indicates relative significance [60].
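The Mann–Kendall statistics reported here (Tau and p-value) can be reproduced with a short routine; this is the standard tie-free formulation with a normal approximation, a simplified counterpart to the R 'Kendall' package used in the study:

```python
import math
import numpy as np

def mann_kendall(y):
    """Mann-Kendall trend test (no tie correction): returns Kendall's Tau
    and a two-sided p-value from the normal approximation."""
    y = np.asarray(y, float)
    n = len(y)
    # S: concordant minus discordant pairs over time ordering
    s = sum(np.sign(y[j] - y[i]) for i in range(n) for j in range(i + 1, n))
    tau = s / (n * (n - 1) / 2.0)
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / math.sqrt(var_s) if s != 0 else 0.0  # continuity corr.
    p = math.erfc(abs(z) / math.sqrt(2.0))
    return tau, p

# A strictly increasing series has Tau = 1 and a small p-value
tau, p = mann_kendall(np.arange(10.0))
```

Because the test is rank-based, it detects monotonic associations without assuming linearity, which suits noisy annual ECsw series.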
Platts and Grismer [11] found that salt leaching to deeper soil layers occurred during the rainy season (October–March), while during the growing season soil–water EC increases in soil layers near the surface due to evapo-concentration; in addition, applied water salinity causes soil–water EC spikes during the growing season. Rainfall was important for salt leaching from the root zone at all sites, as evident during the wet years 2010, 2016, 2025, 2026, 2038, and 2046, when average annual soil water EC decreased. With the exception of sites 3, 4, and 5, there were no upward trends in soil water salinity over the 50-year period. Overall, the relatively constant soil–water EC after the 50-year simulation, with ECsw less than the initial ECsw of 2.19 dS/m for all sites except site 5, suggests that there was adequate soil leaching in the region for sustained use of the treated wastewater for irrigation. However, the question remained as to what level of soil salinity would be acceptable, especially for annual strawberry production.
Crops are generally assumed to respond to the seasonal-averaged root zone salinity of the saturated paste (ECeS), and yield loss thresholds and rates of decline with increasing salinity have been determined based on the salinity thresholds in [61]. We calculated ECeS for all the sites and plot the range of ECeS in Figure 10. The maximum seasonal-averaged saturated paste EC for each site was 0.19, 0.27, 0.20, 0.23, 0.24, and 0.26 dS/m for sites 2, 3, 4, 5, 6, and 7, respectively. These values are less than half the lowest Maas–Hoffman threshold value of 1.0 dS/m associated with the most sensitive crop (strawberry) in the rotations considered. As such, it is unlikely that long-term irrigation with treated wastewater in the region will significantly affect crop yields.
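The Maas–Hoffman response underlying this comparison is the piecewise-linear relative yield Yr = 100 − b(ECe − a) for ECe above the threshold a. The sketch below uses the commonly tabulated strawberry parameters (threshold a = 1.0 dS/m, slope b = 33% per dS/m) as an illustration; other crops in the rotations have their own tabulated pairs:

```python
def relative_yield(ec_e, threshold=1.0, slope=33.0):
    """Maas-Hoffman piecewise-linear relative yield (%):
    100% below the salinity threshold, declining by `slope` % per dS/m above it."""
    if ec_e <= threshold:
        return 100.0
    return max(0.0, 100.0 - slope * (ec_e - threshold))

# All simulated seasonal-average ECeS values (<= 0.27 dS/m) sit well below
# the strawberry threshold, so no yield loss is predicted
y = relative_yield(0.27)
```

This is the quantitative basis for the conclusion above: even the most sensitive crop in the rotations sits on the flat (no-loss) segment of its response curve at the simulated salinities.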
In terms of the possible adverse environmental effects associated with salinization of surface and ground waters in the region, we determined the cumulative salt output load with deep percolation (Sd) and with runoff (Sr) during the 50-water-year simulation period for the different sites, as shown in Figure 11. Salts accompanying surface runoff pose the larger threat in the watershed, as these are an order of magnitude greater than the cumulative salt output loads to groundwater. Salt loading with deep percolation for the 50-year simulation ranged up to 3377 kg/ha, with the greatest loads from site 3 and minimal loading from sites 4 and 5 (40 kg/ha and 49 kg/ha, respectively), on which annual artichoke crops were grown. Cumulative salt loads accompanying runoff ranged from 19,918 to 59,552 kg/ha, with the greatest loading emanating from site 3 and the least from site 4. In comparison, for the control site irrigated with well water, the cumulative salt output load with deep percolation was 1325 kg/ha and the salt output load with runoff was 21,505 kg/ha.
To clarify which factors were key in driving adverse environmental salinization within the region, we tested the effects of applied water EC (ECw) and depths, rainfall depths, actual and potential crop ET, and the number of fallow days on soil water EC (ECsw) and on the salt output loads with runoff (Sr) or deep percolation (Sd), using the non-parametric Mann–Kendall trend analysis (the 'Kendall' test in the R package [60]). With respect to the Mann–Kendall Tau values (Table 10), the applied water EC (ECw) and rainfall depths had positive and significant effects on the annual average soil water EC (ECsw) and on the annual salt output loads with runoff (Sr) and deep percolation (Sd). Calculated actual crop ET had a positive and significant effect on ECsw; this is expected, as water uptake by the plant and evaporation leave salts behind. On the other hand, actual crop ET had a negative and significant effect on Sd and no significant effect on Sr. Applied water depths had positive and significant effects on Sd and Sr. The number of days fields were fallowed had a negative effect on ECsw and a positive effect on Sd and Sr. Overall, these observations were consistent with expectations from the field observations described above.