Article

Introduction of an Experimental Terrestrial Forecasting/Monitoring System at Regional to Continental Scales Based on the Terrestrial Systems Modeling Platform (v1.1.0)

1 Institute of Bio-Geosciences (IBG-3, Agrosphere), Forschungszentrum Jülich GmbH, 52425 Jülich, Germany
2 Centre for High-Performance Scientific Computing, Geoverbund ABC/J, 52425 Jülich, Germany
3 Jülich Supercomputing Centre, Forschungszentrum Jülich GmbH, 52425 Jülich, Germany
4 Meteorological Institute, Bonn University, 53121 Bonn, Germany
5 Forecast Department, European Centre for Medium-Range Weather Forecasts, Reading RG2 9AX, UK
* Author to whom correspondence should be addressed.
† Now at CISS TDI GmbH, Sinzig, Germany.
‡ Now at the Department of Environment, Ghent University, Belgium.
§ Now at Environmental Computing Group, Leibniz Supercomputing Centre, Germany.
‖ Now at the Department of Environmental Research and Innovation, Luxembourg Institute of Science and Technology, Luxembourg.
Water 2018, 10(11), 1697; https://doi.org/10.3390/w10111697
Submission received: 26 October 2018 / Revised: 15 November 2018 / Accepted: 16 November 2018 / Published: 21 November 2018

Abstract
Operational weather and flood forecasting has been performed successfully for decades and is of great socioeconomic importance. Up to now, forecast products have focused on atmospheric variables, such as precipitation and air temperature, and, in hydrology, on river discharge. Considering the full terrestrial system from the groundwater across the land surface into the atmosphere, a number of important hydrologic variables are missing, especially with regard to the shallow and deeper subsurface (e.g., groundwater), which are gaining considerable attention in the context of global change. In this study, we propose a terrestrial monitoring/forecasting system based on the Terrestrial Systems Modeling Platform (TSMP) that predicts all essential states and fluxes of the terrestrial hydrologic and energy cycles from the groundwater into the atmosphere. Closure of the terrestrial cycles provides a physically consistent picture of the terrestrial system in TSMP. TSMP has been implemented in a real-time forecast/monitoring workflow over a regional domain covering North Rhine-Westphalia and a continental domain covering Europe. Applying this workflow over both domains, experimental forecasts with different lead times have been produced since the beginning of 2016. The real-time forecast/monitoring products encompass all compartments of the terrestrial system, including additional hydrologic variables, such as plant available soil water, groundwater table depth, and groundwater recharge and storage.

1. Introduction

Weather forecasting operates at global and local scales, providing predictions from minutes, hours, and days to months ahead on spatial scales from 100 m to several kilometers [1]. Most national weather services, such as the German national weather service DWD (Deutscher Wetterdienst) [2] and the European Centre for Medium-Range Weather Forecasts (ECMWF) [3], develop and operate such forecasting systems. Weather forecasting systems constitute a major infrastructure consisting of state-of-the-art software stacks and supercomputing hardware, real-time data streams of in-situ and remotely sensed observations, and global data exchange infrastructures [4]. In operational forecasting, the software stack consists of a number of sophisticated components, including pre- and post-processing tools, the numerical weather prediction (NWP) model, and data assimilation (DA) software, which constitute the kernel of the prediction system and comprise decades of natural science findings and scientific software design knowledge (e.g., [3,5,6,7]). In essence, these prediction systems provide real-time forecasts of the states and fluxes in the atmosphere based on a set of initial conditions (IC) and, in the case of regional forecasts, lateral boundary conditions (BC), which are continuously corrected with observations through data assimilation. In many atmospheric centers, the software stacks are implemented on some of the most powerful parallel computing and storage hardware available today, providing the required capacity and redundancy. Therefore, in terms of operational forecasting, the national weather services represent the state of the art [5]. In addition, a large number of post-processed products based on institutional numerical weather prediction results, so-called reanalyses, exist.
Similarly, in hydrologic forecasting of flood events, numerical hydrologic ensemble prediction systems (HEPSs) are paired with DA technologies in operational forecasting frameworks similar to the NWP systems briefly outlined above [8,9,10,11]. A number of HEPSs exist around the world (see, for example, EFAS, the European Flood Alert System [9], and the National Oceanic and Atmospheric Administration (NOAA) National Water Model in the US). In the hydrologic forecast realm, agricultural drought prediction receives attention and poses a number of additional scientific and applied challenges because of long-term memory effects in the physical processes and the scarcity of observations [12]. Often, drought information is made available in so-called drought monitors, which provide recent hindsight information (http://droughtmonitor.unl.edu/) based on input from NWP and atmospheric reanalysis products. It is important that drought forecasts extend further into the vadose zone and groundwater, because these compartments constitute essential water storages relevant for, e.g., public water supply, agriculture, and industry. While water resources management is considered a hot topic worldwide, only very few operational, real-time water resources prediction systems are implemented in an institutionalized fashion today. Examples include The Netherlands Hydrologic Instrument [13] and the US National Water Model, which, however, does not include the groundwater compartment and focuses on stream discharge and land surface variables (http://water.noaa.gov/).
In summary, juxtaposing NWP and HEPS, we note that, while there is a plethora of literature on real-time HEPSs in the scientific community, the large majority of HEPS developments and applications remain in the scientific realm and have not been operationalized. Moreover, NWP and HEPS compartmentalize the earth system into the atmosphere and the land surface-soil domains, instead of providing a complete and physically consistent forecast of the water, energy, and momentum cycles from the bedrock to the top of the atmosphere. In addition, the groundwater compartment is generally missing in the simulations. This situation is, of course, rooted in the complexity and sometimes limited understanding of the system, the lack of adequate computing resources, the objectives of the specific forecast, and legislative mandates.
As a result, consistent forecasts of the complete terrestrial water and energy cycle from the groundwater across the land surface into the atmosphere at scientifically and socioeconomically relevant space and time scales are missing. This also holds for the current state of water resources, which is generally only known from sparse, local in-situ observations and temporal snapshots from remote sensing of the land surface. Thus, society is exposed to a large degree of uncertainty with respect to the current and future state of freshwater resources, the future trajectory of water use and management, and water-related hazards, such as floods and droughts.
In this paper, we present an implementation of a fully automated, regional to continental scale forecasting/monitoring system from the groundwater into the atmosphere based on the Terrestrial Systems Modeling Platform, TSMP [14,15]. The overarching goal of the system is to extend the atmosphere-centric view of forecasting to the complete terrestrial ecohydrological system, ultimately including the biosphere. TSMP, developed within the framework of the Transregional Collaborative Research Centre TR32 [16], has been applied in a wide variety of studies ranging from the regional to the continental scale, with generally reasonable agreement with observations [17,18,19,20,21]. In addition, human water use has been implemented, and its potential added value has been assessed in comparison to precipitation and evapotranspiration products [22,23].
In this study, the term monitoring system is preferred in order to emphasize the reciprocity of model-based simulations and observations in the application of data assimilation technologies: models are used to interpolate (in space and time) between observations, providing information everywhere and at any time, while observations are used to correct the simulations, acknowledging model and observational uncertainties. Currently, the monitoring system produces time-lapse images of all relevant states and fluxes of the terrestrial water and energy cycle (Table 1) at the applied model resolution, including uncertainty estimates. The system is considered a blueprint of an operational system, ultimately merging all available observations from the subsurface to the atmosphere with a physics-based, high-resolution terrestrial model to provide best estimates of the current state of the system and to generate forecasts/projections at resolutions relevant for scientists and stakeholders. Here, we highlight the technical requirements, the experimental design, and considerations on improving prediction accuracy with precipitation radar integration as preliminary steps toward data assimilation. Validation of the simulation results will be the subject of a future study honoring the coupling of all variables of the terrestrial water and energy cycle and including observation uncertainty; new approaches must be developed to account for mass and energy closure from the groundwater into the atmosphere. In the following, we outline and describe the components of the monitoring system, provide information on the workflow and details of the automation, show example forecast products, and discuss the path forward.

2. Materials and Methods: Terrestrial Monitoring Systems’ Components and Domains

The kernel of the Terrestrial Monitoring System (TMS) consists of a number of simulation software and hardware components and data streams, which are linked in a scripted, automated workflow (Section 3). The software components encompass the actual forward or forecast model TSMP (Terrestrial Systems Modeling Platform, version 1.1.0), which was proposed by [14] and installed on a Linux cluster of the Jülich Supercomputing Centre. TSMP has been implemented over a regional and a continental model domain, which are slightly smaller than North Rhine-Westphalia (NRW) and larger than the pan-European domain (EU), respectively (Figure 1). Additionally, the hydrologic state of the TSMP model setup for the NRW domain is corrected with the help of precipitation radar observations to improve the model initialization and the ensuing forecast products. The TMS follows a specific monitoring clock, which is mainly determined by the availability of boundary conditions from operational weather forecasting centers and by observation periods. Finally, a number of monitoring products are produced and published online.

2.1. Software and Hardware Components of the Terrestrial Monitoring System, TMS

The dynamic kernel of the TMS is the parallel TSMP, which is a scale-consistent, fully integrated groundwater-to-atmosphere model conserving moisture and energy across all compartments of the geo-ecosystem for different domains. TSMP consists of the numerical weather prediction system (COSMO, v4.21) of the German national weather service (DWD, [24]), the Community Land Model (CLM, v3.5, [25]), and the variably saturated surface-subsurface flow code ParFlow (v3.1, [26,27,28]). The interested reader is referred to the aforementioned publications and [14] for a detailed description of the modeling system. In order to exchange fluxes and states across the individual component models of TSMP and close the terrestrial water and energy cycle, the Ocean-Atmosphere-Sea-Ice-Soil coupler interfaced with the Model Coupling Toolkit (OASIS3-MCT [29]) is used. The porting, profiling, optimization, and scalability of TSMP in a massively parallel supercomputing environment have been documented in [15]. The unique feature of TSMP is that all states and fluxes of the terrestrial water and energy cycle from the groundwater across the land surface into the atmosphere are simulated in a mass- and energy-conservative and scale-consistent implementation. Thus, the platform simultaneously provides consistent predictions in all terrestrial compartments, i.e., the subsurface, land surface, and atmosphere.
In the current setup, which has been running continuously since January 2016, the bulk of the monitoring simulations is performed on the Jülich Research on Exascale Cluster Architecture (JURECA) high-performance computing (HPC) system at the Jülich Supercomputing Centre (JSC) in Germany. JURECA is a 1872-node (45,216 cores) supercomputer based on the T-Platforms V-class server architecture with a peak performance of 1.8 (CPU) + 0.44 (GPU) petaflops. The nodes consist of two 12-core Intel Xeon E5-2680 v3 Haswell CPUs with three different DDR4 memory configurations (128 GB, 256 GB, 512 GB), and are connected through a Mellanox Enhanced Data Rate (EDR) InfiniBand network with fat-tree topology and a 100 GB per second connection to the General Parallel File System (GPFS) storage cluster (https://www.fz-juelich.de/ias/jsc/).
The cluster uses the CentOS 7 Linux distribution (https://www.centos.org) as the operating system and handles resource management with the Slurm batch system and ParaStation. Also relevant for this work is the fact that cronjobs (tasks scheduled in the background) are prohibited, ssh (secure shell) connections are secured by Rivest-Shamir-Adleman (RSA) key pairs, and the compute nodes do not have internet access. This is commonly the case in supercomputing environments that are not designed and operated for operational use, and it must be taken into account in the design of the workflow. The maximum wall-clock time is 24 h, and maintenance downtime affects the whole system for several hours on a regular basis (on demand 1–2 times per month).
Part of the ensemble simulations of the NRW domain was performed on JUQUEEN, JSC's 5.0 petaflop IBM Blue Gene/Q supercomputer with 458,752 cores and 448 TB main memory. JUQUEEN's 28 racks are further divided into 32 nodeboards, which are the smallest allocatable units, each consisting of 32 nodes with 16 IBM PowerPC A2 CPU cores (4-way simultaneous multithreading). Each node has 16 GB DDR3 SDRAM and is connected through a 5-D torus (4 × 4 × 4 × 4 × 2 configuration) network with very high peak bandwidth (40 GB·s−1). JUQUEEN's job submissions are handled by the LoadLeveler job scheduling system. JUQUEEN was decommissioned in May 2018 and replaced by a new supercomputing architecture, which will be used to generate the ensemble in the future.

2.2. Monitoring Domain Setup

2.2.1. The North Rhine-Westphalia Domain

The regional scale domain (Figure 1) covers the south-western part of Germany's North Rhine-Westphalia (NRW), including parts of Belgium, Luxembourg, and the Netherlands. It contains the catchments of the rivers Rur, Erft, and Swist, and partially those of the Rhine, Moselle, and Meuse. The domain is 150 × 150 km in size and has a convection-permitting resolution of 1 km for the atmosphere and 0.5 km for the land surface and subsurface, resulting in grid sizes of 150 × 150 and 300 × 300 with regard to nx and ny, respectively. The number of cells in the vertical dimension is nz = 30 for ParFlow, nz = 10 for CLM, and nz = 50 for COSMO. In the atmospheric model COSMO, the time step size is determined by the spatial discretization and set to dt = 10 s. Time integration of the relevant exchange fluxes with the land surface and subsurface models CLM and ParFlow is performed by OASIS3-MCT at a 900 s interval, which is also the time step of CLM and ParFlow and the coupling time step between COSMO, CLM, and ParFlow. Between 2016 and 2018, the atmospheric initial conditions and hourly boundary conditions were provided daily by a deterministic forecasting run of the DWD, uploaded to a File Transfer Protocol (FTP) server at the Meteorological Institute of the University of Bonn, and then ingested automatically into the TMS every 3 h with a maximum lead time of 25 h. Note that in May 2018, DWD changed their data policy and provision, which resulted in a hiatus in the boundary conditions from the operational forecast. Currently, the NRW monitoring system is adapted to ingest initial and atmospheric boundary information from the 12.5 km resolution European monitoring system driven by European Centre for Medium-Range Weather Forecasts (ECMWF) forcing data described below, via one-way nesting every 6 h and with a maximum lead time of 72 h. The hydrologic boundary conditions are of the Dirichlet type and apply a constant pressure head profile. Hydrologic initial conditions are obtained from the previous day's forecast, resulting in a continuous hydrologic time series. Details of the parameterizations and physical parameter values can be found in [14,21]. The forecast is distributed over two nodes on JURECA and requires approximately 1 h of compute (wall-clock) time. In a more recent development, the initial conditions are improved continuously via hindsight simulations based on assimilated rain radar information.
In the groundwater-to-atmosphere simulations of the monitoring system, precipitation rates and patterns are always a challenge to model [30]. In weather forecasting, incorrect precipitation fields propagate into the water and energy fluxes of the lower land surface boundary condition, i.e., evapotranspiration and sensible heat flux, which in turn influence the aforementioned atmospheric processes through two-way feedbacks. However, in the operational setup of the weather services, this error propagation is alleviated via data assimilation, which constantly corrects the atmospheric and land surface states by means of observational data during the assimilation phase of the forecast. In contrast, in the current TMS, incorrect precipitation fields are propagated directly into the land surface and groundwater compartments, because of the fully coupled nature of the TMS and the lack of strongly coupled data assimilation capabilities across the different terrestrial compartments, which is still the subject of scientific research [31,32]. In a first correction step for the NRW domain, the precipitation fields from the monitoring run are replaced by rain radar information from high-resolution X-band radar composites at the end of the monitored day. While rain rates from radar observations are also error-prone, we expect an overall improvement in comparison to the predicted rain rates from NWP. The atmospheric forcing data set with the observed rain rates is then used to perform an offline subsurface-land surface hydrologic simulation in order to improve the states and fluxes, and, thus, the initial condition for the following day. This simulation is called the correction run in the following sections.
The rain radar products covering the North Rhine-Westphalia (NRW) domain are provided by the Meteorological Institute, Bonn University. They are composed of the measurements of the polarimetric X-band radar at the Meteorological Institute in Bonn (BoXPol) and the polarimetric X-band radar in Jülich (JuXPol); for technical details of the radars, see [33]. The twin radars provide volume scans consisting of plan position indicators (PPIs) measured at different elevation angles every 5 min in order to monitor the 3D structure of precipitating clouds in the NRW domain. The rain product is based on the lowest level of the 3D composite and is transferred for ingestion into the monitoring system at the end of each day. For details of the rain product, the reader is referred to [34].

2.2.2. The Pan-European Domain

The continental scale pan-European domain (EU, Figure 1) conforms to the World Climate Research Programme's Coordinated Regional Downscaling Experiment (CORDEX) EUR-11 model grid specification, which covers Europe and parts of northern Africa, western Russia, and Asia. The domain is 5450 × 5300 km in size and discretized at a resolution of 12.5 km, resulting in a grid size of 436 × 424 with regard to nx and ny. The number of cells in the vertical direction is nz = 15 for ParFlow, nz = 10 for CLM, and nz = 50 for COSMO, reaching depths and heights of 103 m, 3 m, and about 21 km, respectively [19]. The time step size is set to dt = 60 s for COSMO and dt = 1 h for ParFlow and CLM, which is also the coupling frequency. The atmospheric initial conditions and hourly boundary conditions are provided on a daily basis from a high-resolution forecasting run of the ECMWF. The hydrologic boundary conditions are of the Dirichlet type and mimic the sea surface with a shallow water table depth along the coastlines [19]. Hydrologic initial conditions are obtained from the previous day's forecast, resulting in a continuous hydrologic time series. Due to potential biases, drifts in hydrologic states and fluxes may occur that have not been diagnosed so far. Details of the parameterizations and physical input parameter values can be found in [19]. A forecast of 72 h is performed, which is distributed over 6 JURECA nodes and requires approximately 1.5 h of compute (wall-clock) time.
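For quick reference, the grid and time-stepping parameters of the two setups described above can be collected side by side, as in the following illustrative Python snippet; the dictionary keys are our own shorthand, not TSMP configuration names.

```python
# Illustrative side-by-side summary of the two monitoring setups described
# in Sections 2.2.1 and 2.2.2; key names are our own shorthand.
DOMAINS = {
    "NRW": {
        "extent_km": (150, 150),
        "dx_km": {"cosmo": 1.0, "clm_parflow": 0.5},   # convection-permitting
        "nx_ny": {"cosmo": (150, 150), "clm_parflow": (300, 300)},
        "nz": {"parflow": 30, "clm": 10, "cosmo": 50},
        "dt_s": {"cosmo": 10, "clm": 900, "parflow": 900},
        "coupling_dt_s": 900,
        "forcing": "DWD deterministic run; now nested into the EU run",
        "lead_time_h": 25,                             # 72 h since EU nesting
    },
    "EU": {
        "extent_km": (5450, 5300),
        "dx_km": 12.5,                                 # CORDEX EUR-11 grid
        "nx_ny": (436, 424),
        "nz": {"parflow": 15, "clm": 10, "cosmo": 50},
        "dt_s": {"cosmo": 60, "clm": 3600, "parflow": 3600},
        "coupling_dt_s": 3600,
        "forcing": "ECMWF high-resolution forecast",
        "lead_time_h": 72,
    },
}
```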

2.3. Monitoring Clock

Figure 2 shows the monitoring clock, i.e., the timeline of the workflows for the NRW and EU domains (Section 3), which is executed automatically on a daily basis. Both workflows are the same, except for the precipitation radar correction, which is only applied for NRW, and the provision of the atmospheric initial and boundary condition information, which comes from the DWD and the ECMWF for NRW and EU, respectively. In the case of NRW, the monitoring clock starts with the retrieval of the precipitation radar products of the University of Bonn shortly after midnight Coordinated Universal Time (UTC), followed by the correction run of the previous day. This run usually requires less than 1 h of compute time. DWD's and ECMWF's atmospheric initial and boundary conditions are made available around 02:00 UTC, and preprocessing starts after the retrieval of the files from an FTP server. The actual monitoring runs with TSMP commence as soon as the initial and boundary conditions are preprocessed (around 02:15 UTC). The TSMP runs usually require between 1 and 1.5 h. In an ensuing step, post-processing is performed, and the forecasts are made publicly available via videos on YouTube around 03:45 UTC (see Section 3.1). The start of the scripts is automatically postponed if required preceding steps are not completed or in case of missing initial and boundary information. The different steps of the monitoring clock are detailed in Section 3.

3. Workflows and Automation

In this section, the workflow for creating a model run and its technical implementation are described, including pre- and post-processing. In addition, the precipitation radar correction approach is outlined in more detail.

3.1. Monitoring EU and NRW

The monitoring workflow for the EU and NRW domains consists of sequentially executed steps detailed below and in Figure 3. Note that both monitoring runs, NRW and EU, are based on the same workflow implemented on the supercomputer JURECA of the JSC. The flowchart in Figure 3 is valid for EU and NRW, with some added complexity due to the precipitation radar correction in the case of NRW, as mentioned above (see Section 3.2 and Figure 4).

3.1.1. Retrieval of Boundary and Initial Conditions

All models based on partial differential equations require initial and boundary conditions to close the mathematical problem. In TSMP, the hydrologic model ParFlow has fixed boundary conditions of the Neumann and Dirichlet type in the case of the EU and NRW domains, and a free overland flow boundary condition at the land surface. Initial conditions are obtained from the previous day's run and, in the case of NRW, from the correction run applying the precipitation radar information. The lateral boundary conditions for COSMO are read from a forcing file, which is obtained by the nesting approach starting from a global NWP model. These boundary conditions, together with the initial conditions, are provided by DWD (for the NRW domain) and ECMWF (for the CORDEX EUR-11 domain) and downloaded from the corresponding FTP servers.
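A minimal sketch of this retrieval step is given below, assuming a hypothetical FTP host and directory layout; the actual DWD and ECMWF servers, accounts, and file naming conventions differ.

```python
# Minimal sketch of the IC/BC retrieval step; the host and the daily
# directory layout are placeholders, not the actual DWD/ECMWF servers.
import ftplib
from datetime import date

FTP_HOST = "ftp.example.org"          # placeholder host
REMOTE_DIR = "/forecast/%Y%m%d"       # placeholder daily directory layout

def fetch_boundary_files(run_day: date, local_dir: str = ".") -> list:
    """Download all IC/BC files for one forecast day."""
    fetched = []
    with ftplib.FTP(FTP_HOST) as ftp:
        ftp.login()                   # anonymous here; real runs use accounts
        ftp.cwd(run_day.strftime(REMOTE_DIR))
        for name in ftp.nlst():
            local = f"{local_dir}/{name}"
            with open(local, "wb") as fh:
                ftp.retrbinary(f"RETR {name}", fh.write)
            fetched.append(local)
    return fetched
```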

3.1.2. Preprocessing

The initial and boundary conditions are obtained from the corresponding global weather prediction systems with a grid spacing of 2.8 km and approximately 12 km from DWD and ECMWF, respectively. These products are then interpolated to the desired resolution (1 km for NRW and 12.5 km for CORDEX EUR-11) and cropped to the final domain size. Boundary conditions for both domains are created by interpolation with the standard preprocessing tool int2lm ([35], v2.01) of the COSMO model. The tool interpolates the initial and boundary conditions (in GRIB or netCDF format) from a coarser source grid and/or larger domain onto the desired destination grid. In addition, int2lm renames and transforms all arrays and parameters into a COSMO-readable format; therefore, both the DWD and the ECMWF data need to be preprocessed by int2lm.

3.1.3. Forward Simulation

The monitoring system with TSMP is designed to perform continuous, real-time monitoring simulations. Nevertheless, the NWP model COSMO is driven by the aforementioned downloaded and preprocessed initial and boundary conditions and is, thus, reinitialized every day. The surface/subsurface models CLM and ParFlow, on the other hand, are restarted from the restart files generated by the previous day's simulation cycle. Due to the slow dynamics in the surface/subsurface, the hydrologic model (i.e., ParFlow-CLM) required an extended spinup simulation with real meteorological forcing prior to the setup of the monitoring system in order to reach a physically consistent dynamic equilibrium. This spinup was performed for both domains in advance of the real-time monitoring, and the first monitoring runs were started from these initial conditions. By running the monitoring system continuously, the equilibrium between the subsurface, land surface, and atmosphere is maintained and automatically extended. All component models of TSMP have their own set of parameter files, usually called namelists (traditional Fortran nomenclature), that define a large set of runtime information and parameterizations. Input information that is necessary for the daily forecast runs, such as dates, paths to forcing files, run directories, or output folders, must be set up consistently across all namelists. This is achieved in a formalized and consistent manner for the three model components by a setup script environment, as sketched below.
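The following minimal sketch illustrates such a setup environment: a single set of values (date, directories, forcing paths) is rendered into per-component namelists from templates. The template file names and placeholder keys are our own, not those of the actual TSMP setup scripts.

```python
# Sketch of a setup-script environment: render one namelist per component
# model from templates so that dates and paths stay consistent. Template
# names and placeholder keys are our own, not TSMP's.
from datetime import date
from pathlib import Path
from string import Template

COMPONENTS = ("cosmo", "clm", "parflow")

def setup_namelists(run_day: date, run_dir: Path, forcing_dir: Path,
                    template_dir: Path) -> None:
    values = {
        "start_date": run_day.strftime("%Y%m%d00"),
        "run_dir": str(run_dir),
        "forcing_dir": str(forcing_dir),
        "restart_date": run_day.strftime("%Y%m%d"),  # previous day's restarts
    }
    run_dir.mkdir(parents=True, exist_ok=True)
    for comp in COMPONENTS:
        tmpl = Template((template_dir / f"{comp}.namelist.tmpl").read_text())
        (run_dir / f"{comp}.namelist").write_text(tmpl.substitute(values))
```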

3.1.4. Postprocessing and Visualization

The component models of TSMP generate output files in different formats, ParFlow binary (PFB) and Network Common Data Format (netCDF), at hourly intervals. An NCL (NCAR Command Language) script converts PFB to netCDF and aggregates the hourly model output files into one single file for additional post-processing and ensuing archiving. The analyses and visualization are done by a set of Python scripts, which read the output files generated by TSMP with the netCDF4 Python library in a first step. The output from, for example, ParFlow only consists of pressure and saturation for a given domain. Together with static fields, such as 2D and 3D fields of porosity, specific storage coefficients, and slopes in the x- and y-directions, the data is then post-processed into the desired analyses, such as surface runoff, groundwater recharge, plant available water, and groundwater table depth. Standard meteorological products and analyses, such as wind speed, temperature, precipitation, and geopotential heights of pressure levels, are also generated. After the analysis of the output variables is completed, frames for each individual analysis and time step (hourly) are plotted to PNG files with the Python libraries matplotlib and basemap. These PNG files are then rendered with FFmpeg into an AVI movie file.
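The following condensed sketch illustrates this visualization stage for a single variable, assuming an aggregated hourly netCDF file and placeholder file/variable names; the map projection (handled with basemap in the operational scripts) is omitted for brevity, and FFmpeg is assumed to be installed.

```python
# Sketch of the post-processing chain: read aggregated hourly TSMP output
# (netCDF), plot one PNG per time step, and render the frames into a movie
# with FFmpeg. File and variable names are placeholders.
import subprocess
import matplotlib
matplotlib.use("Agg")                      # headless rendering on HPC nodes
import matplotlib.pyplot as plt
import netCDF4

def render_variable(nc_path: str, var: str, out_prefix: str) -> None:
    with netCDF4.Dataset(nc_path) as nc:
        data = nc.variables[var][:]        # assumed shape: (time, y, x)
    for t in range(data.shape[0]):
        fig, ax = plt.subplots(figsize=(6, 5))
        im = ax.pcolormesh(data[t])
        fig.colorbar(im, ax=ax, label=var)
        ax.set_title(f"{var}, hour {t:02d}")
        fig.savefig(f"{out_prefix}_{t:03d}.png", dpi=100)
        plt.close(fig)
    # stitch the hourly frames into a movie (AVI in the operational chain)
    subprocess.run(["ffmpeg", "-y", "-framerate", "4",
                    "-i", f"{out_prefix}_%03d.png", f"{out_prefix}.avi"],
                   check=True)

# Example: render_variable("tsmp_day.nc", "water_table_depth", "wtd")
```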

3.1.5. YouTube Upload and Archiving

After all previous steps are finished, the generated movies are uploaded via the Google Application Programming Interface (API, v3) into the corresponding YouTube playlist. Data for the web app (see below) is copied to a webserver, and the whole experiment folder is zipped and moved into the archive. The raw data is kept available for at least 10 years.
Note that the five steps outlined above are contained in three scripts (data retrieval; simulation setup and submission, covering preprocessing, the forward simulation, and postprocessing/visualization; and YouTube upload and archiving), which are triggered by cronjobs on a different computer system over SSH (Secure Shell), because user cronjobs are not allowed on the JURECA HPC system. The data retrieval step and the upload/backup step are triggered by separate scripts/cronjobs on the JURECA front nodes; they are separated from the other steps because the JURECA compute nodes are not connected to the internet. The simulations are set up by a third script/cronjob, which also submits the jobscript handling preprocessing, model run, and visualization to JURECA's job scheduler (Slurm). These scripts contain several reliability mechanisms: for example, delayed scripts restart if download or checkpoint files are not present, which accommodates maintenance downtimes and automatically catches stalled runs, and jobscripts are submitted with 24 h lead time in order to optimize the position in the job queue. The overall availability and reliability of the system depends completely on the availability and reliability of the supercomputing resources of the JSC. Regular and unplanned system maintenance directly impacts the monitoring system and the resulting forecasts. At this point, no redundancy has been implemented.
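The sketch below illustrates the trigger logic on the external machine, with placeholder host names, paths, and jobscript; it shows the two key reliability mechanisms, i.e., postponing until required input files exist and submitting via SSH to the login node.

```python
# Sketch of a cron-triggered driver on an external machine: wait for the
# day's input files, then submit the jobscript on the HPC login node via
# SSH (user cronjobs are not allowed on the compute system itself).
# Host, paths, and jobscript name are placeholders.
import subprocess
import time
from datetime import date

HPC_LOGIN = "user@hpc-login.example"     # placeholder login node
JOBSCRIPT = "/work/monitoring/run_day.sh"

def remote(cmd: str) -> str:
    """Run a command on the login node and return its stdout."""
    out = subprocess.run(["ssh", HPC_LOGIN, cmd],
                         capture_output=True, text=True, check=True)
    return out.stdout

def wait_for_file(path: str, retry_s: int = 300, max_tries: int = 24) -> None:
    """Postpone until a required input exists (e.g., after a delayed upload)."""
    for _ in range(max_tries):
        if remote(f"test -e {path} && echo ok || true").strip() == "ok":
            return
        time.sleep(retry_s)
    raise TimeoutError(f"{path} not available after {max_tries} tries")

def submit_today() -> str:
    day = date.today().strftime("%Y%m%d")
    wait_for_file(f"/work/monitoring/forcing/{day}/complete.flag")
    return remote(f"sbatch {JOBSCRIPT} {day}")   # returns the Slurm job id line
```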

3.2. Precipitation Radar Integration for NRW

As mentioned above and in Section 2, the surface and subsurface models CLM and ParFlow are restarted using the results/restart files of the previous day. This introduces errors due to inaccuracies in the prediction of water fluxes in the groundwater-to-atmosphere system, especially with regard to precipitation. In order to improve the hydrologic states, precipitation radar information is applied in a correction run in an extension of the workflow for NRW. Figure 4 shows the dependencies between the different tasks of the correction run. In a first step, the atmospheric simulation results from the previous day's forecast are merged with the rain observations from the radar to generate an atmospheric forcing time series, which is used in an ensuing offline hydrologic simulation with CLM-ParFlow to arrive at the hydrologic initial condition for the next day's forecast (Figure 4).

3.2.1. Radar Data Retrieval and Preprocessing

The radar data is accessible on a server of the Meteorological Institute, Bonn University, and is downloaded shortly after midnight. The radar data are available as netCDF files and are generated every 5 min. The grid of the radar data has the same resolution as the forecast run (1 km); therefore, no interpolation is needed. Because the radar data cover a much larger domain, the data need to be cropped to the NRW grid size. Hourly offline forcing files (i.e., time series of radiation, air temperature, precipitation, wind speed, barometric pressure, and specific humidity at each individual pixel) are generated using the previous monitoring run and the observed precipitation from the radar.
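A condensed sketch of this merging step is given below, assuming placeholder variable names and a placeholder cropping window: twelve 5-min radar rain-rate fields are averaged to an hourly rate, cropped to the NRW grid, and written over the model precipitation in the hourly forcing file, keeping the model rates wherever radar data are missing.

```python
# Sketch of merging 5-min radar rain rates into the hourly offline forcing.
# Variable names, file patterns, and the cropping window are placeholders.
import glob
import numpy as np
import netCDF4

Y0, X0, NY, NX = 100, 120, 300, 300       # placeholder crop into the radar grid

def hourly_radar_rate(files_5min: list) -> np.ndarray:
    """Average twelve 5-min rain-rate fields to one hourly rate and crop."""
    fields = []
    for f in sorted(files_5min):
        with netCDF4.Dataset(f) as nc:
            fields.append(nc.variables["rain_rate"][0])   # placeholder name
    hourly = np.mean(fields, axis=0)
    return hourly[Y0:Y0 + NY, X0:X0 + NX]

def patch_forcing(forcing_nc: str, radar_dir: str, hours: int = 24) -> None:
    """Overwrite model precipitation with radar observations, hour by hour."""
    with netCDF4.Dataset(forcing_nc, "a") as nc:
        precip = nc.variables["PRECIP"]                   # placeholder name
        for h in range(hours):
            files = glob.glob(f"{radar_dir}/radar_{h:02d}*.nc")
            if files:             # keep model rates where radar is missing
                precip[h] = hourly_radar_rate(files)
```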

3.2.2. TSMP Correction Run and Upload/Archiving

TSMP is run in CLM-ParFlow offline mode, i.e., without COSMO, where the newly created forcing files are used as the offline atmospheric forcing in CLM. The remainder of the configuration is the same as in the forecasting run. The results are also zipped and moved into the archive.

3.3. Publication of Monitoring Products

The results are disseminated on the video platform YouTube (https://www.youtube.com/channel/UCGio3ckQwasR5a_kJo1GdOw). In contrast to a locally maintained web presence, YouTube is already well established and has a strong impact, especially beyond the scientific community. It is free and easy to use and has synergies with other social media, making it easy to share, link, or embed content. Additionally, YouTube is fully controllable via the Google Python API, which is important for the integration into the workflow and automation process presented above. YouTube provides limited but, for our purposes, sufficient structuring features, which were used to arrange the daily results. The two modeled domains are separated by playlist sections, where every analysis is represented as a playlist and new videos are added automatically. Additionally, we developed a web app particularly designed for mobile devices that allows displaying meteograms of all analysis products for the current (or a manually selected) location (www.terrsysmp.org/tsmp_app). The raw data is also available via SFTP (SSH File Transfer Protocol) for the past three months. Note that all data are published in an ad hoc fashion, i.e., at this point no quality control has been implemented.
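A minimal sketch of the upload step with the Google API Python client (YouTube Data API v3) is shown below; obtaining the OAuth credentials is omitted, and the playlist ID and metadata are placeholders.

```python
# Sketch of the automated upload step with the Google API Python client
# (YouTube Data API v3). Credentials handling is omitted; the playlist id
# and the video metadata are placeholders.
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

def upload_forecast(credentials, movie_path: str, title: str,
                    playlist_id: str) -> str:
    youtube = build("youtube", "v3", credentials=credentials)
    # upload the rendered movie as a public video
    response = youtube.videos().insert(
        part="snippet,status",
        body={"snippet": {"title": title,
                          "description": "Experimental TSMP forecast product"},
              "status": {"privacyStatus": "public"}},
        media_body=MediaFileUpload(movie_path),
    ).execute()
    video_id = response["id"]
    # add the new video to the analysis-specific playlist
    youtube.playlistItems().insert(
        part="snippet",
        body={"snippet": {"playlistId": playlist_id,
                          "resourceId": {"kind": "youtube#video",
                                         "videoId": video_id}}},
    ).execute()
    return video_id
```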

4. Results and Discussion

Running a fully integrated, groundwater-to-atmosphere terrestrial systems model in forecast mode opens up new opportunities. In addition to weather variables, the special emphasis on analyses of hydrologic variables enables applications that have been unavailable so far. For example, forecasted analyses of plant available water are useful for agricultural applications, surface runoff for flood forecasting and hydropower, and groundwater recharge, water table depth, and water column storage for water resources management and drought prediction. In the following sections, example results of the different monitoring experiments are shown.

4.1. Monitoring EUR-11 and NRW Domains

In comparison to widely available meteorological forecasts, the unique feature of the monitoring system is the experimental analyses and monitoring products of all state variables and fluxes from the groundwater across the land surface into the atmosphere. A large number of variables can be accessed at https://www.youtube.com/channel/UCGio3ckQwasR5a_kJo1GdOw. Below, some examples are shown for more classical atmospheric products (Figure 5 and Figure 6) and a hydrologic variable (Figure 7) over Europe during the heat wave of 2018, as well as for hydrologic variables over NRW (Figure 8 and Figure 9) during the flooding in January 2018, which are not commonly available. For example, Figure 9 shows groundwater recharge, which is a variable of great interest to public water managers and agriculture.
Figure 5 shows the 2 m air temperature forecast over Europe on 7 August 2018, the peak of the heat wave in Germany in 2018. Temperatures above 35 °C were recorded in Germany during that time period. In addition, meteograms are provided for the location of the city of Jülich, Germany, for 2 m air temperature, sea level barometric pressure, 2 m specific humidity, 10 m wind speed, and precipitation. Figure 6 shows the precipitation forecast over Europe, including precipitation meteograms for a number of European capitals. Both figures constitute standard NWP-type products. Figure 7 shows the forecasted change in plant available soil water as a new product, i.e., the change in the amount of water that is available to plants via root water uptake at a matric potential > −10 m. The figure illustrates the overall recession of plant available water on the order of −10 mm over a period of 72 h, with localized recharge due to local precipitation events toward the end of the forecast period.
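As an illustration of how such a product can be derived from the simulated pressure fields, the following sketch computes column plant available water under one plausible reading of this definition (water stored at matric potentials above −10 m), using a van Genuchten retention curve with placeholder parameters rather than the actual TSMP soil parameterization.

```python
# Illustrative computation of plant available soil water under one plausible
# reading of the definition above: water stored at matric potentials above
# -10 m, using a van Genuchten retention curve with placeholder parameters.
import numpy as np

ALPHA, N, THETA_R, THETA_S = 2.0, 2.0, 0.05, 0.45   # placeholder soil parameters
PSI_LIMIT = -10.0                                   # matric potential threshold (m)

def theta(psi_m: np.ndarray) -> np.ndarray:
    """van Genuchten water content for matric potential psi (m, negative)."""
    m = 1.0 - 1.0 / N
    se = (1.0 + (ALPHA * np.abs(np.minimum(psi_m, 0.0))) ** N) ** (-m)
    return THETA_R + (THETA_S - THETA_R) * se

def plant_available_water_mm(psi_column_m: np.ndarray, dz_m: np.ndarray) -> float:
    """Water held above the content at psi = -10 m, summed over the column."""
    theta_limit = theta(np.full_like(psi_column_m, PSI_LIMIT))
    excess = np.maximum(theta(psi_column_m) - theta_limit, 0.0)
    return float(np.sum(excess * dz_m) * 1000.0)    # m -> mm

# Example: a 10-layer column, wetter near the surface
psi = np.linspace(-0.5, -5.0, 10)
dz = np.full(10, 0.3)
print(round(plant_available_water_mm(psi, dz), 1))
```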
Figure 8 shows the forecasted change in groundwater table depth over the prediction period of 24 h (8 January 2018) over the NRW domain. The forecast suggests a widespread recession of the groundwater (positive values), with increasing groundwater table depth, and localized decreases in water table depth in recharge areas. This is in contrast to the flooding of the Rhine River during the same period, which was mainly caused by heavy precipitation and snow melt in southern Germany and the Alps. The predicted groundwater recharge shown in Figure 9 is spatially quite heterogeneous, contradicting the common assumption of spatially uniform recharge areas along topographic ridges and heights. Especially in the Eifel region in the south of the monitoring domain, recharge areas alternate with discharge areas over small distances, which is difficult to simulate with simplified modeling approaches that separate the groundwater compartment from the soil zone and land surface.

4.2. Radar Correction for NRW

Figure 10, Figure 11 and Figure 12 show the impact of the rain radar correction on the hydrology, using the precipitation radar information as forcing in January 2018. Figure 10 shows the cumulative precipitation fields from the monitoring and correction runs (small panels on the right) and the difference between both runs. One can clearly distinguish regions of over- and underestimation of precipitation, which result in correlated differences in air temperature (Figure 11) and, more pronounced, in surface runoff (Figure 12). While precipitation products from radars are error-prone, we expect an improvement in the simulated states, especially with regard to their spatial distribution.

5. Summary and Conclusions

We presented a real-time terrestrial monitoring system (TMS) that closes the water and energy cycles from the groundwater into the atmosphere. The system provides forecasts of all states and fluxes of the water and energy cycle and a physically consistent digital image of the terrestrial system. The TMS has been implemented over a European continental domain (EUR-11) and a regional domain covering parts of North Rhine-Westphalia and bordering countries (NRW). In the case of NRW, a correction of the hydrologic states is achieved by ingesting precipitation radar information into the monitoring system in an offline correction run, in order to improve the initialization of the land component of the ensuing monitoring run. The terrestrial monitoring system is experimental and still requires comprehensive validation, which is the subject of ongoing research. The system has gained some attention in the community and beyond; for example, the videos have been watched more than 26,000 times since the commencement of the system in January 2016. There is great potential in extending the monitoring to, e.g., the seasonal time scale, adding data assimilation for improved model initialization, and developing monitoring products for the industry and public sectors. The complete runtime system and output are provided freely via SFTP (for access, contact the corresponding author).

Author Contributions

Conceptualization, S.K., C.S. and K.G.; methodology, S.K., C.S. and K.G.; software, V.K. and F.G.; formal analysis, S.K., F.G. and K.G.; investigation, S.K., F.G. and K.G.; data curation, F.G.; writing-original draft preparation, F.G. and S.K.; writing-review and editing, S.K., F.G., S.B., K.G., H.-J.H.-F., J.K., W.K., V.K., F.P., S.P., S.T., P.S., C.S. and M.S.; visualization, F.G., K.G. and S.B.; supervision, S.K.; project administration, S.K.; funding acquisition, S.K. and C.S.

Funding

This research work was supported by the SFB/TR32 "Patterns in Soil-Vegetation-Atmosphere Systems: Monitoring, Modeling and Data Assimilation" project funded by the Deutsche Forschungsgemeinschaft (DFG), and by TERENO.

Acknowledgments

The authors gratefully acknowledge the computing time granted through the Board of Directors on the supercomputers JURECA and JUQUEEN at Forschungszentrum Jülich. We gratefully acknowledge the provision of the atmospheric boundary data by the European Centre for Medium-Range Weather Forecasts and the German Weather Service.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Swinbank, R.; Kyouda, M.; Buchanan, P.; Froude, L.; Hamill, T.M.; Hewson, T.D.; Keller, J.H.; Matsueda, M.; Methven, J.; Pappenberger, F.; et al. The TIGGE Project and Its Achievements. Bull. Am. Meteor. Soc. 2015, 97, 49–67.
2. Damrath, U.; Doms, G.; Frühwald, D.; Heise, E.; Richter, B.; Steppeler, J. Operational quantitative precipitation forecasting at the German Weather Service. J. Hydrol. 2000, 239, 260–285.
3. Palmer, T. The ECMWF ensemble prediction system: Looking back (more than) 25 years and projecting forward 25 years. Q. J. R. Meteorol. Soc. 2018.
4. Lynch, P. The origins of computer weather prediction and climate modeling. J. Comp. Phys. 2008, 227, 3431–3444.
5. Bauer, P.; Thorpe, A.; Brunet, G. The quiet revolution of numerical weather prediction. Nature 2015, 525, 47–55.
6. Saha, S.; Moorthi, S.; Wu, X.; Wang, J.; Nadiga, S.; Tripp, P.; Behringer, D.; Hou, Y.-T.; Chuang, H.; Iredell, M.; et al. The NCEP Climate Forecast System Version 2. J. Clim. 2013, 27, 2185–2208.
7. Roebber, P.J.; Schultz, D.M.; Colle, B.A.; Stensrud, D.J. Toward improved prediction: High-resolution and ensemble modeling systems in operations. Weather Forecast. 2004, 19, 936–949.
8. De Roo, A.P.J.; Gouweleeuw, B.; Thielen, J.; Bartholmes, J.; Bongioannini-Cerlini, P.; Todini, E.; Bates, P.D.; Horritt, M.; Hunter, N.; Beven, K.; et al. Development of a European flood forecasting system. Int. J. River Basin Manag. 2003, 1, 49–59.
9. Thielen, J.; Bartholmes, J.; Ramos, M.-H.; de Roo, A. The European flood alert system—part 1: Concept and development. Hydrol. Earth Syst. Sci. 2009, 13, 125–140.
10. Refsgaard, J.C. Validation and intercomparison of different updating procedures for real-time forecasting. Hydrol. Res. 1997, 28, 65–84.
11. Emerton, R.; Zsoter, E.; Arnal, L.; Cloke, H.L.; Muraro, D.; Prudhomme, C.; Stephens, E.M.; Salamon, P.; Pappenberger, F. Developing a global operational seasonal hydro-meteorological forecasting system: GloFAS-Seasonal v1.0. Geosci. Model Dev. 2018, 11, 3327–3346.
12. Shukla, S.; McNally, A.; Husak, G.; Funk, C. A seasonal agricultural drought forecast system for food-insecure regions of East Africa. Hydrol. Earth Syst. Sci. 2014, 18, 3907–3921.
13. De Lange, W.J.; Prinsen, G.F.; Hoogewoud, J.C.; Veldhuizen, A.A.; Verkaik, J.; Oude Essink, G.H.P.; van Walsum, P.E.V.; Delsman, J.R.; Hunink, J.C.; Massop, H.T.L.; et al. An operational, multi-scale, multi-model system for consensus-based, integrated water management and policy analysis: The Netherlands hydrological instrument. Environ. Model. Softw. 2014, 59, 98–108.
14. Shrestha, P.; Sulis, M.; Masbou, M.; Kollet, S.; Simmer, C. A scale-consistent terrestrial systems modeling platform based on COSMO, CLM, and ParFlow. Mon. Weather Rev. 2014, 142, 3466–3483.
15. Gasper, F.; Goergen, K.; Shrestha, P.; Sulis, M.; Rihani, J.; Geimer, M.; Kollet, S. Implementation and scaling of the fully coupled terrestrial systems modeling platform (TerrSysMP v1.0) in a massively parallel supercomputing environment—A case study on JUQUEEN (IBM Blue Gene/Q). Geosci. Model Dev. 2014, 7, 2531–2543.
16. Simmer, C.; Thiele-Eich, I.; Masbou, M.; Amelung, W.; Crewell, S.; Diekkrüger, B.; Ewert, F.; Hendricks-Franssen, H.-J.; Huisman, J.A.; Kemna, A.; et al. Monitoring and modeling the terrestrial system from pores to catchments—The transregional collaborative research center on patterns in the soil-vegetation-atmosphere system. Bull. Am. Meteorol. Soc. 2014, 96, 1765–1787.
17. Rahman, M.; Sulis, M.; Kollet, S.J. The concept of dual-boundary forcing in land surface-subsurface interactions of the terrestrial hydrologic and energy cycles. Water Resour. Res. 2014, 50, 8531–8548.
18. Rahman, M.; Sulis, M.; Kollet, S.J. The subsurface-land surface-atmosphere connection under convective conditions. Adv. Water Resour. 2015, 83, 240–249.
19. Keune, J.; Gasper, F.; Goergen, K.; Hense, A.; Shrestha, P.; Sulis, M.; Kollet, S. Studying the influence of groundwater representations on land surface-atmosphere feedbacks during the European heat wave in 2003. J. Geophys. Res. Atmos. 2016, 121, 13301–13325.
20. Sulis, M.; Williams, J.L.; Shrestha, P.; Diederich, M.; Simmer, C.; Kollet, S.J.; Maxwell, R.M. Coupling groundwater, vegetation, and atmospheric processes: A comparison of two integrated models. J. Hydrometeor. 2017, 18, 1489–1511.
21. Sulis, M.; Keune, J.; Shrestha, P.; Simmer, C.; Kollet, S.J. Quantifying the impact of subsurface-land surface physical processes on the predictive skill of subseasonal mesoscale atmospheric simulations. J. Geophys. Res. Atmos. 2018, 123, 9131–9151.
22. Keune, J.; Sulis, M.; Kollet, S.; Siebert, S.; Wada, Y. Human water use impacts on the strength of the continental sink for atmospheric water. Geophys. Res. Lett. 2018, 45, 4068–4076.
23. Keune, J.; Sulis, M.; Kollet, S. Potential added value of incorporating human water use in the simulation of evapotranspiration and precipitation in a continental-scale bedrock-to-atmosphere modeling system. J. Adv. Model. Earth Syst. 2018, under review.
24. Baldauf, M.; Seifert, A.; Förstner, J.; Majewski, D.; Raschendorfer, M.; Reinhardt, T. Operational convective-scale numerical weather prediction with the COSMO model: Description and sensitivities. Mon. Weather Rev. 2011, 139, 3887–3905.
25. Oleson, K.W.; Niu, G.-Y.; Yang, Z.-L.; Lawrence, D.M.; Thornton, P.E.; Lawrence, P.J.; Stöckli, R.; Dickinson, R.E.; Bonan, G.B.; Levis, S.; et al. Improvements to the community land model and their impact on the hydrological cycle. J. Geophys. Res. Biogeosci. 2008, 113.
26. Jones, J.E.; Woodward, C.S. Newton-Krylov-multigrid solvers for large-scale, highly heterogeneous, variably saturated flow problems. Adv. Water Resour. 2001, 24, 763–774.
27. Kollet, S.J.; Maxwell, R.M. Integrated surface-groundwater flow modeling: A free-surface overland flow boundary condition in a parallel groundwater flow model. Adv. Water Resour. 2006, 29, 945–958.
28. Maxwell, R.M. A terrain-following grid transform and preconditioner for parallel, large-scale, integrated hydrologic modeling. Adv. Water Resour. 2013, 53, 109–117.
29. Valcke, S. The OASIS3 coupler: A European climate modelling community software. Geosci. Model Dev. 2013, 6, 373–388.
30. Rodwell, M.J.; Magnusson, L.; Bauer, P.; Bechtold, P.; Bonavita, M.; Cardinali, C.; Diamantakis, M.; Earnshaw, P.; Garcia-Mendez, A.; Isaksen, L.; et al. Characteristics of occasional poor medium-range weather forecasts for Europe. Bull. Am. Meteor. Soc. 2013, 94, 1393–1405.
31. Kurtz, W.; He, G.; Kollet, S.J.; Maxwell, R.M.; Vereecken, H.; Hendricks-Franssen, H.-J. TerrSysMP–PDAF (version 1.0): A modular high-performance data assimilation framework for an integrated land surface-subsurface model. Geosci. Model Dev. 2016, 9, 1341–1360.
32. Zhang, H.; Kurtz, W.; Kollet, S.; Vereecken, H.; Hendricks-Franssen, H.-J. Comparison of different assimilation methodologies of groundwater levels to improve predictions of root zone soil moisture with an integrated terrestrial systems model. Adv. Water Resour. 2018, 111, 224–238.
33. Diederich, M.; Ryzhkov, A.; Simmer, C.; Zhang, P.; Trömel, S. Use of specific attenuation for rainfall measurement at X-band radar wavelengths—Part I: Radar calibration and partial beam blockage estimation. J. Hydrometeor. 2015, 16, 487–502.
34. Diederich, M.; Ryzhkov, A.; Simmer, C.; Zhang, P.; Trömel, S. Use of specific attenuation for rainfall measurement at X-band radar wavelengths—Part II: Rainfall estimates and comparison with rain gauges. J. Hydrometeor. 2015, 16, 503–516.
35. Schättler, U.; Blahak, U. A Description of the Nonhydrostatic Regional COSMO-Model. Part V: Preprocessing: Initial and Boundary Data for the COSMO-Model. 2018. Available online: http://www.cosmo-model.org/content/model/documentation/core/int2lm_2.03.pdf (accessed on 23 October 2018).
Figure 1. The monitoring domains, Europe (EU, left) and North Rhine-Westphalia (NRW, right). Black circles indicate capitals and major cities of the simulation domains. The domain highlighted in red in the figure on the right is the NRW domain, which is highly equipped with hydrological and hydrometeorological measurement equipment. The resolution differences in (b) illustrate the different grid cell sizes of the model setups, i.e., 12.5 km for the EU domain and 500 m for the NRW domain.
Figure 2. Schematic of the monitoring clock for the NRW and EU domains. Note, the rain radar correction is only performed in case of the NRW domain (ICs: Initial conditions; BCs: Boundary conditions).
Figure 3. Flowchart for both the EU and NRW monitoring system implemented in the forecast jobscript running on the compute nodes of JURECA at JSC. $WORK indicates the execution of the respective workflow step in the working or scratch directory of the HPC environment. Note that both monitoring runs are based on the same workflow.
Figure 4. Schematic of the different dependencies between the tasks in the workflow of the correction run.
Figure 5. Monitoring output of TSMP (CORDEX EUR-11 domain) for the terrestrial 2 m air temperature (°C) at the peak of the heat wave on 7 August 2018. On the left from top to bottom, the meteograms depict 2 m air temperature (°C), sea level pressure (hPa), 10 m wind speed (m·s−1), and precipitation (mm·h−1) in Jülich (latitude: 50.9, longitude: 6.4), Germany.
Figure 6. Monitoring output of TSMP (CORDEX EUR-11 domain) for precipitation (l·m−2 or mm) in logarithmic scale on 7 August 2018. On the left from top to bottom, the meteograms depict precipitation rates (snow and rain) (l·m−2 or mm) for the European capitals Berlin, Moscow, Paris, Madrid and Rome.
Figure 7. Monitoring output of TSMP (CORDEX EUR-11 domain) of the change of plant available water (mm) over the course of the prediction period of 72 h (between 7 August and 10 August 2018).
Figure 8. Monitoring output of TSMP (NRW) for the water table depth change (mm) in logarithmic scale. The change is estimated relative to the initial condition at the beginning of the monitoring run. Negative values indicate a rise of the water table.
Figure 9. Monitoring output of TSMP (NRW) for groundwater recharge (positive) and capillary rise (negative) (mm), i.e., the plant available water storage change (mm (Sym. Log.)). The change is relative to the initial condition. The plant available water is the water that is stored in the pore space at a matric potential larger than −10 m.
Figure 10. Difference plot for precipitation (mm) between simulated and radar precipitation (left). The panels on the right show the simulated precipitation (top) and radar-estimated precipitation (bottom).
Figure 11. Difference plot for 2 m air temperature (°C) between uncorrected and precipitation radar corrected simulations. The panels on the right show the temperatures from the uncorrected (top) and precipitation radar corrected simulations (bottom).
Figure 12. Difference plot for surface runoff (m3·s−1) between uncorrected and precipitation radar corrected simulations. The panels on the right show the runoff from the uncorrected (top) and precipitation radar corrected simulations (bottom).
Table 1. Selected essential states and fluxes simulated by the terrestrial monitoring system.

Variable Name (Units)
Subsurface hydraulic pressure (m)
Subsurface relative saturation (-)
Subsurface Darcy flow (m·h−1)
Overland flow, surface runoff (m·s−1)
Evapotranspiration (mm·s−1 or W·m−2)
Sensible heat flux (W·m−2)
Ground heat flux (W·m−2)
Long/short wave radiation (incoming and outgoing) (W·m−2)
Precipitation (liquid and frozen) (mm·s−1)
Snow water equivalent (m)
Barometric pressure (Pa)
Air temperature (K)
Air humidity (kg·kg−1)
Air wind speeds (m·s−1)
Cloud cover (-)
