Principles and Challenges for Multi-Stakeholder Development of Focused, Tiered, and Triggered, Adaptive Monitoring Programs for Aquatic Environments
Abstract
1. Introduction
2. Essential First Steps
- Values-based questions that deal with perception issues, such as “has the taste or risk from eating fish or shellfish changed?”
- Stressor-based questions that deal primarily with exposure issues and environmental impact assessment (EIA) predictions, such as “how far did the contaminants travel?”
- Effects-based questions that deal with the accumulated environmental state, such as “are there residual environmental concerns?” or “are there regional cumulative effects?”
2.1. Defining the Question
- Is there a difference (a statistical difference)?
- Is there a change (is the difference large enough that it surpasses a trigger that reflects natural variability)?
- Is it real (requires confirmation of change)?
- Was it expected (requires an understanding of predicted and expected risks)?
- Is it stable or getting worse (requires temporal data)?
- How big an area is changing (requires understanding the extent and magnitude of change)?
- Is it meaningful (requires understanding how important a change is relative to other sites and other indicators)?
- Where might it be coming from (requires identifying the cause)?
- How serious is it and do I need to fix it or stop it (requires an understanding of ecological relevance)?
2.2. Defining Answers
- Is there a difference: involves comparing the data from a site of interest to relevant local reference site(s);
- Is there a change: is a question asked over time at a single site, in comparison with available historical or reference data;
- How big an area: involves a spatial data set;
- Is it getting better or worse: involves a spatial data set over time;
- How serious is it: requires ecologically relevant endpoints across a range of reference sites.
3. Designing an Adaptive System
- Confirmation studies: a repeat of surveillance monitoring at an accelerated pace to evaluate the replicability of a change.
- Investigation of cause (IOC): generally hypothesis-driven studies designed to characterize the potential cause or source of an issue once there is evidence that the change is real and important. IOC is a type of focused study.
3.1. Development of Triggers
3.2. Defining How Big a Difference Is Going to Be Interpreted as a Response or Signal
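One common way to operationalize the question of how big a difference counts as a signal is a normal-range trigger: flag any value that falls outside roughly ±2 standard deviations of the reference condition. The sketch below is a minimal illustration of that idea only, not the authors' exact protocol; the endpoint name and data are hypothetical.

```python
import statistics

def normal_range(reference_values, k=2.0):
    """Trigger band from reference data: mean ± k standard deviations.

    k = 2 approximates the central ~95% of a normally distributed
    reference condition; a value outside the band flags a "change"
    worth confirming, not yet an impact.
    """
    mean = statistics.mean(reference_values)
    sd = statistics.stdev(reference_values)
    return (mean - k * sd, mean + k * sd)

def exceeds_trigger(value, reference_values, k=2.0):
    """True when a new observation falls outside the normal range."""
    low, high = normal_range(reference_values, k)
    return value < low or value > high

# Hypothetical reference data (e.g., fish condition factor at reference sites)
reference = [1.02, 0.98, 1.05, 0.95, 1.01, 0.99, 1.03, 0.97]
print(normal_range(reference))
print(exceeds_trigger(1.25, reference))  # well outside the band
```

A value inside the band triggers nothing; a value outside it escalates the program to the confirmation tier rather than directly to management action.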
4. How Do I Pick the Right Indicators?
- Early warning indicators, which tell you whether predicted or anticipated changes are happening; they are usually at lower levels of organization, or direct measures of the stressors of interest;
- Performance (effects) indicators, which are integrators that tell you whether the accumulation of stress is affecting endpoints that threaten sustainability;
- Biodiversity-type indicators, which tell you whether changes at lower levels are important enough that damage has been done.
- How confident am I that nothing important is happening if I do not see a change in my measurement endpoints (am I monitoring at the right level?);
- If I do detect an effect, where is the next obvious place I would look to evaluate how important it is (where else do I need baseline data?);
- What is the variability in the measurement endpoints (within and between cycles) and how much power does that give my study, given the sample sizes and frequency of monitoring that I can afford (how much statistical power do I have?);
- How big a change in these endpoints should prompt me to seek more information (what is the critical effect size?); and
- How confident am I that a change in these endpoints indicates a specific operation is having a potential impact (can I link changes in these endpoints back to specific operations?).
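The statistical-power question in the list above can be answered approximately before any fieldwork is committed. Below is a minimal sketch using a normal approximation for a two-sample comparison (site vs. reference); the effect sizes and sample size are hypothetical, chosen only to show how quickly power collapses for subtle changes.

```python
import math

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def two_sample_power(effect_size_sd, n_per_group, z_alpha=1.96):
    """Approximate power of a two-sample comparison (normal
    approximation, two-sided alpha = 0.05 by default).

    effect_size_sd: site-vs-reference difference expressed in units
    of the endpoint's standard deviation.
    """
    noncentrality = effect_size_sd * math.sqrt(n_per_group / 2.0)
    return norm_cdf(noncentrality - z_alpha)

# With 20 samples per site, a 2 SD shift is nearly certain to be
# detected, while a 0.5 SD shift is missed most of the time.
print(two_sample_power(2.0, 20))
print(two_sample_power(0.5, 20))
```

The same calculation, run across the within- and between-cycle variability actually observed for each candidate endpoint, shows which indicators an affordable design can realistically interrogate.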
- Answer a clear relevant question;
- Have a published, peer-reviewed analytical protocol;
- Are commercially available;
- Have an adequate baseline or reference for comparison (what would constitute a change?);
- Have defined targets and an interpretation process (what happens if a threshold is exceeded; i.e., when does a change occur?).
5. Final Considerations
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
| Category | Question | Options |
|---|---|---|
| Program structure and administration | 1. The program should be developed by | Industry, government, multi-stakeholder |
| | 2. Industry should be allowed input or review | Yes, no |
| | 3. Studies should be paid for by | Industry, government, other |
| | 4. Requirements should be | Written into a permit, written into a regulation, voluntary |
| | 5. Goal is to | Identify impaired areas, inform adaptive management, prevent problems, restore conditions |
| | 6. Management decisions required | Within 10 years, within 5 years, soon |
| | 7. Monitoring program focus | Site-specific, regional, national |
| | 8. Participants measure the same things | Yes, no |
| | 9. Effort varies depending on the level of concern | Yes, no |
| | 10. Is there a core program with site-specific requirements | Yes, no |
| | 11. Will results be used to change intensity or requirements | Yes, no |
| | 12. Study designs approved by | Government, peer review, industry |
| | 13. Results received by | Program office, regulator, industry |
| | 14. If the study is inadequate | Program resamples, proceed to next cycle |
| | 15. If a change is detected | Regulatory action, management action, change intensity of monitoring |
| | 16. Decision will be based on | Ecological integrity and biodiversity, impact on water use, fish use or the fishery, increasing change |
| Program objectives | 17. Focus of the program is | Human perspective and use, ecological perspective, both |
| | 18. Main purpose | Monitoring, surveillance, assessment, prediction |
| | 19. Main focus | Unknown stressors, cumulative stressors, specific development |
| | 20. Design to look for | Stressors, effects, protecting values |
| | 21. Is research allowed | Yes, no |
| | 22. Program looks for | Change, effects, impacts |
| Study design | 23. Pre-development data | Yes, no |
| | 24. If sites are confounded or dangerous to sample | Alternative methods, monitoring is not necessary at all sites |
| | 25. When change is detected | Trigger studies, identify cause, fix change |
| | 26. Consequence of seeing a change | Collect more information, define cause, management decision |
| | 27. Decisions will be based on | Individual results, a pattern of responses, weight of evidence |
| | 28. Reference data | Historical data, temporal within site, comparable reference sites, gradient |
| | 29. Regional reference data | Within program, external |
| | 30. Natural variability | Not relevant, detected, understood |
| | 31. Confounding factors should be | Avoided, detected, identified |
| | 32. Focus | Species used by people, most exposed, all are important |
| | 33. Level of confidence desired (alpha) | 1 in 10 (0.10), 1 in 20 (0.05), 1 in 100 (0.01) |
| | 34. Level of power desired | 80%, 90%, 95% |
| | 35. How big a change do you want to detect | Statistical difference, predefined effect size, two standard deviations |
| Data analysis and reporting | 36. Site data will be analyzed by | Industry, government, an independent agency, the public |
| | 37. Broader program data will be analyzed by | Industry, government, an independent agency, the public |
| | 38. Data analyses need to be available within | 1 year, 2 years, 5 years |
| | 39. Study results will be | Public, only summaries are public, limited circulation, confidential |
| | 40. Summaries will be available via | Internet, public presentations, public reports, scientific literature |
| | 41. Data summaries will include | Public summaries, evaluations against triggers, comparisons against predictions |
| Interpretation | 42. Will any change be interpreted as significant | Yes, no |
| | 43. It is important to decide what an effect | Is, is not |
| | 44. The importance of a change will be decided by | Pre-defined triggers, best professional judgement, scientific or peer review |
| | 45. Does unacceptable change have to be decided in advance | Yes, no |
| | 46. The acceptability of a change is decided by | Government, stakeholders, negotiation |
| | 47. An unacceptable change should be | Fixed, monitored more closely, it depends |
| | 48. An ecologically relevant change is | Any change, a change that is getting worse, a change that threatens biodiversity |
| | 49. An unsustainable change is | Any change, a change that threatens growth, reproduction, survival, or use |
| | 50. The monitoring system should be adapted | When needed, every 3, 5, or 10 years |
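Rows 33–35 of the checklist jointly determine how many samples a design needs: inverting the standard normal approximation for a two-sample comparison gives n per group ≈ 2((z_alpha + z_power)/d)², where d is the critical effect size in standard-deviation units. A minimal sketch using the option values from the table (illustrative only; a real design would use exact noncentral-t methods):

```python
import math

# z-scores for the choices offered in rows 33-34 of the checklist
Z_ALPHA = {0.10: 1.645, 0.05: 1.960, 0.01: 2.576}   # two-sided alpha
Z_POWER = {0.80: 0.842, 0.90: 1.282, 0.95: 1.645}   # desired power

def n_per_group(effect_size_sd, alpha=0.05, power=0.90):
    """Approximate samples per group for a two-sample comparison:
    n = 2 * ((z_alpha + z_power) / d)^2, rounded up."""
    z = Z_ALPHA[alpha] + Z_POWER[power]
    return math.ceil(2.0 * (z / effect_size_sd) ** 2)

# A "two standard deviations" critical effect size (row 35) is cheap
# to detect; a subtler 0.5 SD shift is not.
print(n_per_group(2.0))   # -> 6 per group
print(n_per_group(0.5))   # -> 85 per group
```

The steep cost of detecting small shifts is exactly why the critical effect size must be negotiated before sampling begins, not after budgets are fixed.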
| Element | Statement |
|---|---|
| Vision | Any effects of an event are detected, tracked, and mitigated, and recovery is tracked and documented. |
| Objective | To detect change, predict effects, and adaptively manage for changing environmental conditions with a science-based, integrated, and transparent monitoring program. |
| Aspiration | Monitoring will identify the magnitude and extent of changes associated with an event or development (and how they are changing over time), and the effects on receptors that can be associated with the event. |
| Purpose | To understand the impacts of an event and to develop a better understanding of the variability of responses in the system. |
| Scope | To detect site-specific, local, and regional change in areas impacted by the event. |
1. Ask a question that is too complicated or unanswerable
2. Not know what answer you are looking for
3. Decide which endpoints are important after you start
4. Let non-scientific factors decide the approach and design
5. Figure out how big a change represents a concern after you start
6. Decide how many samples you need after you finish collecting
7. Assume one study design can answer any possible question
8. Assume your data are good when you get them
9. Fail to adapt your design or approach as you learn more
10. Assume that you understand what is going on
1. Monitoring exposures (stressor-based) |
a. Source (routine monitoring by industry) |
b. Ambient (baseline and enhanced stations; water includes quality and quantity stations, tributary, mainstem and lake/ocean) |
c. Deposition |
d. Receptor exposures |
2. Monitoring ecosystem effects (effects-based) |
a. Component (community and/or sentinel species components; for water need mainstem, tributary and lakes/ocean) |
3. Stakeholder-driven monitoring issues (values-based; triggered by regional stakeholder concern) |
4. Cross-components |
a. Background/baseline |
b. Baseline data for new sites |
c. Supporting data (meteorological, hydrological) |
5. Focused monitoring (triggered by a specific concern)
a. Extent and magnitude of effects or responses |
i. Spatial distribution |
ii. Temporal distribution |
iii. Change from historical |
b. Examination in alternate species, approaches or levels of organization |
6. Research (prioritized data gaps) |
a. Investigation of cause (when effects are seen) |
i. Characterizing exposures/sources
ii. Ecological consequences of impacts |
iii. Management implications |
b. Modelling |
i. Development—improving data, equations or testing assumptions |
ii. Validation—data integration, confirmation or calibration |
iii. Estimating predicted or historical trends |
7. Methods development (fingerprinting, new measurement validation, or remote sensing)
| Tier | Example Trigger | Question | Frequency |
|---|---|---|---|
| Basic | | Are there changes? | Regular |
| Confirmation | Difference beyond a critical effect size threshold (natural variability) | Can we confirm them? | More often |
| Extent | Confirmation of changes (reference site adequacy) | What is the extent and magnitude of the change? | More stations and indicators |
| Cause | Change across a sufficient area or of a sufficient magnitude, or getting worse (temporal consistency) | What is the cause? | Research-oriented |
| Concern | Change exceeds “ecological relevance” | What is the solution, and do I have to mitigate or compensate? | Hopefully never |
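Read as pseudocode, the tier structure in the table above is an escalation ladder: exceeding a tier's trigger moves monitoring up one tier, while failing to replicate a change drops the program back to basic surveillance. A hypothetical sketch of that logic (tier names follow the table; how each trigger is evaluated is outside this snippet's scope):

```python
# Tier ladder from the table above, lowest to highest intensity.
TIERS = ["basic", "confirmation", "extent", "cause", "concern"]

def next_tier(current, trigger_exceeded):
    """Escalate one tier when the current tier's trigger is exceeded;
    otherwise de-escalate back to basic surveillance monitoring."""
    if not trigger_exceeded:
        return "basic"
    i = TIERS.index(current)
    return TIERS[min(i + 1, len(TIERS) - 1)]

print(next_tier("basic", True))          # escalates to confirmation
print(next_tier("confirmation", False))  # change not replicated: back to basic
```

The key design property is that effort and cost scale with evidence: most cycles stay at the basic tier, and the expensive research-oriented tiers run only when earlier tiers have confirmed that a change is real, extensive, and worsening.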
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Munkittrick, K.R.; Arciszewski, T.J.; Gray, M.A. Principles and Challenges for Multi-Stakeholder Development of Focused, Tiered, and Triggered, Adaptive Monitoring Programs for Aquatic Environments. Diversity 2019, 11, 155. https://doi.org/10.3390/d11090155