Evaluating a Planning Support System’s Use and Effects in Urban Adaptation: An Exploratory Case Study from Berlin, Germany
Abstract
1. Introduction
- Making evaluations specific enough to be meaningful to a particular case, yet generic enough to offer useful and usable insights for broader research and practice.
- Making evaluations flexible enough to capture locally relevant factors and unintended and unexpected effects yet structured enough to be recognizable and comparable to other applications.
- Capturing both the details of a tool used in a workshop, and the longer-term effects on the planning process, the participants, and the outcomes.
2. Conceptual Framework of Analysis
3. Research Design
3.1. Case Study Description
- A short plenary introduction for all project partners and invited participants;
- Parallel half-day design sessions for each working group to design pilot projects; and
- A plenary integration session to identify opportunities for collaboration between pilots from the different working groups.
3.1.1. Adaptation Support Tool
3.1.2. Relation of the Authors to the Project, the Workshop, and the Tool Evaluated
4. Research Methods
4.1. Evaluation Factors
4.2. Data Collection and Types of Data
- Interviews. Seventeen semi-structured interviews, each lasting one to three hours, were carried out with the project management team and workshop participants. Audio recordings and written notes were transcribed for analysis.
- Discussions. In addition to the formal interviews, informal, on-the-record discussions were used to confirm impressions and information and to seek the views of a wider range of informants. Audio recordings and written notes were transcribed for analysis.
- Documents. A range of documents was reviewed, including planning documents, reports, websites, team emails, and work products.
- Questionnaires. A short questionnaire was completed by participants at the end of the design workshop. It measured responses to the design workshop and the tool through five-point Likert scale ratings and open questions (see Appendix B); the sketch after this list illustrates how such ratings can be summarized.
- Observations. Observations were made during the design workshop and at the project’s final symposium event. Written notes were transcribed for analysis.
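The Likert-scale items in the post-session questionnaire lend themselves to simple descriptive summaries. The snippet below is a minimal sketch of such a summary; the file name and question columns are hypothetical and do not reproduce the actual survey instrument.

```python
# Minimal sketch: summarizing five-point Likert ratings from a post-session
# questionnaire. The CSV layout and question columns are hypothetical.
import csv
from collections import Counter
from statistics import median

def summarize_likert(path, questions):
    """Return per-question response counts (1-5) and the median rating."""
    counts = {q: Counter() for q in questions}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            for q in questions:
                if row.get(q):  # skip blank or missing answers
                    counts[q][int(row[q])] += 1
    return {
        q: {
            "counts": dict(sorted(c.items())),
            "median": median(r for r, n in c.items() for _ in range(n)) if c else None,
        }
        for q, c in counts.items()
    }

# Hypothetical usage:
# summarize_likert("post_session_survey.csv", ["tool_ease_of_use", "results_usefulness"])
```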
4.3. Data Analysis
- Inductive (thematic) analysis was conducted first, to surface codes and themes that emerged from the case study.
- Deductive analysis was undertaken using the pre-defined evaluation factors and a list of describing elements from the literature. The list of elements (codes) was later refined to those listed in Table 1 (see Appendix C for the original list). A sketch of how this deductive coding step can be operationalized follows this list.
- Meta-analysis served two purposes. First, it compared the inductive and deductive analyses for common and divergent findings; from this comparison, a comprehensive list of themes and codes was created. Second, it examined elements that were not well captured in the text yet were important for the evaluation, for example, the quality of work products.
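As a concrete illustration of the deductive step, the sketch below tags transcript segments with pre-defined codes (as in Table 1) through simple keyword matching, so that code frequencies can be set against the inductively derived themes in the meta-analysis. The keyword lists and example fragments are assumptions for illustration only; the study itself coded the material manually.

```python
# Illustrative sketch of a deductive coding pass: tag transcript segments with
# pre-defined codes via keyword matching. The codebook keywords are hypothetical;
# the actual study coded the material manually.
from collections import Counter

CODEBOOK = {
    "CONTEXT—Institutional": ["district authority", "senate department", "permit"],
    "PROCESS—Communication": ["discussion", "explained", "shared the map"],
    "EFFECTS—Learning":      ["learned", "new to me", "did not know"],
}

def deductive_code(segments):
    """Assign every code whose keywords appear in a segment; return tagged segments."""
    tagged = []
    for seg in segments:
        text = seg.lower()
        codes = [c for c, kws in CODEBOOK.items() if any(k in text for k in kws)]
        tagged.append((seg, codes))
    return tagged

def code_frequencies(tagged):
    """Count how often each pre-defined code was applied, for the meta-analysis."""
    return Counter(code for _, codes in tagged for code in codes)

# Example usage with two hypothetical interview fragments:
segments = [
    "The district authority explained which permits a street measure would need.",
    "Participants said they learned about measures that were new to them.",
]
print(code_frequencies(deductive_code(segments)))
```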
5. Results
5.1. Context
Analysis of Context and its Relation to the Role of the Tool
5.2. Input
Analysis of the Role of the Tool in the Input
5.3. Process
Analysis of the Role of the Tool in the Workshop Process
5.4. Content
Analysis: Role of the Tool in the Content
5.5. Results
Analysis: Role of the Tool in the Results
5.6. Use
Analysis: Role of the Tool in the Use of Results
5.7. Effects
Analysis: Role of the Tool in the Effects
6. Discussion
6.1. Reflections on the Use of the Tool in the Workshop
- Providing information about the many adaptation measures in the tool’s library, which created a common knowledge base and vocabulary for the design.
- Supporting dynamic communication by serving as a focal point of discussion and group work through a shared spatial language in the map and interaction with the tool.
- Ranking suitable measures for the local physical conditions, adaptation targets, and input criteria (a hypothetical illustration follows this list).
- Producing a mutually-supported spatial plan of preferred measures, with their basic dimensions and locations specified.
- Improving learning among participants through substantive content, enriched communication, and interactions.
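The ranking described in the third bullet can be pictured as a multi-criteria scoring over the tool's library of measures. The sketch below is only a hypothetical illustration of that idea, with invented measures, criteria, and weights; the AST's actual suitability model is documented by van de Ven et al. [6] and is not reproduced here.

```python
# Purely illustrative multi-criteria ranking over a library of adaptation
# measures; the entries, criteria, and weights are invented for this sketch.
MEASURES = {
    "Green roof":       {"heat": 0.6, "runoff": 0.7, "space_fit": 0.9},
    "Bioswale":         {"heat": 0.3, "runoff": 0.9, "space_fit": 0.5},
    "Permeable paving": {"heat": 0.2, "runoff": 0.6, "space_fit": 0.8},
}

def rank_measures(measures, weights):
    """Rank measures by a weighted sum of their criterion scores."""
    def score(scores):
        return sum(weights.get(c, 0.0) * v for c, v in scores.items())
    return sorted(measures, key=lambda m: score(measures[m]), reverse=True)

# Hypothetical weighting reflecting a session's adaptation targets:
print(rank_measures(MEASURES, {"heat": 0.2, "runoff": 0.5, "space_fit": 0.3}))
```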
6.2. Reflections on Connections with the Planning Process and Context
6.3. Reflections on the Research Methods
6.3.1. Case Study
6.3.2. Evaluation Framework
6.3.3. The Challenges of Evaluation Revisited
- The evaluation factors and describing elements were broad enough to be relevant for a variety of tools and workshops, and specific enough to produce meaningful insights for our case.
- The framework and analysis method were flexible enough to reflect local conditions and capture emergent themes, yet structured enough to produce systematic and comparable results.
- The longitudinal case study approach was appropriate and effective for capturing the effects of the tool on the planning process and participants, and for revealing the importance of context, while still capturing the details of tool use in the workshop itself.
- The nested view of tools was helpful for understanding the use of the tool, its results, and its effects.
- The descriptive and qualitative nature of the framework is a potential weakness in its reliability, but a strength in its ability to be applied to a wide variety of cases and tools, in different contexts.
- Using many sources and types of data for triangulation.
- Basing the evaluation on a structured framework for ensuring a systematic review of all the data.
- Using different approaches in the data analysis for capturing a comprehensive view of the data.
- Using a longitudinal study for ensuring the consistency of our findings over time.
- Checking our themes, hypotheses, explanations, and findings with key informants.
- Using an evaluator who is independent of the tool, the workshop and the project.
- Engaging an external reviewer to check the evaluation design, analysis and results.
7. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Appendix A
Workshop Agenda and Invited Stakeholders
Time | Activity | Material | Outcome
---|---|---|---
9:30 | Coffee and reception | |
10:00–10:30 | Workshop introductions and overview | Presentations |
10:30–11:00 | Working group updates—water, energy, mobility | Presentations |
11:00–11:15 | Introductions | Consent forms | Group familiarity; communicate aims and agenda; inform and consent
11:15–11:30 | Presentation of Best-Practices Document by TNO | Report detailing best practices for sustainable urban water management measures in the Netherlands | Learning about best practices and several examples of sustainable urban water management in the Netherlands
11:30–12:15 | AST Introduction and Start-Up | AST on touch table; AST library of measures; AST set-up tab; whiteboard | Learning about the tool; learning about 67 adaptation measures; sharing experiences and local challenges; focusing on the spatial aspect of problems; agreed set-up conditions in the tool; short list of preferred measures
12:15–13:15 | Designing Adaptation Plan in the AST | AST on touch table with tool operator/facilitator | A plan developed in the AST with measures implemented, giving basic dimensions and indicators of effectiveness
13:15–13:30 | Design Session Wrap-up | | Agreed elaboration of plan; agreed next steps for project
13:30–13:40 | Questionnaires | Hardcopy surveys | Completed surveys
13:30–14:30 | Lunch | | Informal discussions and agreements for actions
14:30–15:00 | Working group presentations of design session results—water, energy, mobility | Presentations | Communicating results to other working groups
15:00–16:00 | Integration session—looking for opportunities to integrate water, energy, mobility pilot projects | Discussion | Integrated project proposals
16:00–16:15 | Coffee break | |
16:15–17:00 | Funding session for workgroup leaders | Discussion |
Stakeholder | Role in the Project
---|---
Local Stakeholders from Berlin |
IPS | Local urban water consultancy. Led the sustainable urban water management group.
Nolde and Partner | Local design-build-operate consultancy, specializing in urban water solutions. Could design and build nature-based measures.
Berliner Wasserbetriebe | Berlin water company. Would be responsible for implementing measures related to urban drainage and retention. Three departments were invited: research and development, sanitation, and drainage.
Bezirksamt Mitte von Berlin | District authority, tasked with approving, operating, and maintaining any measures in public streets or green areas in Moabit West. The offices of streets and green spaces, and of nature conservation, were invited.
Senatsverwaltung für Stadtentwicklung und Umwelt | City department for urban development and environment. The departments of water resources and groundwater were invited.
European Partners from the Climate-KIC Consortium |
Deltares | Dutch institute for applied research in the field of water and subsurface. Facilitated the Adaptation Support Tool design session.
TNO | Dutch institute for applied sciences. Developed the best-practices document.
Appendix B
Data Collection
Phase | Data Type | Details
---|---|---
1: Prior to Workshop (August–September 2016) | Documents | Website of the preceding project that initiated the SSD project; report from the preceding project that initiated the SSD project; internal team emails about organizing and planning the design session
 | Interviews | 5 interviewees (interviews were recorded and transcribed): 1 design session organizer; 1 design session participant; 1 design session facilitator and participant; 1 workshop organizer; 1 project manager
2: During Workshop (September 2016) | Observations | Observations of the design session and the larger workshop, based on an observation protocol; written notes were used to record observations
 | Surveys | Post-session surveys from all participants
 | Discussions | During breaks, short discussions with most participants, organizers, and facilitators to check information and ask for impressions; written records were made of the discussions
 | Documents | Presentations made in plenary and working group sessions; workshop records (agenda, invitees, participants, etc.); list of measures selected by the water group for application; photographs of design session participants working with the tool; inspiration document prepared for the session
3: Immediately Following Workshop (September 2016) | Documents | Plans developed in the design session
 | Agreements and plans for next steps | From interviews and discussions in the sessions, the planned next steps were recorded, as well as the agreed actions of different actors; these were also reported by the session organizers to the project management team
 | Interviews | 5 interviewees (interviews were recorded and transcribed): 1 design session organizer; 1 design session participant; 1 design session facilitator and participant; 1 workshop organizer; 1 project manager
4: Project End—Final Symposium Event (December 2016–January 2017) | Documents | Final project report; symposium records (agenda, invitees, participants); symposium handouts; presentations made at the end symposium events
 | Observations | Observations of the end symposium event with stakeholders and partners, based on an observation protocol and with the use of a German interpreter; written notes and audio memos were used to record observations
 | Interviews | 3 interviewees (interviews were recorded and transcribed): 1 design session organizer and participant; 1 design session participant; 1 design session facilitator and participant; 1 project manager
 | Discussions | During breaks and after the symposium, short discussions with several partners, participants, organizers, and managers to check information and ask for impressions; written records and audio memos were used to record discussions
5: One Year Post-Project End (January–February 2018) | Interviews | 4 interviewees (interviews were recorded and transcribed): 1 design session organizer and participant; 1 design session participant; 1 design session facilitator and participant; 1 project manager
 | Documents | Project website (SSD Moabit); project status updates and reporting shared during interviews
Appendix C
Data Analysis
Evaluation Factors | Describing Elements | Codes Used to Develop Describing Element
---|---|---
CONTEXT | Local setting | CONTEXT—Setting information; CONTEXT—Prior elements
 | Institutional setting | CONTEXT—Institutional; CONTEXT—Challenges Institutional
 | Project structure and process | CONTEXT—Process structure; CONTEXT—Challenges Structural
INPUT | Aim and role of activity | INPUT—Aim of Activity
 | Organization | INPUT—Resource Availability; INPUT—Organization of Activity
 | Stakeholders and participants | INPUT—Actors; INPUT—Participants
CONTENT | Depth and breadth | CONTENT—Depth and Breadth
 | Data and information | CONTENT—Validity and Credibility
 | Tool and methodology | CONTENT—Methodology
PROCESS | Procedures | PROCESS—Procedures
 | Communication | PROCESS—Communication
 | Way of working | PROCESS—Participants; PROCESS—Working Method
 | Organization | PROCESS—Organization; PROCESS—Resource Use
RESULTS | Outcomes | RESULTS—Work Products; RESULTS—Non-product Results
 | Documentation | RESULTS—Presentation; RESULTS—Availability
 | Value and relevance | RESULTS—Acceptance; RESULTS—Relevance; RESULTS—Solution quality; RESULTS—Verifiability and Validity
USE | Direct use | USE—Direct
 | Indirect use | USE—Indirect
 | Unused | USE—Unused
EFFECTS | Learning effects | EFFECTS—Actors; EFFECTS—Learning
 | Problem situation effects | EFFECTS—Problem Situation
 | Planning process effects | EFFECTS—Planning Process
 | Decision effects | EFFECTS—Decisions/Policy Quality
 | Intended effects | EFFECTS—Intended
- ‘Tracking codes’: Codes that were used to keep track of narratives in the data that did not contribute to a specific theme, but were useful for the meta-analysis. For example, a code ‘Planned use’ was helpful for tracking intended use of results, which could later be compared to the actual use.
- ‘Prompting codes’: Factors that were important for the meta-analysis but were not conducive to text coding; they were used instead as prompts during the meta-analysis, for example, factors such as ‘sensible results’.
Tracking Code | Use |
---|---|
CONTEXT—Challenges General | To identify the role of these challenges as they reinforced/counteracted the role of the design session and tool |
PROCESS—Changes | To identify if the process of the design session strayed from plans |
PROCESS—Ending | To identify how the design session ended |
RESULTS—Planned Actions | A type of result that is a plan to take action by a participant. Later compared to actual actions taken following design session. |
USE—Planned Use | Track intended use of results for different time periods for comparison with actual use. Identified realized, unrealized and realized but unforeseen uses of results and their time frames |
EFFECTS—Types | Track different types of effects over time |
Needed actions | Track actions that were identified as necessary to reach certain aims, like implementation. Later checked which actions were taken and the results. |
Next steps/expectations | Track the plans and expectations of different actors to compare with the actual process and what transpired. |
Participation | Identify role of participation and views on participation |
Perspectives/views | Track different perspectives of actors in the project over time to identify changes, contradictions, shared views, etc. |
Prompting Code | Use |
---|---|
RESULTS—Consistency | Assessing the consistency of the results with the input conditions, actors, process and content of the design session and project |
RESULTS—Documentation | Assessing the quality of the documentation, different from the theme documentation |
RESULTS—Sensible | Assessing whether results seemed reasonable for the project and actors |
USE—Timeframe of Use | Examining when results were used in the process |
USE—Used Elements | Examining which results or elements were used |
USE—Used For | Examining in what capacity or for what purpose results were used |
USE—Who Used | Examining who used which results following the design session |
EFFECTS—Implementation | Assessing implementation or realization with a broad view, not only of ‘built project’ but ‘soft changes’ |
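To show how tracking codes such as ‘USE—Planned Use’ support the meta-analysis, the sketch below compares results that were coded as planned for use with those later coded as actually used (‘USE—Direct’ or ‘USE—Indirect’), yielding the realized, unrealized, and unforeseen uses mentioned in the tables above. The coded segments and labels are hypothetical; in the study this comparison was made qualitatively.

```python
# Illustrative comparison of planned versus actual use of workshop results,
# mirroring the 'USE—Planned Use' tracking code. Labels and data are hypothetical;
# in the study this comparison was made qualitatively during the meta-analysis.

# Each entry: (result label, code applied to a data segment mentioning it)
coded_segments = [
    ("adaptation plan",  "USE—Planned Use"),
    ("adaptation plan",  "USE—Direct"),
    ("list of measures", "USE—Planned Use"),
    ("new contacts",     "USE—Indirect"),
]

def compare_planned_vs_actual(segments):
    planned = {r for r, c in segments if c == "USE—Planned Use"}
    actual = {r for r, c in segments if c in ("USE—Direct", "USE—Indirect")}
    return {
        "realized":   sorted(planned & actual),   # planned and later used
        "unrealized": sorted(planned - actual),   # planned but never used
        "unforeseen": sorted(actual - planned),   # used without being planned
    }

print(compare_planned_vs_actual(coded_segments))
```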
References
- Anguelovski, I.; Chu, E.; Carmin, J. Variations in approaches to urban climate adaptation: Experiences and experimentation from the global South. Glob. Environ. Chang. 2014, 27, 156–167. [Google Scholar] [CrossRef]
- Masson, V.; Marchadier, C.; Adolphe, L.; Aguejdad, R.; Avner, P.; Bonhomme, M.; Bretagne, G.; Briottet, X.; Bueno, B.; de Munck, C.; et al. Adapting cities to climate change: A systemic modelling approach. Urban Clim. 2014, 10, 407–429. [Google Scholar] [CrossRef]
- Mayer, I.S.; van Bueren, E.M.; Bots, P.W.G.; van der Voort, H.; Seijdel, R. Collaborative decisionmaking for sustainable urban renewal projects: A simulation-gaming approach. Environ. Plan. B Urban Anal. City Sci. 2005, 32, 403–423. [Google Scholar] [CrossRef]
- Eikelboom, T.; Janssen, R. Collaborative use of geodesign tools to support decision-making on adaptation to climate change. Mitig. Adapt. Strategies Glob. Chang. 2017, 22, 247–266. [Google Scholar] [CrossRef] [Green Version]
- Henstra, D. The tools of climate adaptation policy: Analysing instruments and instrument selection. Clim. Policy 2016, 16, 496–521. [Google Scholar] [CrossRef]
- Van de Ven, F.H.M.; Snep, R.P.H.; Koole, S.; Brolsma, R.; van der Brugge, R.; Spijker, J.; Vergroesen, T. Adaptation Planning Support Toolbox: Measurable performance information based tools for co-creation of resilient, ecosystem-based urban plans with urban designers, decision-makers and stakeholders. Environ. Sci. Policy 2016, 66, 427–436. [Google Scholar] [CrossRef] [Green Version]
- Billger, M.; Thuvander, L.; Wästberg, B.S. In search of visualization challenges: The development and implementation of visualization tools for supporting dialogue in urban planning processes. Environ. Plan. B Urban Anal. City Sci. 2017, 44, 1012–1035. [Google Scholar] [CrossRef]
- Al-Kodmany, K. Using visualization techniques for enhancing public participation in planning and design: Process, implementation, and evaluation. Landsc. Urban Plan. 1999, 45, 37–45. [Google Scholar] [CrossRef]
- Geurts, J.L.A.; Joldersma, C. Methodology for participatory policy analysis. Eur. J. Oper. Res. 2001, 128, 300–310. [Google Scholar] [CrossRef]
- Pelzer, P.; Geertman, S.; van der Heijden, R.; Rouwette, E. The added value of Planning Support Systems: A practitioner’s perspective. Comput. Environ. Urban Syst. 2014, 48, 16–27. [Google Scholar] [CrossRef]
- Geertman, S.; Toppen, F.; Stillwell, J. Planning Support Systems for Sustainable Urban Development; Geertman, S., Toppen, F., Stillwell, J., Eds.; Springer: Berlin/Heidelberg, Germany, 2013; Volume 195. [Google Scholar]
- Arciniegas, G.; Janssen, R.; Rietveld, P. Effectiveness of collaborative map-based decision support tools: Results of an experiment. Environ. Model. Softw. 2013, 39, 159–175. [Google Scholar] [CrossRef]
- Kuller, M.; Bach, P.M.; Roberts, S.; Browne, D.; Deletic, A. A planning-support tool for spatial suitability assessment of green urban stormwater infrastructure. Sci. Total Environ. 2019, 686, 856–868. [Google Scholar] [CrossRef] [PubMed]
- Pelzer, P.; Geertman, S. Planning support systems and interdisciplinary learning. Plan. Theory Pract. 2014, 15, 527–542. [Google Scholar] [CrossRef]
- Russo, P.; Lanzilotti, R.; Costabile, M.F.; Pettit, C.J. Adoption and Use of Software in Land Use Planning Practice: A Multiple-Country Study. Int. J. Hum. Comput. Interact. 2018, 34, 57–72. [Google Scholar] [CrossRef]
- Te Brömmelstroet, M. PSS are more user-friendly, but are they also increasingly useful? Transp. Res. Part A Policy Pract. 2016, 91, 166–177. [Google Scholar] [CrossRef] [Green Version]
- Wardekker, J.A.; de Jong, A.; Knoop, J.M.; van der Sluijs, J.P. Operationalising a resilience approach to adapting an urban delta to uncertain climate changes. Technol. Forecast. Soc. Chang. 2010, 77, 987–998. [Google Scholar] [CrossRef] [Green Version]
- McEvoy, S.; van de Ven, F.H.M.; Blind, M.W.; Slinger, J.H. Planning support tools and their effects in participatory urban adaptation workshops. J. Environ. Manag. 2018, 207, 319–333. [Google Scholar] [CrossRef]
- Sellberg, M.M.; Wilkinson, C.; Peterson, G.D. Resilience assessment: A useful approach to navigate urban sustainability. Ecol. Soc. 2015, 20, 43. [Google Scholar] [CrossRef]
- Arciniegas, G.; Janssen, R. Spatial decision support for collaborative land use planning workshops. Landsc. Urban Plan. 2012, 107, 332–342. [Google Scholar] [CrossRef]
- Pettit, C.J. Use of a collaborative GIS-based planning-support system to assist in formulating a sustainable-development scenario for Hervey Bay, Australia. Environ. Plan. B Plan. Des. 2005, 32, 523–545. [Google Scholar] [CrossRef]
- Geertman, S. Potentials for planning support: A planning-conceptual approach. Environ. Plan. B Plan. Des. 2006, 33, 863–880. [Google Scholar] [CrossRef] [Green Version]
- Goodspeed, R. Sketching and learning: A planning support system field study. Environ. Plan. B Plan. Des. 2015, 43, 444–463. [Google Scholar] [CrossRef]
- Vonk, G.; Geertman, S. Improving the Adoption and Use of Planning Support Systems in Practice. Appl. Spat. Anal. Policy 2008, 1, 153–173. [Google Scholar] [CrossRef] [Green Version]
- Vonk, G.; Geertman, S.; Schot, P. Bottlenecks blocking widespread usage of planning support systems. Environ. Plan. A 2005, 37, 909–924. [Google Scholar] [CrossRef] [Green Version]
- Kuller, M.; Farrelly, M.; Deletic, A.; Bach, P.M. Building effective Planning Support Systems for green urban water infrastructure—Practitioners’ perceptions. Environ. Sci. Policy 2018, 89, 153–162. [Google Scholar] [CrossRef]
- Russo, P.; Lanzilotti, R.; Costabile, M.F.; Pettit, C.J. Towards satisfying practitioners in using Planning Support Systems. Comput. Environ. Urban Syst. 2018, 67, 9–20. [Google Scholar] [CrossRef]
- Pelzer, P.; Geertman, S.; van der Heijden, R. A comparison of the perceived added value of PSS applications in group settings. Comput. Environ. Urban Syst. 2016, 56, 25–35. [Google Scholar] [CrossRef]
- Geertman, S. PSS: Beyond the implementation gap. Transp. Res. Part A Policy Pract. 2017, 104, 70–76. [Google Scholar] [CrossRef]
- Pelzer, P.; Brömmelstroet, M.; Geertman, S. Geodesign in Practice: What About the Urban Designers. In Geodesign by Integrating Design and Geospatial Sciences; Lee, D.J., Dias, E., Scholten, H.J., Eds.; Springer: Berlin/Heidelberg, Germany, 2014; pp. 331–344. [Google Scholar]
- Midgley, G.; Cavana, R.Y.; Brocklesby, J.; Foote, J.L.; Wood, D.R.R.; Ahuriri-driscoll, A. Towards a new framework for evaluating systemic problem structuring methods. Eur. J. Oper. Res. 2013, 229, 143–154. [Google Scholar] [CrossRef] [Green Version]
- Te Brömmelstroet, M. Performance of planning support systems: What is it, and how do we report on it? Comput. Environ. Urban Syst. 2013, 41, 299–308. [Google Scholar] [CrossRef]
- Abelson, J.; Forest, P.G.; Eyles, J.; Smith, P.; Martin, E.; Gauvin, F.P. Deliberations about deliberative methods: Issues in the design and evaluation of public participation processes. Soc. Sci. Med. 2003, 57, 239–251. [Google Scholar] [CrossRef]
- Hassenforder, E.; Smajgl, A.; Ward, J. Towards understanding participatory processes: Framework, application and results. J. Environ. Manag. 2015, 157, 84–95. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Pettit, C.; Bakelmun, A.; Lieske, S.N.; Glackin, S.; Hargroves, K.C.; Thomson, G.; Shearer, H.; Dia, H.; Newman, P. Planning support systems for smart cities. City Cult. Soc. 2018, 12, 13–24. [Google Scholar] [CrossRef]
- Innes, J.E.; Booher, D.E. Consensus building and complex adaptive systems: A framework for evaluating collaborative planning. J. Am. Plan. Assoc. 1999, 65, 412–423. [Google Scholar] [CrossRef]
- Jones, N.A.; Perez, P.; Measham, T.G.; Kelly, G.J.; d’Aquino, P.; Daniell, K.A.; Dray, A.; Ferrand, N. Evaluating Participatory Modeling: Developing a Framework for Cross-Case Analysis. Environ. Manag. 2009, 44, 1180–1195. [Google Scholar] [CrossRef]
- Rowe, G. Evaluating Public-Participation Exercises: A Research Agenda. Sci. Technol. Hum. Values 2004, 29, 512–556. [Google Scholar] [CrossRef]
- Rowe, G.; Frewer, L.J. Public participation methods: A framework for evaluation. Sci. Technol. Hum. Values 2000, 25, 3–29. [Google Scholar] [CrossRef]
- Thissen, W.A.H.; Twaalfhoven, P.G.J. Towards a conceptual structure for evaluating policy analytic activities. Eur. J. Oper. Res. 2001, 129, 627–649. [Google Scholar] [CrossRef]
- McEvoy, S.; van de Ven, F.H.M.; Santander, A.G.; Slinger, J.H. The influence of context on the use and added value of Planning Support Systems in workshops: An exploratory case study of climate adaptation planning in Guayaquil, Ecuador. Comput. Environ. Urban Syst. 2019, 77, 101353. [Google Scholar] [CrossRef]
- McEvoy, S. Planning Support Tools in Urban Adaptation Practice; Delft University of Technology: Delft, The Netherlands, 2019. [Google Scholar]
- McEvoy, S. Planning Support Tools in Urban Adaptation Practice. Available online: https://repository.tudelft.nl/islandora/object/uuid:48b7649c-5062-4c97-bba7-970fc92d7bbf?collection=research (accessed on 10 December 2019).
- EIT Climate-KIC. Moabit West|Climate-KIC. Available online: http://www.climate-kic.org/success-stories/moabit-west/ (accessed on 12 April 2018).
- Green Moabit. Stadtteilentwicklungskonzept: GREEN MOABIT—Bericht Berlin. 2013. Available online: https://sustainum.de/wp-content/uploads/2015/10/Green_Moabit_Bericht-1.pdf (accessed on 17 December 2019).
- Von Bergman, N.K. Smart Sustainable District Moabit West: Final Report 2016. 2017. Available online: http://ssd-moabit.org/wp-content/uploads/2017/01/final_reportcover_website.pdf (accessed on 17 December 2019).
- Voskamp, I.M.; van de Ven, F.H.M. Planning support system for climate adaptation: Composing effective sets of blue-green measures to reduce urban vulnerability to extreme weather events. Build. Environ. 2015, 83, 159–167. [Google Scholar] [CrossRef]
- Yin, R.K. Case Study Research Design and Methods, 3rd ed.; Sage: Thousand Oaks, CA, USA, 2003. [Google Scholar]
- Creswell, J.W. Research Design: Qualitative, Quantitative and Mixed Methods Approaches, 2nd ed.; Sage: Thousand Oaks, CA, USA, 2003. [Google Scholar]
- Pelzer, P.; Arciniegas, G.; Geertman, S.; Lenferink, S. Planning Support Systems and Task-Technology Fit: A Comparative Case Study. Appl. Spat. Anal. Policy 2015, 8, 155–175. [Google Scholar] [CrossRef] [Green Version]
- Pelzer, P.; Arciniegas, G.; Geertman, S.; De Kroes, J. Using MapTable to Learn About Sustainable Urban Development. In Planning Support Systems for Sustainable Urban Development; Springer: Berlin, Germany, 2013. [Google Scholar]
- Tyler, S.; Moench, M. A framework for urban climate resilience. Clim. Dev. 2012, 4, 311–326. [Google Scholar] [CrossRef]
- Birkmann, J.; Garschagen, M.; Setiadi, N. New challenges for adaptive urban governance in highly dynamic environments: Revisiting planning systems and tools for adaptive and strategic planning. Urban Clim. 2014, 7, 115–133. [Google Scholar] [CrossRef]
- Pelzer, P. Usefulness of planning support systems: A conceptual framework and an empirical illustration. Transp. Res. Part A Policy Pract. 2017, 104, 84–95. [Google Scholar] [CrossRef]
- Te Brömmelstroet, M. A Critical Reflection on the Experimental Method for Planning Research: Testing the Added Value of PSS in a Controlled Environment. Plan. Pract. Res. 2015, 30, 179–201. [Google Scholar] [CrossRef]
Evaluation Factors | Description
---|---
Context | The context in which a planning process and a specific workshop take place has important implications for what can be achieved by a planning support tool [22]. Contextual factors can include political and physical conditions, social, technical and ecological systems, and what events have come before. The elements of context that are relevant for a tool’s role and effects in a specific case can vary significantly. Context is described in this study by: local project setting; institutional setting; project structure and process. |
Input | The input to a workshop encompasses everything that was provided to it, such as the available data, the stakeholders related to the issue, and the objectives of the workshop. Input should not be confused with participants’ contributions during the workshop, which is content. Input is described in this study by: Aim and role of workshop; organization of workshop; stakeholders and workshop participants. |
Process | A workshop’s process includes the procedures, communication, and ways of working during the activity. This is not to be confused with the overall planning process, in which the workshop takes place. Tools typically intend to support a workshop process through improved interactions and communication. Process is described in this study by: workshop structure and procedures; communication; way of working. |
Content | Content refers to the substantive material used during a workshop, including information, knowledge, models, maps, perspectives, and values that are shared by participants or provided by organizers. Planning support tools are typically an important source of substantive content. Content is described in this study by: quality and type of data and information used; depth and breadth of content; tool or method used. |
Results | Results are the direct products of a workshop, which include artefacts, like maps, models and planning documents, and less tangible outcomes like alliances and agreements. Tools aim to improve the quality of workshop results through improved content and processes. Results are described in this study by: workshop results; documentation of results; value and relevance of results to the planning process and stakeholders. |
Use | The use of results includes the direct and indirect ways a workshop’s tangible and less tangible results are used over various time frames, by different actors, and for different purposes. For instance, direct use of a tangible result would be a planner applying an idea or measure from the tool directly in the next steps of developing the plan; indirect use of a less tangible result would be a stakeholder leveraging a new alliance to influence decisions on the plan. The use of results leads to effects and also captures the value and meaning of the results for different stakeholders. Use is described in this study by: direct use of results; indirect use of results; unused results.
Effects | Effects are the workshop’s impacts on the system or actors involved. Assessing effects is complicated as they have different forms and are realized at different temporal and spatial scales. There are direct effects from a workshop, such as learning and new relationships, and indirect effects through the use of results, such as influencing later decisions. Effects may be intended or unintended, and a workshop may clearly be the cause of an effect or only contribute to it. In adaptation and collaborative planning, less tangible effects, such as creating shared meaning, are as important as traditional, more concrete ones [3,36]. Effects are described in this study by: effects on learning; effects on problem situation; effects on planning process; effects on decisions made; (un)intended effects. |
Evaluation Factor | Describing Elements
---|---
Context | Local project setting; institutional setting; project structure and process
Input | Aim and role of the workshop; organization of the workshop; stakeholders and workshop participants
Process | Workshop structure and procedures; communication; way of working
Content | Quality and type of information used (sources of content); depth and breadth of content; tool or method used
Results | Workshop results (work products); documentation of the results; value and relevance of the results to the planning process and stakeholders
Use | Direct use of results (during the project); indirect use of results (during the project); unused results
Effects | Effects on learning; effects on the problem situation; effects on the planning process; effects on the decision or policy; (un)intended effects
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).