Article

Enhancing Rural Innovation and Sustainability Through Impact Assessment: A Review of Methods and Tools

Universidad Politécnica de Madrid, E.T.S.I. Agronómica, Alimentaria y de Biosistemas, Avda. Puerta de Hierro 2, 28040 Madrid, Spain
* Authors to whom correspondence should be addressed.
Sustainability 2020, 12(16), 6559; https://doi.org/10.3390/su12166559
Submission received: 15 June 2020 / Revised: 8 August 2020 / Accepted: 10 August 2020 / Published: 13 August 2020
(This article belongs to the Section Sustainable Agriculture)

Abstract

Assessing impacts in innovation contexts with the aim of fostering sustainability requires tackling complex issues. The literature shows that key sources of this complexity relate to the need to integrate the local context; identify the underlying problems; engage key stakeholders; and reflect on their feedback throughout the innovation process. A systematic literature review on innovation impact assessment reveals that social impacts have been the most studied and are therefore the area in which the most promising methods and tools have been applied. Nevertheless, many issues beyond assessing social impacts in innovation processes remain unresolved. The literature highlights two critical challenges that call for a systems thinking approach: building co-created innovation processes that respond to stakeholders’ real needs and context, and adapting to changing circumstances by integrating timely feedback from stakeholders. This study proposes Developmental Evaluation (DE) as a systemic approach to evaluation that supports adaptive development in complex environments and adds value by integrating continuous feedback from diverse stakeholders. As an approach that is non-prescriptive in terms of methods and tools, DE can provide meaningful guidance for using diverse methods and tools to further ongoing development and adaptation in innovation processes by linking the evaluation activities—impact assessment among them—with DE principles that are situational, adaptive and continuously responsive.

1. Introduction

Complexity in rural territories and agricultural settings involves multi-dimensional problems such as unsustainable land management, which creates a wide array of effects ranging from environmental issues such as climate change, biodiversity loss or soil depletion [1,2,3] to social issues such as farmers’ motivation and behavior in adopting sustainable agriculture practices [4]. Researchers and practitioners have developed frameworks and methods to evaluate the complexity of the sustainability of agricultural systems [2,5,6,7]. The social dimension of sustainability is probably the least researched of the three pillars [5], yet it involves the most complex issues to be tackled. In light of the importance of including the social dimension in sustainability [5], evaluation practices have moved from traditional linear models to development-oriented evaluation practices. The rationale of development-oriented evaluation practices is to enhance the engagement of stakeholders throughout the evaluation process and to integrate their feedback in real time [8,9,10]. Along the project’s pathway, stakeholders’ involvement in the evaluation can significantly alter the strategy and development of the project and allow the necessary resources to be adapted to the changing context and collectively identified priorities.
The issues many rural territories face make innovation an ideal concept for boosting opportunities for sustainable development and disseminating knowledge that creates new opportunities in uncertain contexts [11,12,13]. In many developing countries, innovation has become the key to economic development, employment, increased education and access to international markets [14,15,16,17], thereby improving the well-being of local livelihoods. Advanced infrastructure technologies may enhance access to markets and thereby increase agricultural production [17,18,19,20,21]. Additionally, improvements in information access and communication give people better knowledge of and access to health care services.
In the European Union, experiences such as the LEADER Initiative [22,23], along with extensive research, point out that innovation can be achieved in different ways, such as enhancing networks among stakeholders; emphasizing participatory approaches [24]; assessing the Knowledge Transfer and Innovation (KT&I) of the European Rural Development Policy (ERDP) [25]; and fostering capacity building at the individual, organizational, project and network levels while observing how each level brings in new opportunities [26].
Other recent literature frames innovation as production modernization [27], the formation of networks [28] and farmers’ adoption of new sustainable agriculture practices [4]. Bopp et al. (2019) showed how farmers were intrinsically and extrinsically (through economic incentives) motivated to take up the sustainable agricultural practices administered by the Chilean Ministry of Agriculture, and concluded that the interaction between intrinsic and extrinsic motivation significantly predicted the adoption of these practices. These types of initiatives to support innovation commonly focus their activity on promoting and developing organizations as pathways to bring social change to a community. Such social change triggers social innovation, which awakens new ways of thinking and encourages the collaborative actions that can facilitate an innovative environment, as well as sustainable well-being and health [29].
Assessing the impact of innovation should bring us closer to bringing four points together and thinking of them as a whole system rather than as individual challenges. The four main points are: integrating the local context; identifying the underlying problem; engaging key stakeholders; and reflecting on their feedback throughout the process. Traditional impact assessment practices have largely been limited to linear models and to methods measuring impacts on economic returns, such as quantitative cost-benefit analysis [8,16], which have failed to address the real needs of the intended users [30,31]. Mixed-method approaches have been suggested to give immediate feedback with explicit answers and value-added dimensions that are not revealed through quantitative or qualitative analysis alone [9,28,32].
Through a literature review, this study explores which methods are used to address these complex issues and how. To keep the research feasible, the study follows a stepwise logic: it identifies the challenges of impact assessment in innovation contexts; which methods and tools are used to face these challenges; which of these challenges have been solved and which remain open; and how Developmental Evaluation (hereafter DE) [10,33] can contribute to solving them.
A systematic literature review was conducted on ‘innovation impact assessment’ in the agriculture, rural and forestry areas to report on findings concerning methods and tools. The selection of articles was based on the authors’ assessment of the extent to which the methods and tools mentioned responded to the challenges of impact assessment and innovation. Seventy-four articles were selected in a first round; in a second round, 18 articles were regarded as promising for their reporting on participatory methods focused on the co-creation stage, that is, the involvement and active engagement of all stakeholders who want to bring about change within their project. These participatory methods intend to find the underlying causes of the problem as well as the interaction among resources and the stakeholders who are committed to change. With this in mind, the study aims to answer the following questions:
RQ 1. Which methodological approaches and impact areas have recently been studied in the literature and which of them are development-oriented and seek adaptation by integrating feedback from stakeholders?
RQ 2. What challenges are faced in innovation processes in the rural area?
RQ 3. Which of these challenges are supported by impact assessment?
RQ 4. What are the current challenges of impact assessment?
RQ 5. What tools and methods are there to solve the challenges of innovation impact assessment and which of the challenges remain to be solved?
RQ 6. How can the evaluation approach of Developmental Evaluation improve the tools and methods to respond to innovation?
Before the Materials and Methods section, the Developmental Evaluation approach is explained as an overarching approach. The present paper focuses on exploring development-oriented methods and tools that help to overcome the challenges faced by innovation and to create more sustainable rural areas; DE is proposed as the approach to improve the methods and tools found. After this brief explanation of DE, the Methodology section follows, divided into two parts. The first part explains the literature review step by step and the framework used for this study. The second part explains the in-depth review of articles whose methods address the challenges of the rural territory, of innovation and its process, and of innovation impact assessment. The Results section also has two parts. The first shows which impact areas and methods were studied in the 74 pre-selected articles. The second analyzes the 18 selected articles that focus on the challenges of impact assessment and the methods and tools that respond to them. In the Discussion and Conclusions, the authors elaborate on fidelity to DE and how DE and its principles could be incorporated within impact assessment practices.

2. Developmental Evaluation—A Theoretical Background

Developmental Evaluation (DE) is an evaluation approach designed to support innovation and adaptation in complex situations and environments. Where evaluators focus on ‘what’ is being evaluated and on ‘how’ the environment surrounding that ‘what’ is evolving, DE can provide responses to how what is being evaluated is adapting and with what consequences. This approach works in any discipline, project or program that requires ongoing adjustments to refine and achieve its aim along its pathway.
Developmental Evaluation (DE) supports continuous decision making by bringing in evaluative thinking and providing meaningful and timely information, data and knowledge. It does this by “documenting and interpreting the dynamic interactions and interdependencies that occur as innovation unfolds” [33] p. 7. Thus, DE acknowledges the complexity of the context during the project cycle and makes timely adjustments to adapt to it, taking the organization’s learning process into account. DE transcends the formative/summative dichotomy—as defined by Scriven (1967) [34]—within which the evaluation field has traditionally been bounded [33]. It is unlike formative and summative evaluation, where the project is “identifiable, specifiable, stable, implementable, standardized and replicable” [33] p. 37.
Developmental Evaluation focuses on what is working and what is not working, with the aim of supporting learning and adaptability rather than indicating success or failure. The United States Agency for International Development (USAID) adopted the DE approach for the 2-year evaluative learning review of the People-to-People Reconciliation fund program [10,35], which was characterized by multiple levels of stakeholders and the rigidity of government procurement processes. The advantage of utilizing DE was the immediate feedback it was able to provide, which allowed the program to be adaptable and flexible as an integrated part of the participative process. The feedback and reflection provided opportunities for real-time adjustments to the program. As an adaptive and utilization-focused evaluation approach [35], DE supported the intended users—those who want to bring about change and innovation—from beginning to end to ensure utility and actual use [10,33,36].
The Challenge Scholars program, created by the Grand Rapids Community Foundation with the Grand Rapids Public School, used the DE approach to investigate the correlation between grant investments and students’ academic achievements [10,37]. As part of the evaluation, a systems map of the Challenge Scholars program was developed so that the evaluators and stakeholders could easily understand the diverse perspectives engaged and their different expectations. The map laid out the foundation for a development plan and schedule for data collection that ranged from document reviews to interviews to observations and focus groups. This enabled evaluators and stakeholders to think systematically together in an ongoing process feeding into the evaluation and ultimately, into the development of the program.
Developmental Evaluation is not model-focused, nor is it prescriptive about methods or tools. Rather, it is a guiding heuristic “about doing what makes sense that is grounded in situational adaptation” [10] p. 55. Events and actions in innovation projects do not happen in a planned, sequenced order. As a result, they demand a high level of awareness among the evaluators and stakeholders involved in the project, who are required to think strategically, respond flexibly and adapt rapidly to changing circumstances. To avoid sudden surprises, the evaluator must design and conduct the evaluation process together with the stakeholders, considering the expectations and information needs of those who will use the evaluation in support of the innovation process. The evaluator must not work as an independent observer, but as part of the team and as a facilitator who is continuously designing evaluation methods and frameworks while bearing in mind the interdependencies and interconnectivity of the stakeholders, the changing context and the necessary interventions. This kind of evaluation process may give researchers and decision-makers the impression that evaluation is an external process; however, they need to accept that innovative assessment is an internal process that continuously follows the implementation cycle within the organizations engaged [16,38].
The most fundamental principles of Developmental Evaluation (DE) are that it provides multiple pathways, is development-oriented, engages stakeholders from beginning to end and encourages them to learn and adapt to new situations. The principles of DE are taken from the book Developmental Evaluation Exemplars: Principles in Practice [10]; the authors selected those considered essential and complementary to other key elements of impact assessment. The approach looks at the participative process rather than the outcome or result, with evaluators often asking themselves probing evaluative questions and applying a logic when implementing methods suited to diverse situations. It does not “rely on or advocate any particular evaluation method, design, tool, or inquiry framework” [39] p. 10. Instead, methods need to be emergent and flexible depending on the situation, the timing and changes in circumstances. Last, but not least, is the learning process. Numerous studies have discussed that institutional as well as individual learning should be part of the evaluation process, so that outcomes and results reach beyond economic or ecological impacts. The learning process should not be a by-product of the evaluation, but a task within the evaluation process.
Developmental Evaluation strongly engages the organizational learning principles that support and bridge the gap between intended and actual outcomes—the gap between innovation and practice [40]. The key challenge of organizational learning is the effective transition of knowledge into action. Hall et al. (2003) [16] emphasize that institutional learning should be embedded into the innovation system by combining learning and evaluation as a collective task. Rural development, regarded as a learning system, needs to engage the knowledge of researchers and scientists with the on-the-ground experience and practice of farmers in order to inform and enhance policy-making decisions.
Developmental evaluators often intervene from the initiation stage of the project and design the project with stakeholders in order to create appropriate probing evaluative questions and apply situationally appropriate, rigorous evaluation methods [10,37]. DE adds a dimension to tools, methods and concepts, and captures these as innovations that drive change. The change occurs when actions have been taken to bring it about, often at the organizational level [38]. DE orients us to new perspectives and highlights the importance of adaptive change and of actions that evolve as the project progresses, while remaining faithful to the main principles and goals. Developmental measurement and evaluation practices are widely used where the environment is complex and involves multiple actors. DE is a promising approach, responding to and complementing these shortcomings in that it “guides to adaptive actions in innovative initiatives facing high uncertainty” [39].

3. Materials and Methods

The interest of this study is to find methodologies and frameworks that offer new solutions to the challenges of achieving an impactful innovation impact assessment in rural territories. The focus is on exploring how methods address the innovation process itself as it develops. Therefore, the study does not look at the result of the innovation processes, but at how impact assessment can support and optimize the innovation process. For that purpose, a two-stage systematic literature review was conducted. The first stage identifies papers addressing innovation impact assessment in the rural, agriculture and forestry areas. The second stage narrows down to papers that elaborate on participatory methods which are more context-specific, development- and process-oriented, and which seek to optimize innovation by overcoming the challenges stakeholders face in their context.

3.1. Step 1: Systematic Literature Search

The purpose of Step 1 is to collect all relevant information through a systematic internet search, snowball sampling and experts’ recommendations.

3.1.1. Data Collection

A scientific literature review was conducted in which the keywords ‘innovation impact assessment’, ‘rural impact assessment’, ‘forestry impact assessment’ and ‘agriculture impact assessment’ were searched in the titles of scientific articles published from 2000 until June 2018 in Web of Knowledge, Scopus and selected prominent journals in the evaluation field that are not limited to specific disciplines or projects. These journals are the American Journal of Evaluation; Evaluation and Program Planning; Evaluation Review: A Journal of Applied Social Research; and Evaluation: The International Journal of Theory, Research, and Practice. The search was not limited to any specific continent or region. To extend the search field, snowball sampling was carried out from two relevant articles [30,41], allowing us to collect another 39 articles; the reference lists of these two articles were treated as another pool from which to collect data, and all articles that included evaluation, innovation or impact assessment were taken for review by the authors. To widen the scope further, a call for recommendations of relevant literature was launched in August 2018 through a Horizon 2020 funded consortium on ‘optimizing innovation through multi-actor projects’ (LIAISON); researchers, university professors and NGO staff from 15 European countries nominated 175 documents, including scientific and grey literature. In total, 348 documents were reviewed by title and abstract according to the main interest of this research, which is to find methods for context-specific and process-oriented innovation impact assessment. Only scientific, mainly social science-focused articles reflecting the challenges faced by innovation processes were considered for review in this first step: that is, articles describing methods that are highly context-dependent yet deal with complexity, take the surrounding environment into account, include interaction between stakeholders and develop co-creation. The methodological approach and the methods and tools used in the resulting 74 scientific articles were identified according to the aforementioned criteria, which the authors found in the literature to be essential challenges to overcome for an impact assessment. These articles were also classified into the economic, social, environmental or sustainability impact area.
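A minimal sketch of this screening step, assuming a simple document record with title, abstract, type and year fields, could look as follows; the matches_criteria heuristic only paraphrases the criteria above and is illustrative, not the actual screening instrument used.
```python
# Minimal sketch of the Step 1 title-and-abstract screen; the record fields
# and the matches_criteria heuristic are illustrative assumptions that only
# paraphrase the criteria described in Section 3.1.1.

SEARCH_KEYWORDS = [
    "innovation impact assessment",
    "rural impact assessment",
    "forestry impact assessment",
    "agriculture impact assessment",
]

SOURCES = [
    "Web of Knowledge", "Scopus", "evaluation journals",
    "snowball sampling [30,41]", "LIAISON expert call",
]


def matches_criteria(doc: dict) -> bool:
    """Keep scientific, mainly social science-focused articles (2000-2018)
    that address context-dependent innovation processes, stakeholder
    interaction and co-creation."""
    text = (doc["title"] + " " + doc["abstract"]).lower()
    return (
        doc["type"] == "scientific article"
        and 2000 <= doc["year"] <= 2018
        and any(k in text for k in ("innovation", "evaluation", "impact assessment"))
        and any(k in text for k in ("stakeholder", "co-creation", "participat"))
    )


def screen(pooled_documents: list) -> list:
    """Reduce the pooled documents (348 in the study) to the Step 1 set."""
    return [doc for doc in pooled_documents if matches_criteria(doc)]
```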

3.1.2. Analytical Framework

An analytical framework was elaborated by adapting the European Commission (2017) classification of economic, social and environmental impacts [42] and the work of Weißhuhn et al. (2018) [43] and Tamee et al. (2018) [41]; the 74 articles were analyzed according to methodological approach, impact area and the methods used (Figure 1).
There are four categories of methodological approach, following Weißhuhn et al. (2018): conceptual, qualitative, quantitative and mixed methods. The middle column of Table 1 lists the generic methods used, such as economic valuation, case studies or conceptual framework development. The far-right column shows commonly used methodology examples for each generic method (calculation of cost-benefit analysis, interviews, workshops, literature review, etc.). Both columns are taken from Tamee et al. (2018). The conceptual approach is the development of a framework or a concept for undertaking a review. Qualitative information results from expert interviews, surveys and questionnaires undertaken to measure preferences and attitudes and to record explicit answers. Quantitative methods use numeric and measurable data, for example in econometric or cost-benefit analysis. Mixed methods are a combination of two or more of these approaches; studies that undertake participatory approaches or case studies use mixed methods in order to support qualitative data with quantitative data. Table 1 shows the methodological approaches matched to the methods and tools adopted from Tamee et al. (2018) and Weißhuhn et al. (2018).
The European Commission guideline toolbox was used as a classification framework to determine whether the articles corresponded to economic, social or environmental impacts [42]. A fourth impact area, sustainability impact, was introduced by Weißhuhn et al. (2018) [43]; it addresses all three dimensions—economic, social and ecological. The EC classification framework includes all possible impacts that can derive from a qualitative impact assessment and is, in our view, the most elaborate classification framework established to date. It lists 10 economic impacts, including trade, price volatility, competitiveness, intellectual property rights, technology and business skills; 8 social impacts, including labor market, gender equality and food security; and 8 environmental impacts, including the ecosystem approach, biodiversity and natural resource management. The fourth classification, sustainability, is not officially mentioned in the European Commission document but was defined by Weißhuhn et al. (2018) as a joint impact of the three areas. Decision making on natural resource management, for example, explores environmental impact with technical support, knowledge and social skills from stakeholders [44].
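As an illustration, the articles reviewed in Step 1 could be coded against this framework with a record such as the following minimal sketch; the class name, fields and example values are hypothetical and serve only to show how methodological approach and impact area are recorded.
```python
# Illustrative coding record for Step 1 (cf. Figure 1 and Table 1); the class
# name, fields and the example values are hypothetical.
from dataclasses import dataclass, field

APPROACHES = {"conceptual", "qualitative", "quantitative"}   # two or more = mixed
IMPACT_AREAS = {"economic", "social", "environmental", "sustainability"}


@dataclass
class ReviewedArticle:
    citation: str
    approaches: set            # subset of APPROACHES
    impact_areas: set          # one or two areas per article (cf. Table 2)
    methods: list = field(default_factory=list)   # e.g. "case study"
    tools: list = field(default_factory=list)     # e.g. "interviews", "workshops"

    @property
    def methodological_approach(self) -> str:
        """Collapse the coding into the four categories used in the analysis."""
        return "mixed" if len(self.approaches) > 1 else next(iter(self.approaches))


# Hypothetical example record, not taken from the actual dataset
example = ReviewedArticle(
    citation="Hypothetical article (2015)",
    approaches={"conceptual", "qualitative"},
    impact_areas={"social"},
    methods=["participatory framework development"],
    tools=["interviews", "workshops"],
)
assert example.methodological_approach == "mixed"
```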

3.2. Step 2: Selection of 18 Articles

This step goes into a deeper analysis of the methods and tools used in innovation impact assessment. The methods and tools used in the 74 articles were reviewed to determine whether or not they responded to the challenges of the innovation process (see Step 1). From the 74, 18 articles were selected that used participatory methods to increase co-creation. For example, Quiedeville et al. (2017) [45] used a participatory approach to look into the pathways of the project, cross-examining the impacts that the stakeholders envision. The selected articles were further analyzed in detail by looking at their co-benefits—the benefits of using a promising method or tool matching their objectives—and their main findings (outcomes). Co-benefits were defined as the characteristics and advantages of the specific methods used, whereas the main findings were defined as the outcomes of the methods used. By analyzing the main findings, the authors could see whether the method or tool corresponded to the intended results. The results of the analysis are listed in the Results section (Table 3).
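A minimal sketch of this second-stage selection, assuming a simple record that carries a participatory-method flag together with the co-benefits and main-findings fields described above, could look as follows; the field names and the filter are illustrative only.
```python
# Minimal sketch of the Step 2 selection and the two analysis fields; the
# participatory flag and field names are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Step2Record:
    citation: str
    participatory: bool   # uses participatory methods to increase co-creation
    co_benefits: str      # characteristics/advantages of the method or tool used
    main_findings: str    # outcomes obtained from the method or tool


def select_promising(step1_records: list) -> list:
    """From the Step 1 articles (74 in the study), keep those using
    participatory methods (18 in the study) for the in-depth analysis."""
    return [r for r in step1_records if r.participatory]
```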

4. Results

This section is divided according to the research questions.

4.1. RQ 1. Which Methodological Approaches and Impact Areas Have Recently Been Studied in the Literature and Which of them Are Development-Oriented and Seek Adaptation by Integrating Feedback from Stakeholders?

Table 2 shows the 74 articles analyzed according to methodological approach, impact area, and the methods and tools used. Each article may fall under one or two impact areas, for example, social and environmental; therefore, the total number of articles across impact areas does not add up to 74. For the reader’s convenience, an example of methods and tools is provided for each impact area within each methodological approach. The methods mentioned in Table 2 are specific to the context and are therefore referenced to one or two related articles. Tools, however, are more generic and can be used in two or more methods; they are therefore not referenced to a specific article.

4.1.1. Methodological Approach

Within the conceptual approach, the most studied impact area is social, followed by sustainability. As these two impacts are studied widely, the tools and methods used are mostly descriptions of the project cycle and reviews of literature aimed at finding a common framework. The conceptual approach is gaining popularity among academics and policy-makers as a way to build upon new emerging ideas. Figure 2 shows the percentage and number of the 74 articles according to methodological approach, and Figure 3 shows the percentage and number of the 41 mixed-methods articles broken down into conceptual + qualitative + quantitative, conceptual + qualitative, conceptual + quantitative and qualitative + quantitative.
The conceptual approach (29.7%) is the most prominently used method type when regarded as a sole method (Figure 3). It is the development of a framework or a concept for measuring impacts of research, including “tracking of innovation pathways or the identification of barriers and supporting factors for impact generation” [43] p. 38.
Mixed methods are a combination of conceptual, qualitative and quantitative methods (Figure 3). Findings from the conceptual method may lead to synergy effects when combined with other methods, as they can complement each other. For example, measured outcomes from quantitative analysis give profound support to exploratory phenomena; in this way, the combination of quantitative and conceptual methods presents a well-grounded idea. The most notable combination of the conceptual method is with qualitative and quantitative methods, or with both separately (53 articles), because these combinations provide deeper explanations of complex phenomena than any single method could provide on its own.

4.1.2. Impact Area

It is clear that social impact is dominant, followed by environmental and sustainability impacts (Figure 4).
Articles on social impact mainly deal with theoretical issues and conceptual development [52,53,54,55] that integrate a new concept into an already existing framework and thereby establish new indicators or provide a new perspective.
Figure 5 shows the percentage of impact areas for each methodological approach. It is evident that social impact takes up the highest portion of all methodological approaches apart from the quantitative approach. Research on social impacts uses the conceptual method most often (73%), followed by the mixed methods of conceptual + qualitative (64%) and conceptual + qualitative (43%). The sustainability area primarily uses the qualitative approach (43%), since it concentrates on all three pillars: economic, social and environmental. Interestingly, there was no attempt in this area to use the mixed-method combinations of conceptual + qualitative (0%) or conceptual + quantitative (0%). Economic impact uses the qualitative method (29%) and the mixed qualitative + quantitative method (25%). This suggests that economic impact assessments need explanations beyond numeric results.
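The kind of cross-tabulation behind such shares can be sketched as follows, assuming articles coded as ReviewedArticle records in the earlier sketch; whether shares are computed within each impact area or within each approach is a reading of Figure 5, and this version takes the share of each approach within each impact area.
```python
# Sketch of the cross-tabulation behind such shares, assuming articles coded
# as ReviewedArticle records (see the earlier sketch); this version computes
# the share of each methodological approach within each impact area.
from collections import Counter, defaultdict


def approach_shares_by_area(coded_articles) -> dict:
    """Return {impact_area: {approach: percentage}}; an article spanning two
    impact areas is counted once in each."""
    counts = defaultdict(Counter)
    for article in coded_articles:
        for area in article.impact_areas:
            counts[area][article.methodological_approach] += 1
    return {
        area: {
            approach: round(100 * n / sum(tally.values()), 1)
            for approach, n in tally.items()
        }
        for area, tally in counts.items()
    }
```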
In the social impact area, the conceptual approach comprises frameworks in which stakeholders jointly act on and design a project (for example, Participatory Action Research or the Outcome-evidencing method). The conceptual approach used in combination with the qualitative approach consists, for example, of conceptualizations that clarify and link concepts with relevant activities (for example, the concept of social innovation linking activities to evaluation practices). Additionally, quantitative data can support concepts with numerical evidence; however, none of the studies in the literature review used this combination. Studies on environmental issues such as biodiversity, land use change and other ecosystem approaches, using tools to assess protected areas or resource management, have been classified as social impact [56,57] because they deal with effects on the livelihood of the community and the community’s integration into social and environmental decision-making processes. Conceptual development is gaining increasing interest, indicating a shift in research from economic to social impacts in order to solve complex issues.
Eighteen promising articles were selected for evaluating frameworks or methods. These articles use participatory approaches that involve stakeholders’ continuous interaction during the project to undertake actions addressing the underlying cause. The methods are development-oriented, context-specific, applicable to various stages of projects, and emphasize institutional change and learning processes. A detailed analysis of the 18 articles is presented under RQ 2 and in Table 3.
Figure 6 shows the percentage and number of the 18 articles according to methodological approach. Six of them used the conceptual method alone (31.6%) and nine used a mixed-methods approach (47.5%). A total of eight articles (42.2%) used the conceptual method in combination with other approaches.
Social impact research is the most frequent among the 18 selected articles. Fourteen of the 18 articles address the societal challenges stakeholders face, including insecurity reduction, livelihood improvement in rural communities, farmers’ transitions to organic farming, institutional learning, community development in terms of community resilience, improvement of the food value chain through gender integration, and the socio-economic and environmental impact of alternatives to current agricultural systems. Considering this, along with the result that the conceptual method, either alone or in combination with the qualitative approach, is dominant, this supports our finding that most social impact research uses the conceptual approach.
Figure 7 visualizes the research process. The challenges of innovation and the challenges of impact assessment are defined separately, since the main aims of the two disciplines are different. Through the literature review, methods and tools were identified that correspond to the challenges of innovation and of impact assessment that projects faced. However, challenges such as adaptation and systems thinking remain to be solved. It is the authors’ argument that DE, as an umbrella approach, can guide these methods and tools toward more sustainable impact assessment activities. This argument and the research logic are explained throughout the remainder of the paper.
In the following sections, RQ 2 to RQ 4 address the challenges innovation processes face and how methods and tools in impact assessment try to solve them. Challenges that remain to be solved are addressed in RQ 5. Lastly, in the Discussion and Conclusions section, RQ 6 elaborates on how the DE approach can support overcoming the remaining challenges.

4.2. RQ 2. What Challenges Are Faced in Innovation Processes in the Rural Area?

Many of the concerns faced in rural territories relate to identifying the underlying problem, including co-creation and design, that is, the active involvement of stakeholders. Careful planning of the project through the active interaction and participation of all those who want to bring about change is also a major concern.
All projects reviewed in this study were intended to solve challenges using exploratory methods. The objective was not only to achieve the goal itself, but to embrace the context and react immediately to feedback and suggestions, which allowed the projects to develop their own concepts and elaborate on them. Fourteen of these studies were on social impacts (77.8%), one on economic impacts (5.6%), one on sustainability (5.6%) and two on mixed impacts (11.1%).

4.2.1. Identifying a Problem and Being Committed to Change

Innovation can be hindered by its high dependency on aspects such as the limited availability of technology and infrastructure, as well as on the joint commitment of stakeholders to work on innovation in rural areas as a way to ensure adaptive action through learning. In addition, both technological and social innovation bring about positive change and social well-being [58]. Social change is realized through stakeholders or by those supporting innovation by bringing in action from researchers, advisers and evaluators. In order to secure stakeholders’ commitment, it is essential first to motivate them to participate. The participatory action of stakeholders and the learning processes in innovation bring better social outcomes [46,49]. This can be achieved by actively involving and winning the commitment of scientists and farmers, which then creates a community-oriented environment [46,51,59]. However, the adoption of innovation—and of innovative solutions—has been slow because of the difficulty of identifying the underlying cause of the problem and taking proper actions accordingly, while not forgetting the needs of stakeholders [9] and making needed changes throughout the process. Bringing people together in the innovation process, both those directly and indirectly engaged, is one of the main obstacles to moving forward in the co-creation process. In order to motivate those who are not directly engaged in innovative thinking to adopt innovative solutions, a common need must be identified in advance. Although this can occur through participation, in reality a consensus on the fundamental need to innovate among the farmers in a territory is not reached most of the time [59].

4.2.2. Co-Creating and Being Context-Dependent

When a common problem is identified, the next step is to find approaches and methods that can support responding to the identified problem. All interventions occurring from the problem-identification stage to the problem-solved stage create social change with intended or unintended social consequences [8,46]. The impact of the change will be perceived by stakeholders or local communities [60]; therefore, their involvement from the beginning and during the design process is essential to identify the underlying causes of the problem and undertake actions appropriate to the context [9,16,45]. Collaborative research between scientists and farmers is crucial in the co-creation and design stage, where a context-specific research method is developed. If different learning contexts are fulfilled by both scientists and practitioners, and interaction is facilitated, social learning is fostered [45], resulting in something beyond the original intentions.

4.2.3. Involving Stakeholders throughout the Process Through Interaction

The review shows that studies support the idea that exploratory approaches better expose the explicit answers obtained from key informants. Explicit information coming from active citizen participation leads to an enhanced decision-making process that improves final performance or decisions [61,62,63]. Qualitative research methods collect feedback that helps modify projects. If this feedback is coherent, credible and multi-dimensional, the different opinions and thoughts become interconnected and causal relations become clearer, informing researchers and practitioners about the resources needed to solve the problem [8]. Researchers use customized tools to improve plan preparation and monitoring at multiple stages throughout the project [14,19,64]. The process within a project is another critical variable to consider when making any further planning or assessments. A more systematic and structured process helps enhance critical reflection and assess the effectiveness of the developmental process along a value chain that considers changing conditions [65]. Critical to involvement are the participatory actions along the pathway, where stakeholders provide continuous feedback and interactions within their networks and promote the development of innovative capacities and community empowerment [49,53].
Rural innovation needs to incorporate rural challenges that are seen as complex. These challenges need to be viewed with holistic approaches [66,67] that are non-linear and in which people take advantage of new technology and undergo change [66]. Innovation is very context-dependent. As Preskill and Gopal (2014) [13] p. 14 put it, “Pay particular attention to context and be responsive to changes as they occur.” Innovation often begins with problem identification and commitment to change [68], where the commitment must be made by all stakeholders involved even before the strategy implementation stage. This is the co-creation that involves and includes all interests and needs of stakeholders and is well reflected in the problem identification stage.
A detailed account of the literature referring to the challenges of the innovation process in rural territories is given in the right-hand column of Table 3. The co-benefits column corresponds to how well the method or tool responded to the intended results and main findings. The table also indicates how the relevant methods and tools acknowledge and embrace the challenges faced in the rural area.

4.3. RQ 3. Which of These Challenges Are Supported by Impact Assessment?

This section elaborates on how impact assessment can support overcoming the challenges of the innovation process in a rural territory. Impact assessment responds to many of the challenges mentioned above and, with time, has evolved as an evaluation practice that assesses the process along with the development of the project.

4.3.1. Responding to the Needs of Stakeholders and Their Context

Current impact assessment practices address the interwoven economic, social and environmental dimensions by responding to the questions of ‘how’ and ‘in what context’ projects are working [9,33,53,70] and ‘for whom’ things have occurred. In doing so, they respond to stakeholders’ needs in dynamic contexts. To understand the context and ‘how’ and ‘why’ something has happened within it, it is essential to follow the process. The process integrates the human factor that realizes development and growth by building on citizens’ capacity and sense of ownership to take changes and developments into their own hands. A participative process can achieve collective learning and communication [71,72], which can build social capital; social capital is associated with trust and networks within the region and is most likely to result in tangible improvements [73].

4.3.2. Thinking in Terms of Complexity

As a key tool for accountability, learning and improvement, the demand for innovation impact assessment has increased significantly, in parallel with broader impact assessment practices, since the beginning of the 2000s. Programs and projects seek innovative solutions to problems that are complex and multi-dimensional, and the key components of this complexity are not well known in advance [33]. The complexities of innovation processes therefore need to be acknowledged, analyzed and integrated. Practitioners and researchers are in search of innovation that is multi-faceted and often a socio-cultural process of learning and interaction between stakeholders and social practices (such as perceptions, meanings, experience, competences, purposes and values) [74,75]. To gain a multi-actor perspective and a multi-disciplinary view, all interventions and actions happening within the project or program need to be viewed as a process.

4.4. RQ 4. What Are the Current Challenges of Impact Assessment?

Since the 2000s, broader impact assessment practice has evolved toward analyzing impact processes, including responding to the real needs of stakeholders in each context. To involve stakeholders, the co-creation and design stage is indispensable in order to arrive at the desired impact. This leads to building capacity and eventually to empowering individuals, organizations and/or institutions [45,50,53,62,76,77] to feed their thinking into the problem identification stage.

4.4.1. Co-creation and Design in Order to Respond to the Context

Stakeholder commitment is crucial in innovation settings to promote learning processes that bring about better social outcomes [38,41,50,78]. Specific tools are needed to satisfy the needs of stakeholders, identify the underlying cause of the problem and embed these in the stakeholders’ context. Numerous findings point out that the social and ecological impacts of the underlying cause must be addressed, not only the economic impact, in order to explore the social issues. This can be done by using exploratory analysis to answer the expectations of stakeholders explicitly [8,9,43,49,62,68,78,79]. These methods pinpoint the social demands of stakeholders by continuously considering their specific needs, ideas and thinking as the project evolves.

4.4.2. Developmental Process

With time, evaluating the process of projects has become more important than measuring project impacts at a specific point in time. The continuously changing environment demands that project resources be reallocated when necessary, so that projects can assess and interpret changes according to context when a sudden event occurs. The progress of projects and the stronger commitment of stakeholders at each stage of the project cycle have become even more important when identifying specific needs and resources that emphasize co-creation [10,33].

4.4.3. Timely Feedback

Timely feedback from stakeholders, and also from the context of the project, provides informative and up-to-date evidence that may be essential for adequate management of projects. Obtaining timely feedback from stakeholders and reflecting it in the projects is still not well practiced, yet it is essential for understanding the mechanisms that give rise to change. Together with feedback integration, being aware of and acknowledging the changing environment, and adapting plans, events and activities accordingly, are remaining challenges that still need to be unraveled.

4.5. RQ 5. What Tools and Methods Are There to Solve the Challenges of Innovation Impact Assessment and Which of the Challenges Remain to Be Solved?

The 18 analyzed articles show that ‘social’ is indeed the keyword when it comes to innovation. The emphasis is on the context and on the development process of the project, with the outcome of the project serving a specific purpose and directed toward improvement. Participatory Impact Pathway Analysis (PIPA), Social Network Analysis (SNA), Outcome Harvesting, Participatory Action Research (PAR), the Outcome-evidencing method, Development-oriented Analysis (DOA), Social Impact Assessment, the Regulatory Impact Assessment method and customized monitoring and evaluation frameworks have been identified as promising evaluation methods that address the importance of the ongoing development of the project process. Many of the articles (57% of the 18) apply participatory approaches, which have led their projects and programs to innovation. According to these articles, participatory action and continuous improvement are required by all those who want to innovate [9,45,49,52,53].
The reflective and participatory methods and tools mentioned above respond to most of the challenges identified in the innovation process and in impact assessment. However, they do not respond to two essential points: adapting to changing situations by reflecting on timely feedback, and switching to systems thinking. These are the two aspects current evaluation practices need to consider. Identifying, coping with and responding to the surrounding environment remain a challenge to both innovation and impact assessment, which implies that adaptation is crucial in present-day evaluation practices. Nowadays, multiple events occur abruptly and rapidly, demanding that stakeholders and evaluators respond accordingly.

5. Discussion

The study finds that social impact studies using mixed-method approaches are indispensable in research and practice when facing complex issues. The results outline specific methods for innovation and impact assessment; however, they do not show how to untangle challenges in diverse and continuously changing circumstances. To further improve the impact assessment of innovation, researchers, practitioners and evaluators need to concentrate on how they will work and co-create the design of the project so that stakeholders understand the underlying problems and come to an agreement about what needs to be solved and what opportunities there are to improve the current situation [41,45,50,62,79,80,81,82]. This study proposes Developmental Evaluation as an evaluation approach to complement the methods found. The Discussion section explains the methods and tools found in the literature and how the principles of DE can improve them. From the evaluation point of view, the methods need to be embedded under an evaluation approach for the purpose of optimizing innovation impact assessment. Given the findings from the literature review, it can be inferred that the DE approach can bring adaptive and innovative solutions to the methods mentioned. Since DE is a “methodologically agnostic” evaluation approach [83], this thought experiment is theoretically valid and can bring meaningful insights into how these promising methods can be better tailored to innovation processes.

5.1. Developmental Evaluation Principles

Developmental Evaluation differs from traditional impact assessment practices in that it looks at the participative process of a project or program rather than focusing only on goal attainment. Stakeholders’ participation in this co-creation process from the beginning has a clear purpose: to feed the projects with their thoughts and opinions, supported by timely and meaningful evaluative thinking, so as to improve the projects and make them useful to the stakeholders in a changing and complex environment. Diwakar et al. (2008) came up with a monitoring plan to make the developmental process transparent to all those involved; this helps impact assessment not only at various stages of the project but also at the preparation stage.
The essential DE principles are adaptation, process orientation, multiple pathways and timely feedback. Section 5.2 addresses how the DE principles can improve the methods and tools to respond to innovation.
Fidelity to the DE principles is the key to the successful application and utility of this approach [84]. During the process, stakeholder feedback makes it possible for activities to adapt to new situations and circumstances. Schramm et al. (2016), in their impact assessment tool, relied on a ‘voice of customer’ strategy, which gave them explicit answers that provided valuable input data. Ongoing feedback is essential in DE, and many examples in our literature review have shown that feedback integration bears good results [8,9,41,65]. Adaptation makes the project responsive to emerging events and changing environments, and reduces the risks of unknown occurrences or unforeseeable negative side effects. In parallel with responding to emergent events, integrating stakeholders’ timely feedback along the project process is just as crucial. Timely feedback can provide valuable information that might have been overlooked at the initial stage of the project; such information may shape the interaction among resources and maintain a smooth environment in which obstacles are identified and acted upon before they hinder the project’s success. Being aware of the changing environment while listening to stakeholders’ feedback requires an ongoing and constructive way of thinking about the issues and changing conditions in a systemic way. Systemic thinking is presented as a way out of the complex situation.

5.2. RQ 6. How Can the Evaluation Approach of Developmental Evaluation Improve the Tools and Methods to Respond to Innovation?

This section explains how each method functions and interprets it under the principles of DE. Each method addresses one or two principles, but no method addresses all of them. This is an important finding, because the idea of DE is that the principles work together, like interlocking cogs, so that they complement each other. Thus, DE cannot be implemented without addressing all the principles; DE principles are a set, not a menu.
Participatory Impact Pathway Analysis (PIPA) serves as a tool that encourages reflection, learning and adjustments as the project develops. It utilizes participatory approaches to determine the activities within the projects. The outputs, outcomes and impacts are then linked to the activities, which together form a map of the impact pathways. By elaborating on these impact pathways, one can identify how stakeholders’ capacity has developed before and after the project. PIPA is development-oriented, but also impact-oriented, in that stakeholders make assumptions about how the project will achieve the desired impact. Under the DE approach, PIPA should pay more attention to some of the principles, such as adapting the tools to the current situation, and should not take the impact pathways as fixed.
Like PIPA, Participatory Action Research (PAR) demands a high level of participation, which strengthens communication between stakeholders and focuses them on the process. PAR relies on collaborative research in which scientists and farmers together define the method best suited to the project, which can then be adjusted to particular circumstances. In this sense, it is adaptive, since adjustments can be made by the actors involved during the design process. Both PIPA and PAR involve a high level of stakeholder participation, with stakeholders learning from successes and failures throughout the process.
Social Network Analysis (SNA) provides multiple pathways in that it visualizes a network and the relationships of stakeholders to a set of events. It describes the interdependencies and interactions between actors or events and how change occurs and thus affects outcomes. However, the evolutionary process of these relationships is not addressed, so change and learning are not captured and SNA is therefore weak in adaptiveness. Under the DE approach, learning can be captured while change happens throughout the process of the project. The learning must be taken in by stakeholders and evaluators so that they can adapt to new situations.
Unlike SNA, Outcome Harvesting (OH) is a more reflective evaluation approach and serves “to plan to adapt to continual change and take into account unexpected results” [10]. It does not work toward a predetermined outcome. Instead, it collects evidence of what has already been achieved and then works backwards, looking at how the project has contributed to the desired change. It is accompanied by outcome mapping, which lays out the interaction among the components (or stakeholders), thus revealing the value-added relationships among its members and the contribution each one makes to the work of the others. The Outcome Harvesting method is strong in adaptiveness, but weak in process-oriented assessment. Since it works backwards to collect evidence of what has worked and how, it takes into account past processes and past learning, which may still be valuable data for future reference.
The Outcome-evidencing method, as one application of systems thinking, is a promising impact evaluation approach that addresses the complexity of interventions and systems. It identifies outcomes and has the capacity to provide ongoing feedback through repeated cycles [9], which allows for adapting and changing during the process and for making causal claims and challenges to the overarching project. It also allows for continuous learning.
Matching the principles of DE to the pros and cons of each method allows us to see how DE can support the methods to enhance innovation. Through this matching, we can see which elements each method succeeds in addressing and which it falls short on. The multi-dimensional approach of DE requires systems thinking, which blends all of the above-mentioned principles together. DE enables systems thinking in that it acknowledges the systemic nature of the interventions being assessed; their deep and entangled relations with the context and among diverse subsystems; and the different ways in which these interventions can be understood, given that complex behaviors emerge from and shape these systems. The development process is facilitated by rapid response and feedback to emergent situations and by the use of different methods and tools suited to the situation.
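The matching described in this section can be summarized compactly; the sketch below encodes one reading of the paragraphs above, with paraphrased principle labels rather than an official DE taxonomy.
```python
# Compact summary of the matching described in Section 5.2; the principle
# labels are paraphrases of the paragraphs above, not an official DE taxonomy.
METHOD_VS_DE_PRINCIPLES = {
    "PIPA": {"strong": ["development orientation", "stakeholder participation"],
             "weak":   ["adaptation (impact pathways taken as fixed)"]},
    "PAR":  {"strong": ["stakeholder participation", "adaptation during design"],
             "weak":   []},
    "SNA":  {"strong": ["multiple pathways (network view)"],
             "weak":   ["adaptation (evolution of relations not captured)"]},
    "Outcome Harvesting": {"strong": ["adaptation to continual change"],
                           "weak":   ["process-oriented assessment"]},
    "Outcome-evidencing": {"strong": ["timely feedback through repeated cycles",
                                      "continuous learning"],
                           "weak":   []},
}

if __name__ == "__main__":
    # No single method covers all of the essential principles listed in
    # Section 5.1 (adaptation, process orientation, multiple pathways,
    # timely feedback), which is the point made at the start of this section.
    for method, profile in METHOD_VS_DE_PRINCIPLES.items():
        print(f"{method}: strong in {profile['strong']}; weak in {profile['weak']}")
```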
Being aware of complex issues such as the surrounding and changing environment and stakeholders’ differing needs, and responding to their feedback, is time-consuming and may not be fulfilled as planned owing to the limited time and commitment of stakeholders and evaluators. In many cases, a common agreement on why evaluation is needed during the process may not be reached.

6. Conclusions

The social impact area has been studied the most (44.6% of the articles; Figure 4). This finding reflects the view of many researchers and practitioners that social issues must be incorporated into evaluation [5,37,39] using mixed-method approaches (Table 2). Conceptual, qualitative or quantitative approaches alone cannot solve social issues, because these involve more complex elements, such as networks and differing interests. Taking into account the human factor, the social factor, and building on innovation from this perspective calls for social innovation, that is, a means of development and growth that builds on the capacity of citizens and enterprises to undergo change [72,85]. Innovation has been conceptualized as a promising driver to create and foster diverse livelihoods that bring sustainability, inclusiveness and resilient growth [11,12]. Innovation brings not only increased productivity or employment, but also increased social capital, local food systems and access to international markets, with benefits beyond monetary profitability. The innovation process adapts and responds to the complex environment where it unfolds, fostering co-creation; engaging diverse perspectives; learning from and adapting to local and global contexts; defining processes; and learning from implementation on an ongoing basis.
The complex environment embeds the main challenges that need to be answered for optimizing innovation, and it thus challenges any evaluation—and hence any impact assessment—that must face it. The central issue is that the target being assessed—the innovation process—should be evaluated as the process evolves. The DE approach allows adaptation to a changing environment as a mechanism to improve and optimize positive outcomes. Evaluators and stakeholders need to pay attention to the surrounding environment influencing the project by understanding the local context and integrating local knowledge and human resources. The surrounding environment is most often diverse and dynamic in terms of inconsistency and unpredictability. As systems thinking is mainstreamed and complexity awareness increases, measuring impact has become a multidisciplinary issue that challenges stakeholders, managers, researchers, policy-makers and evaluators to go beyond their disciplines and reach out for new solutions.
The arguments in the Discussion section show that methods such as PIPA, SNA, OH, PAR and the outcome-evidencing method are among the few that, with the support of the DE approach, can take up the challenges that evaluators and stakeholders face in rural territories. DE can enhance the usefulness of these methods. For example, projects and programs should not be assessed only on how well their results and outcomes attain expected goals; more attention should be paid to the process, in which activities are being developed and continuous adaptation to new circumstances is demanded. Continued research should explore evaluation approaches that enhance methods and tools by giving them a new logic and purpose. As a next step, further research is recommended on the implementation of the evaluation methods identified in this review under a Developmental Evaluation approach, in order to critically assess their performance under such an approach.
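As a purely illustrative sketch, and not part of any reviewed study, the snippet below shows how one of the methods named above, Social Network Analysis, can be operationalized with the Python networkx library on an invented set of actors and ties; centrality scores of this kind are the sort of evidence SNA contributes when identifying well-connected actors and potential brokers within a DE process.
```python
# Minimal, hypothetical sketch of a Social Network Analysis (SNA) step:
# identifying well-connected actors and potential brokers in an innovation network.
# Actor names and ties are invented for illustration; real applications would build
# the network from interview, survey or workshop data.
import networkx as nx

ties = [
    ("farmer_coop", "advisor"), ("advisor", "research_inst"),
    ("farmer_coop", "local_gov"), ("local_gov", "research_inst"),
    ("advisor", "retailer"), ("farmer_coop", "retailer"),
]
G = nx.Graph(ties)

# Degree centrality: how many direct ties an actor has (normalized).
# Betweenness centrality: how often an actor lies on shortest paths between others,
# a common proxy for brokerage between otherwise weakly connected groups.
degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)

for actor in sorted(G.nodes):
    print(f"{actor:15s} degree={degree[actor]:.2f}  betweenness={betweenness[actor]:.2f}")
```
In a developmental setting, such scores would be revisited as the network evolves and fed back to stakeholders rather than reported once at the end of the project.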

Author Contributions

Conceptualization, J.M.D.-P., S.Y.L. and P.V.; methodology, J.M.D.-P. and S.Y.L.; formal analysis, S.Y.L.; investigation, J.M.D.-P., S.Y.L. and P.V.; resources, S.Y.L.; writing—original draft preparation, S.Y.L.; writing—review and editing, J.M.D.-P., P.V. and S.Y.L.; visualization, S.Y.L.; supervision, J.M.D.-P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received funding from the EU Horizon 2020 project LIAISON (Linking Actors, Instrument and Policies through Networks), no. 773418.

Acknowledgments

The authors hereby acknowledge that a part of the data used in this paper originates from the archives in the EU Horizon 2020 project LIAISON: Linking Actors, Instrument and Policies through Networks; no. 773418.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Lyle, G. Understanding the nested, multi-scale, spatial and hierarchical nature of future climate change adaptation decision making in agricultural regions: A narrative literature review. J. Rural Stud. 2015, 37, 38–49. [Google Scholar] [CrossRef]
  2. Henzler, K.; Maier, S.D.; Jager, M.; Horn, R. SDG-Based Sustainability Assessment Methodology for Innovations in the Field of Urban Surfaces. Sustainability 2020, 12, 4466. [Google Scholar] [CrossRef]
  3. Singh, C.; Dorward, P.; Osbahr, H. Developing a holistic approach to the analysis of farmer decision-making: Implications for adaptation policy and practice in developing countries. Land Use Policy 2016, 59, 329–343. [Google Scholar] [CrossRef]
  4. Bopp, C.; Engler, A.; Poortvliet, P.M.; Jara-Rojas, R. The role of farmers´ intrinsic motivation in the effectiveness of policy incentives to promote sustainable agricultural practices. J. Environ. Manag. 2019, 244, 320–327. [Google Scholar] [CrossRef] [PubMed]
  5. Janker, J.; Mann, S.; Rist, S. Social sustainability in agriculture—A system-based framework. J. Rural Stud. 2019, 65, 32–42. [Google Scholar] [CrossRef]
  6. Lora, A.V.; Nel-lo Andreu, M.G. Alternative Metrics for Assessing the Social Impact of Tourism Research. Sustainability 2020, 12, 4299. [Google Scholar] [CrossRef]
  7. Wang, J.; Maier, S.D.; Horn, R.; Holländer, R.; Aschemann, R. Development of an Ex-Ante Sustainability Assessment Methodology for Municipal Solid Waste Management Innovations. Sustainability 2018, 10, 3208. [Google Scholar] [CrossRef] [Green Version]
  8. Vanclay, F. The Potential Application of Qualitative Evaluation Methods in European Regional Development: Reflections on the Use of Performance Story Reporting in Australian Natural Resource Management. Reg. Stud. 2015, 49, 1326–1339. [Google Scholar] [CrossRef]
  9. Paz-Ybarnegaray, R.; Douthwaite, B. Outcome Evidencing: A Method for Enabling and Evaluating Program Intervention in Complex Systems. Am. J. Eval. 2017, 38, 275–293. [Google Scholar] [CrossRef]
  10. Patton, M.Q.; McKegg, K.; Wehipeihana, N. Developmental Evaluation Exemplars: Principles in Practice; Guilford Press: New York, NY, USA, 2016; pp. 234–251. [Google Scholar]
  11. Naldi, L.; Nilsson, P.; Westlund, H.; Wixe, S. What is smart rural development? J. Rural Stud. 2015, 40, 90–101. [Google Scholar] [CrossRef]
  12. Organization for Economic Cooperation and Development (OECD). Agricultural Innovation Systems: A Framework for Analysing the Role of the Government; OECD Publishing: Paris, France, 2013. [Google Scholar]
  13. Preskill, H.; Gopal, S. Evaluating Complexity. Propositions for Improving Practice. 2014. Available online: http://www.fsg.org/publications/evaluating-complexity (accessed on 29 January 2019).
  14. Vilys, M.; Jakubavicius, A.; Zemaitis, E. Public Innovation Support Index for Impact Assessment in the European Economic Area. Entrep. Bus. Econ. Rev. 2015, 3, 123–138. [Google Scholar] [CrossRef]
  15. Barrueto, A.K.; Merz, J.; Kohler, T.; Hammer, T. What prompts agricultural innovation in rural Nepal: A Study Using the Example of Macadamia and Walnut Trees as Novel Cash Crops. Agriculture 2018, 8, 21. [Google Scholar] [CrossRef] [Green Version]
  16. Hall, A.; Sulaimanb, V.R.; Clark, N.; Yogananda, B. From measuring impact to learning institutional lessons: An innovation systems perspective on improving the management of international agricultural research. Agric. Syst. 2003, 78, 213–241. [Google Scholar] [CrossRef] [Green Version]
  17. Akpoko, J.G.; Kudi, T.M. Impact assessment of university-based rural youths Agricultural Extension Out-Reach Program in selected villages of Kaduna-State, Nigeria. J. Appl. Sci. 2007, 7, 3292–3296. [Google Scholar]
  18. Del Rio, M.; Hargrove, W.L.; Tomaka, J.; Korc, M. Transportation Matters: A Health Impact Assessment in Rural New Mexico. Int. J. Environ. Res. Public Health 2017, 14, 629. [Google Scholar] [CrossRef] [Green Version]
  19. Diwakar, P.G.; Ranganath, B.K.; Gowrisankar, D.; Jayaraman, V. Empowering the rural poor through EO products and services—An impact assessment. Acta Astronaut. 2008, 63, 1–4. [Google Scholar] [CrossRef]
  20. Michelsen, O.; Cherubini, F.; Stromman, A.H. Impact Assessment of Biodiversity and Carbon Pools from Land Use and Land Use Changes in Life Cycle Assessment, Exemplified with Forestry Operations in Norway. J. Ind. Ecol. 2012, 16, 231–242. [Google Scholar] [CrossRef]
  21. Mutuc, M.E.M.; Rejesus, R.M.; Pan, S.; Yorobe, J.M., Jr. Impact Assessment of Bt Corn Adoption in the Philippines. J. Agric. Appl. Econ. 2012, 44, 117–135. [Google Scholar] [CrossRef] [Green Version]
  22. Dargan, L.; Shucksmith, M. LEADER and innovation. Sociol. Rural. 2008, 48, 274–291. [Google Scholar] [CrossRef]
  23. Dax, T.; Strahl, W.; Kirwan, J.; Maye, D. The Leader programme 2007–2013: Enabling or disabling social innovation and neo-endogenous development? Insights from Austria and Ireland. Eur. Urban Reg. Stud. 2016, 23, 56–68. [Google Scholar] [CrossRef]
  24. Dax, T.; Oedl-Wieser, T. Rural innovation activities as a means for changing development perspectives—An assessment of more than two decades of promoting LEADER initiatives across the European Union. Stud. Agric. Econ. 2016, 118, 30–37. [Google Scholar] [CrossRef] [Green Version]
  25. Bonfiglio, A.; Camaioni, B.; Coderoni, S.; Esposti, R.; Pagliacci, F.; Sotte, F. Are rural regions prioritizing knowledge transfer and innovation? Evidence from Rural Development Policy expenditure across the EU space. J. Rural Stud. 2017, 53, 78–87. [Google Scholar] [CrossRef]
  26. Turner, J.A.; Klerkx, L.; White, T.; Nelson, T.; Everett-Hincks, J.; Mackay, A.; Botha, N. Unpacking systemic innovation capacity as strategic ambidexterity: How projects dynamically configure capabilities for agricultural innovation. Land Use Policy 2017, 68, 503–523. [Google Scholar] [CrossRef]
  27. Giannakis, E.; Bruggeman, A. The highly variable economic performance of European agriculture. Land Use Policy 2015, 45, 26–35. [Google Scholar] [CrossRef]
  28. Cofré-Bravo, G.; Klerkx, L.; Engler, A. Combinations of bonding, bridging, and linking social capital for farm innovation: How farmers configure different support networks. J. Rural Stud. 2019, 69, 53–64. [Google Scholar] [CrossRef]
  29. Eichler, G.M.; Schwarz, E.J. What Sustainable Development Goals do Social Innovations Address? A Systematic Review and Content Analysis of Social Innovation Literature. Sustainability 2019, 11, 522. [Google Scholar] [CrossRef] [Green Version]
  30. Barrientos-Fuentes, J.C.; Berg, E. Impact assessment of agricultural innovations: A review. Agron. Colomb. 2013, 31, 120–130. [Google Scholar]
  31. Mackay, R.; Horton, D. Expanding the use of impact assessment and evaluation in agricultural research and development. Agric. Syst. 2003, 78, 143–165. [Google Scholar] [CrossRef]
  32. Moschitz, H.; Home, R. The challenges of innovation for sustainable agriculture and rural development: Integrating local actions into European policies with the Reflective Learning Methodology. Action Res. 2014, 12, 392–409. [Google Scholar] [CrossRef]
  33. Patton, M.Q. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use; Guilford Press: New York, NY, USA, 2010. [Google Scholar]
  34. Scriven, M. The methodology of evaluation. In Perspectives of Curriculum Evaluation; Tyler, R.W., Gagne, R.M., Scriven, M., Eds.; Rand McNally: Chicago, IL, USA, 1967; pp. 39–83. [Google Scholar]
  35. Patton, M.Q. Utilization-Focused Evaluation; SAGE Publishing: Saint Paul, MN, USA, 2008. [Google Scholar]
  36. United States Agency for International Development (USAID). Evaluation: Learning from Experience. USAID Evaluation Policy. 2011. Available online: www.usaid.gov/sites/default/files/documents/1868/USAIDEvaluationPolicy.pdf (accessed on 2 February 2020).
  37. Allen, S.; Hunsicer, D.; Kjaer, M.; Krimmel, R.; Plotkin, G.; Skeith, K. Adapted Developmental Evaluation with USAID´s People to People Reconciliation Fund Program. In Developmental Evaluation Exemplars: Principles in Practice; Patton, M., McKegg, K., Wehipeihana, N., Eds.; Guilford Press: New York, NY, USA, 2015; pp. 216–233. [Google Scholar]
  38. Van Assche, K.; Beunen, R.; Holm, J.; Lo, M. Social learning and innovation. Ice fishing communities on Lake Mille Lacs. Land Use Policy 2013, 34, 233–242. [Google Scholar] [CrossRef]
  39. Gopal, S.; Mack, K.; Kutzli, C. Using Developmental Evaluation to Support College Access and Success. Challenge Scholars. In Developmental Evaluation Exemplars: Principles in Practice; Guilford Publications: New York, NY, USA, 2015. [Google Scholar]
  40. Shea, J.; Taylor, T. Using developmental evaluation as a system of organizational learning. Eval. Program Plan. 2017, 65, 83–93. [Google Scholar] [CrossRef] [PubMed]
  41. Imperiale, A.J.; Vanclay, F. Using Social Impact Assessment to Strengthen Community Resilience in Sustainable Rural Development in Mountain Areas. Mt. Res. Dev. 2016, 36, 431–442. [Google Scholar] [CrossRef] [Green Version]
  42. Tamee, R.A.; Crootof, A.; Scott, C.A. The Water-Energy-Food Nexus: A systematic review of methods for nexus assessment. Environ. Res. Lett. 2018, 13, 4. [Google Scholar]
  43. European Commission. Better Regulation “Toolbox”; European Commission: Brussels, Belgium, 2017. [Google Scholar]
  44. Cong, R.G.; Stefaniak, I.; Madsen, B.; Dalgaard, T.; Jensen, J.D.; Nainggolan, D.; Termansen, M. Where to implement local biotech innovations? A framework for multi-scale socio-economic and environmental impact assessment of Green Bio-Refineries. Land Use Policy 2017, 68, 141–151. [Google Scholar] [CrossRef]
  45. Quiedeville, S.; Barjolle, D.; Mouret, J.C.; Stolze, M. Ex-post evaluation of the impacts of the science-based research and innovation program: A new method applied in the case of farmers’ transition to organic production in the Camargue. J. Innov. Econ. Manag. 2017, 22, 145–170. [Google Scholar] [CrossRef]
  46. Spaapen, J.; van Drooge, L. Introducing ‘productive interactions’ in social impact assessment. Res. Eval. 2011, 20, 211–218. [Google Scholar] [CrossRef] [Green Version]
  47. De Francesco, F.; Radaelli, C.M.; Troeger, V.E. Implementing regulatory innovations in Europe: The case of impact assessment. J. Eur. Public Policy 2012, 19, 491–511. [Google Scholar] [CrossRef]
  48. Tecco, N.; Baudino, C.; Girgenti, V.; Peano, C. Innovation strategies in a fruit growers association impacts assessment by using combined LCA and s-LCA methodologies. Sci. Total Environ. 2016, 568, 253–262. [Google Scholar] [CrossRef]
  49. Graef, F.; Hernandez, L.E.A.; König, H.J.; Uckert, G.; Mnimbo, M.T. Systemising gender integration with rural stakeholders’ sustainability impact assessments: A case study with three low-input upgrading strategies. Environ. Impact Assess. Rev. 2018, 68, 81–89. [Google Scholar] [CrossRef]
  50. Pachón-Ariza, F.A.; Bokelmann, W.; Ramírez, C. Participatory Impact Assessment of Public Policies on Rural Development in Colombia and Mexico. Cuad. Desarro. Rural 2016, 13, 143–182. [Google Scholar] [CrossRef]
  51. Kumar, R.; Sekar, I.; Punera, B.; Yogi, V.; Bharadwaj, S. Impact Assessment of Decentralized Rainwater Harvesting on Agriculture: A Case Study of Farm Ponds in Semi-arid Areas of Rajasthan. Indian J. Econ. Dev. 2016, 12, 25–31. [Google Scholar] [CrossRef]
  52. Cristiano, S.; Proietti, P. Evaluating interactive innovation processes: Towards a developmental-oriented analytical framework. In Proceedings of the 13th European IFSA Symposium on Integrating Science, technology, policy and practice, Chania, Greece, 1–5 July 2018. [Google Scholar]
  53. Douthwaite, B.; Kubyb, T.; Van de Fliert, E.; Schulz, S. Impact pathway evaluation: An approach for achieving and attributing impact in complex systems. Agric. Syst. 2003, 78, 243–265. [Google Scholar] [CrossRef]
  54. Galan-Diaz, C.; Edwards, P.; Nelson, J.D.; Van der Wal, R. Digital innovation through partnership between nature conservation organisations and academia: A qualitative impact assessment. Ambio 2015, 44, 538–549. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  55. Momtaz, S. Institutionalizing social impact assessment in Bangladesh resource management: Limitations and opportunities. Environ. Impact Assess. Rev. 2005, 25, 33–45. [Google Scholar] [CrossRef]
  56. Wu, J.; Chang, I.S.; Lam, K.C.; Shi, M. Integration of environmental impact assessment into decision-making process: Practice of urban and rural planning in China. J. Clean. Prod. 2014, 69, 100–108. [Google Scholar] [CrossRef]
  57. Stephan, U.; Patterson, M.; Kelly, C.; Mair, J. Organizations Driving Positive Social Change: A review and integrative framework of change processes. J. Manag. 2016, 42, 1250–1281. [Google Scholar] [CrossRef] [Green Version]
  58. Swagemakers, P.; LIAISON Workshop-Madrid-Mediterranean Macro-Region Workshop, Madrid, Spain. Personal communication, 2018.
  59. Vanclay, F. The potential application of social impact assessment in integrated coastal zone management. Ocean Coast. Manag. 2012, 68, 149–156. [Google Scholar] [CrossRef]
  60. Maredia, M.K.; Shankar, B.; Kelley, T.G.; Stevenson, J.R. Impact assessment of agricultural research, institutional innovation, and technology adoption: Introduction to the special section. Food Policy. 2014, 44, 214–217. [Google Scholar] [CrossRef]
  61. Gamble, J.A.A. A Developmental Evaluation Primer. Canada: The J.W. McConnell Family Foundation. Available online: http://tamarackcommunity.ca/downloads/vc/Developmental_Evaluation_Primer.pdf (accessed on 2 February 2020).
  62. Copestake, J. Credible impact evaluation in complex contexts: Confirmatory and exploratory approaches. Evaluation 2014, 20, 412–427. [Google Scholar] [CrossRef] [Green Version]
  63. Ton, G. The mixing of methods: A three-step process for improving rigour in impact evaluations. Evaluation 2012, 18, 5–25. [Google Scholar] [CrossRef]
  64. Crevoisier, O. The innovative milieus approach: Toward a territorialised understanding of the economy. Econ. Geogr. 2004, 80, 367–369. [Google Scholar] [CrossRef] [Green Version]
  65. Pires, A.D.; Pertoldi, M.; Edwards, J.; Hegyi, F.B. Smart Specialisation and Innovation in Rural Areas; S3 Policy Brief Series No. 09/2014; European Commission: Brussels, Belgium, 2014. [Google Scholar]
  66. Douthwaite, B.; Mur, R.; Audouin, S.; Wopereis, M.; Hellin, J.; Moussa, A.; Karbo, N.; Kasten, W.; Bouyer, J. Agricultural Research for Development to Intervene Effectively in Complex Systems and the Implications for Research Organizations; KIT Working Paper: Amsterdam, The Netherlands, 2017. [Google Scholar]
  67. Westley, F.; Zimmerman, B.; Patton, M.Q. Getting to Maybe: How the World Has Changed; Random House Canada: Toronto, ON, Canada, 2006. [Google Scholar]
  68. Milley, P.; Szijarto, B.; Svensson, K.; Cousins, J.B. The evaluation of social innovation: A review and integration of the current empirical knowledge base. Evaluation 2018, 24, 237–258. [Google Scholar] [CrossRef]
  69. Schramm, L.L.; Nyirfa, W.; Grismer, K.; Kramers, J. Research and development impact assessment for innovation-enabling organizations. Can. Public Adm. 2011, 54, 567–581. [Google Scholar] [CrossRef]
  70. Horton, D.; Mackay, R. Using evaluation to enhance institutional learning and change: Recent experiences with agricultural research and development. Agric. Syst. 2003, 78, 127–142. [Google Scholar] [CrossRef]
  71. Kirwan, J.; Ilbery, B.; Maye, D.; Carey, J. Grassroots social innovation and food localisation: An investigation of the Local Food programme in England. Global Environ. Chang. 2013, 23, 830–837. [Google Scholar] [CrossRef]
  72. Neumeier, S. Social innovation in rural development: Identifying the key factors of success. Geogr. J. 2017, 183, 34–46. [Google Scholar] [CrossRef]
  73. Neumeier, S. Why do social innovations in rural development matter and should they be considered more seriously in rural development research? Proposal for a stronger focus on social innovation in rural development research. Sociol. Ruralis. 2012, 52, 48–69. [Google Scholar] [CrossRef]
  74. Reckwitz, A. Toward a Theory of Social Practices A development in culturalist theorizing. Eur. J. Soc. Theory 2002, 5, 243–263. [Google Scholar] [CrossRef]
  75. Shove, E.; Pantzar, M.; Watson, M. The Dynamics of Social Practice: Everyday Life and How It Changes; Sage Publishing: Los Angeles, LA, USA, 2012. [Google Scholar]
  76. Lilja, N.; Dixon, J. Responding to the Challenges of Impact Assessment of Participatory Research and Gender Analysis. Exp. Agric. 2008, 44, 3–19. [Google Scholar] [CrossRef] [Green Version]
  77. Röling, N. Pathways for Impact: Scientists Different Perspectives on Agricultural Innovation. Int. J. Agric. Sustain. 2009, 7, 83–94. [Google Scholar] [CrossRef]
  78. Watts, J.; Horton, D.; Douthwaite, B.; La Rovere, R.; Thiele, G.; Prasad, S.; Staver, C. Transforming Impact Assessment: Beginning the Quiet Revolution of Institutional Learning and Change. Exp. Agric. 2008, 44, 21–35. [Google Scholar] [CrossRef] [Green Version]
  79. Utting, K. Assessing the Impact of Fair Trade Coffee: Towards an Integrative Framework. J. Bus. Ethics 2009, 86, 127–149. [Google Scholar] [CrossRef]
  80. Byambaa, T.; Janes, C.; Takaro, T.; Corbett, K. Putting Health Impact Assessment into practice through the lenses of diffusion of innovations theory: A review. Env. Dev. Sustain. 2015, 17, 23–40. [Google Scholar] [CrossRef]
  81. Jones, N.; McGinlay, J.; Dimitrakopoulou, P.G. Improving social impact assessment of protected areas: A review of the literature and directions for future research. Environ. Impact Assess. Rev. 2017, 64, 1–7. [Google Scholar] [CrossRef] [Green Version]
  82. Khurshid, N. Impact assessment of agricultural training program of AKRP to enhance the socio-economic status of rural women: A case study of northern areas of Pakistan. Pak. J. Life Soc. Sci. 2013, 11, 133–138. [Google Scholar]
  83. Guijt, I.; Kusters, C.S.L.; Lont, H.; Visser, I. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use: Report from an Expert Seminar with Dr. Michael Quinn Patton; Centre for Development Innovation, Wageningen University and Research Centre: Wageningen, The Netherlands, 2012. [Google Scholar]
  84. Patton, M.Q. What is Essential in Developmental Evaluation? On Integrity, Fidelity, Adultery, Abstinence, Impotence, Long-Term Commitment, Integrity, and Sensitivity in Implementing Evaluation Models. Am. J. Eval. 2016, 37, 2. [Google Scholar] [CrossRef] [Green Version]
  85. Bock, B. Rural marginalisation and the role of social innovation: A turn towards nexogenous development and rural reconnection. Sociol. Ruralis. 2016. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Analytical framework for the review of innovation impact assessment.
Figure 2. Percentage (and number) out of the total 74 articles according to the methodological approach.
Figure 3. Percentage (and number) out of the total 41 articles following each of the four combinations of mixed methods.
Figure 4. Percentage (and number) of the total 74 articles according to impact area.
Figure 5. Percentage of the total 74 articles of each methodological approach split into impact area.
Figure 6. Percentage (and number) of the total 18 articles according to methodological approach.
Figure 7. Response to innovation impact assessment challenges by mainstreamed methods and Developmental Evaluation.
Table 1. Categorization of methodological approaches.
Methodological Approach | Methods | Example of Specific Methodologies and Tools
Conceptual | Review; Theory-based | Document analysis, literature review, argumentation
Conceptual | Conceptual Framework for Innovation Impact Assessment/Innovation Evaluation | Framework development based on reviews (for example, conceptual innovation)
Qualitative | Public participation | Questionnaire, interview, expert surveys, etc.
Quantitative | Survey | Regression analysis, Bayesian probabilistic method
Quantitative | Economic valuation | Econometric analysis, cost-benefit analysis, cost-effectiveness
Mixed | Participatory evaluation approaches 1 | Individual rating, group voting, actor mapping, evaluation of assessment tools
Mixed | Case studies tool 2 | Detailed analysis of individual research programs
Adopted and modified from Weißhuhn et al. (2018) [43] and Tamee et al. (2018) [42]. 1 Participatory evaluation combines, for example, the opinions of individuals gathered through surveys and interviews (qualitative method) with an already developed framework (conceptual method). 2 Case studies use qualitative methods to understand stakeholders’ perceptions (these methods used to be grouped under the so-called qualitative methods). Depending on the purpose and research design, quantitative methods are also used before, after or in parallel to the qualitative methods, with different purposes depending on the moment at which they are applied. Some case studies then synthesize their main findings into a framework (conceptual method).
Table 2. Articles analyzed according to methodological approach and impact area.
Methodological Approach | Impact Area | Methods and Tools Used
Conceptual (22) | Social (17) | Methods: Social Impact Assessment (SIA) (Imperiale and Vanclay, 2016) [41]; Contextual Response Analysis (CRA) (Spaapen and van Drooge, 2011) [46]; case study. Tools: mixed-gendered focus group workshops, literature review
Conceptual (22) | Environmental (2) | Methods: Environmental Impact Assessment (EIA) (Cong et al., 2017) [44]; Health Impact Assessment (HIA) (Del Rio et al., 2017) [18]
Conceptual (22) | Sustainability (4) | Tools: literature review on agricultural innovations, farmer-driven innovations, participatory technology development and innovation systems
Mixed-method (41) | Economic (6) | Tools: calculation of economic indicators, face-to-face interviews, static data from international organizations, community survey, Analysis of Variance (ANOVA)
Mixed-method (41) | Social (21) | Method: Regulatory Impact Assessment (RIA) (De Francesco et al., 2012) [47]. Tools: focus groups, face-to-face interviews, rapid appraisal workshops, surveys, field work, sustainability indicators, public meetings, health impact assessment, observation, household surveys
Mixed-method (41) | Environmental (17) | Method: CropWatch agroclimatic indicators (CWAIs) (Gommes et al., 2017) [48]. Tools: indicator species analysis, Mantel test of geographic distance, Bray-Curtis coefficient, life cycle environmental impact assessment
Mixed-method (41) | Sustainability (4) | Methods: Life Cycle Assessment (LCA) (Tecco et al., 2016) [48]; Environmental Impact Assessment (EIA) (Cong et al., 2017) [44]. Tools: stakeholder questionnaire, focus group interviews, rapid rural appraisal, observation, sustainability indicators, scenario analysis
Qualitative (6) | Economic (2) | Tools: email survey, household interviews, data coding, blind interviews
Qualitative (6) | Social (3) | Methods: Participatory Impact Pathway Analysis (PIPA), Social Network Analysis (SNA), Outcome Harvesting (OH) (Quiedeville et al., 2017) [45]; Participatory Action Research (PAR) (Graef et al., 2018) [49]
Qualitative (6) | Sustainability (3) | Method: Economic-Environmental Input-Output (EEIO) model (Cong et al., 2017) [44]. Tools: geographic information system (GIS), discriminant analysis
Quantitative (6) | Social (1) | Method: Framework for Participatory Impact Assessment (FoPIA) analysis (Pachón-Ariza et al., 2016) [50]
Quantitative (6) | Environmental (3) | Tools: water sample analysis, skeletochronology method, life cycle analysis (LCA), land classification
Quantitative (6) | Sustainability (1) | Method: crop-wise analysis (Kumar et al., 2016) [51]. Tools: benefit-cost ratio, Net Present Worth (NPW)
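The benefit-cost ratio and Net Present Worth listed for the crop-wise analysis above are standard discounted cash-flow measures. The sketch below, with invented cash-flow figures rather than data from Kumar et al. (2016) [51], shows how they are typically computed for a single intervention such as a farm pond.
```python
# Minimal sketch with invented figures (not data from Kumar et al., 2016 [51]):
# discounted Net Present Worth (NPW) and benefit-cost ratio (BCR) for a
# hypothetical rainwater-harvesting investment such as a farm pond.

def npw_and_bcr(benefits, costs, rate):
    """benefits, costs: per-year values with year 0 first; rate: annual discount rate."""
    pv_benefits = sum(b / (1 + rate) ** t for t, b in enumerate(benefits))
    pv_costs = sum(c / (1 + rate) ** t for t, c in enumerate(costs))
    return pv_benefits - pv_costs, pv_benefits / pv_costs

benefits = [0, 400, 450, 450, 500]   # no benefit in the construction year, then rising yields
costs = [1000, 50, 50, 50, 50]       # upfront construction plus annual maintenance
npw, bcr = npw_and_bcr(benefits, costs, rate=0.08)
print(f"NPW = {npw:.1f}, BCR = {bcr:.2f}")  # positive NPW and BCR > 1 favor the investment
```
Under these assumed figures the NPW is positive and the BCR exceeds one, which is the kind of evidence the quantitative studies above use when judging whether adoption is worthwhile.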
Table 3. Articles on promising evaluation frameworks and tools for leveraging social changes.
Author/Year | Impact Area | Tools and Methods | Co-Benefit | Main Findings
Copestake (2014) [62] | Economic, Social | Qualitative Impact Protocol (QUIP) | Better reflection of uncertain and insufficiently understood impact pathways; blind interviews to avoid bias and gain explicit answers from key informants. | The balanced exploratory and confirmatory approaches to impact evaluation gave explicit answers to the question.
Diwakar et al. (2008) [19] | Social | Earth Observation (EO) | Data monitoring for action that brings transparency to the developmental process of the whole project. | Helps in microlevel plan preparation, concurrent project monitoring and impact assessment at multiple stages throughout the project.
Cristiano and Proietti (2018) [52] | Social | Development-Oriented Analysis (DOA) | Focuses on interactive innovation processes and a multi-actor approach. | Integrated framework with participatory and reflexive approaches that supports policy and project designs promoting development of innovative capacities.
Quiedeville et al. (2017) [45] | Social | Participatory Impact Pathway Analysis (PIPA), Outcome Harvesting (OH), Social Network Analysis (SNA) | PIPA: participatory approach allowing actors to change and increase interactions within the innovation network. OH: lets PIPA adapt to the requirements of ex-post assessment. SNA: identifies important actors and their statements. | Critical to the success of innovation were agricultural policies, economic factors, testing conducted independently by farmers and the institutional framework, rather than learning and interactions with farmers.
Hall et al. (2003) [16] | Social | In-depth review of case studies of a specific project | Case study to demonstrate the importance of institutional learning. | Institutional learning must be embedded in a new perspective of innovation systems by (i) understanding how the research community operates, (ii) realizing learning as part of the practice of research organizations, (iii) realizing capacity development and behavioral changes, and (iv) realizing evaluation as a collective task.
Maredia et al. (2014) [60] | Social | Cost-benefit analysis; impact evaluations: decentralized participatory model; ex post assessment: partial equilibrium economic surplus model | Cost-benefit analysis assists in making strategic decisions and assesses potential impacts; impact evaluation tests the effectiveness of projects and institutional innovations; ex post assessment provides an analytical approach to assessing the impact of investments. | Methods took into account the relevant variables (time frame, size of intervention, type of research question and evaluation addressed) during the evaluation process.
Tecco et al. (2016) [48] | Sustainability | Environmental and social Life Cycle Assessment (LCA) | Assessment of achieved impacts, trade-offs, and appropriateness in the context and scale of adoption. | Dynamic combination of data and information provided by stakeholders improved the decision-making process.
Wu et al. (2014) [56] | Social | Plan Environmental Impact Assessment (PEIA) | Plan environmental impact assessment throughout the various stages of the project to enhance the outcome. | For the framework to become a standardized procedure, decision-makers need to accept PEIA as an internal process and not as an external intervention.
Paz-Ybarnegaray and Douthwaite (2017) [9] | Social | Outcome-evidencing method | Identifies outcomes, giving immediate feedback to ongoing project implementation, and makes causal claims to substantiate or challenge the overarching program theory. | Outcome evidencing allows agents to identify underlying causes and undertake actions along the process; it is a one-off evaluation that answers whether, how and in what contexts projects are working.
Schramm et al. (2011) [69] | Economic | Smart Science Impact (R&D impact assessment tool) | Relies on the ‘voice of the customer’, where explicit answers from clients provide valuable input data. | The economic, social and environmental IA tool can be easily adapted for use by government, not-for-profit or private sectors that conduct, fund or contract research and development activities.
Milley et al. (2018) [68] | Social | Concept of Social Innovation (SI) | Working with the concept of Social Innovation (SI) may lead to innovation. | To bring conceptual clarity to SI and relevant evaluation practices, researchers and practitioners need to move toward a principle-based approach grounded in empirical research that takes the SI context into account.
Ton (2012) [63] | Social | Working with the client’s Monitoring and Evaluation Framework | Assesses the effectiveness of projects and programs with interventions along the value-chain development process, considering changing conditions. | The structured and systematic process helps reduce the tendency toward one-method designs; it enhances critical reflection within the team and allows creativity in finding ways to handle information.
Vanclay (2015) [8] | Social | Literature review on qualitative evaluations, explanation of performance story-telling | Qualitative methods collect evidence about the performance of a project or program and enable the collection of feedback to assist in modifying the project. | The story-telling approach can be effective if it is coherent, credible and multi-dimensional, where the different components are interconnected and the causal relations between them become clear.
Imperiale and Vanclay (2016) [41] | Environment, Social | Social Impact Assessment (SIA) | Helps social practitioners in defining the problem with stakeholders and implementing the project with local communities, thus achieving improved social outcomes. | The framework positively changed the outcomes of sustainable development projects and took on a community-oriented approach that better understands the needs of those affected and conceptualizes the actions needed for better social outcomes.
Graef et al. (2018) [49] | Social | Participatory Action Research (PAR) | Context-oriented and collaborative research approach: local stakeholders and scientists together develop and select research methods, generate data and reflect in cycles on how efforts unfold and what the impacts of the intervention are. | Allowed collaborative research across the different perspectives of scientists and stakeholders, and learning for both.
De Francesco et al. (2012) [47] | Social | Regulatory Impact Assessment (RIA) | Tool for major innovation in the reform agenda in many European countries. | RIA implementation differs by country in terms of political and economic systems, and also during the process of implementation.
Vilys et al. (2015) [14] | Economic | Public innovation support assessment | The conceptual framework proposes quantitative indicators to support effectiveness at the national level. | The assessment creates new opportunities and proposes indicators that enable the improvement of public support effectiveness.
Cong et al. (2017) [44] | Sustainability | LCA, geographic information system (GIS) analysis, economic-environmental input-output (EEIO) model; interregional input-output (LINE) model | The LCA-GIS-EEIO framework upscales micro analysis to macro analysis to see the effects of local changes, disasters and land use changes; it integrates top-down as well as bottom-up approaches. | The framework brings environmental and economic contributions with careful selection of location.
