Article

Academic Third Mission through Community Engagement: An Empirical Study in European Universities

by Paulina Spânu, Mihaela-Elena Ulmeanu * and Cristian-Vasile Doicin
Faculty of Industrial Engineering and Robotics, National University of Science and Technology Politehnica Bucharest, 060042 București, Romania
* Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(2), 141; https://doi.org/10.3390/educsci14020141
Submission received: 30 November 2023 / Revised: 13 January 2024 / Accepted: 25 January 2024 / Published: 30 January 2024

Abstract
Community engagement is fundamental for tertiary education, as it allows universities to connect with external stakeholders, create social impact, and improve the development of strategies for public engagement. The current study aims to evaluate the level of community engagement in tertiary education, assess the level of sustainable practices, and identify areas for improvement. The research employed a survey method, using a standardized questionnaire to gather data from 44 respondents, representing 35 European universities from nine countries. The survey covered various aspects of community engagement, such as university commitment, documentation, public awareness, investments, incentives, training, and stakeholder engagement. Quantitative analysis was employed using ANOVA and AHP to analyze the data collected from 20 questions. The results revealed that universities have a clear commitment to public engagement and have well-documented policies in place. However, there were areas identified for improvement, such as increasing investments to encourage public engagement and offering more training activities to support it. Additionally, the universities were found to have a limited target group for their community engagement activities and insufficient communication of the results of impact assessments. The findings of this study will be used to improve the development of strategies and enhance public engagement in tertiary education through the Academic Third Mission.

1. Introduction

Academic Third Mission is a priority on universities’ agendas, focusing on the role of higher education institutions in contributing to the socio-economic development of their regions and communities through activities such as technology transfer, community outreach, and applied research [1,2]. This mission is in addition to the traditional roles of teaching and research, which are often referred to as the “first” and “second” missions, respectively [3,4,5]. The concept of the Academic Third Mission is intended to encourage universities to engage more actively with their local communities and to contribute to the development of a knowledge-based society. The European Union (EU) has recognized the importance of the Academic Third Mission and has made it a priority to support the engagement of universities with their local communities and regions [6,7]. The EU has implemented several initiatives and programs aimed at promoting the Third Mission, such as the Horizon 2020 program and the European Regional Development Fund [8,9]. These initiatives provide funding and resources for universities to conduct applied research and engage in technology transfer and community outreach activities.
Several policy instruments have been designed to support, monitor, and evaluate the engagement of universities in the community in relation to the Third Mission; these can include funding programs, performance indicators, impact assessments, regional development strategies, public-private partnerships, and community engagement [10,11,12,13]. Worldwide, governments and organizations, including the EU, provide funding for universities to engage in activities that support the Third Mission, such as applied research and technology transfer. Universities are often required to report on their engagement in Third Mission activities and are evaluated on their performance in these areas [14,15]. This can include measures such as the number of patents filed, the number of startups created, and the number of community outreach programs [16,17]. Studies and evaluations are conducted worldwide to assess the impact of universities’ Third Mission activities on the community and society [18]. Universities are encouraged to engage with regional development strategies and to align their Third Mission activities with regional priorities [19,20]. Governments and organizations often support universities to form partnerships with businesses and industry to boost progress and prosperity [21]. Of all the policy instruments, community engagement is particularly important.
Community engagement is a key aspect of the Third Mission, as it is through engagement with the local community that universities can truly understand the needs and priorities of the region and tailor their activities to have the most impact [3,22]. Community engagement allows universities to identify the needs of the community through direct engagement and communication with residents, organizations, and local leaders [23]. This helps universities develop programs and services that are responsive to local needs and priorities. It also helps build trust between the universities and the community by demonstrating their commitment to addressing local issues and by involving community members in the planning and implementation of Third Mission activities. By engaging with the community, universities can better understand the social, economic and environmental issues that affect the community and design their programs and services to have the greatest positive impact [24]. Community engagement can provide opportunities for students and faculty to gain real-world experience, which can enhance the educational experience and prepare graduates for careers that impact the community positively. Also, it promotes collaboration between universities, businesses, and organizations to address local issues and create new opportunities [25,26,27,28].
Due to all the benefits of community engagement within the Academic Third Mission, the authors proposed a study on the participatory and deliberative processes of several European universities, with the final goal of designing a general framework for academic community-led innovation. Participatory practices refer to the involvement of ‘the public’ in the decision-making processes of universities [29]. These processes entail actively involving community members in the planning and implementation of Third Mission activities to ensure that they are responsive to local needs and priorities [30]. This can include involving community members in the design and implementation of research projects, technology transfer initiatives, community outreach programs, co-creation and co-design of curriculum, and public engagement [31,32,33]. Participatory processes ensure that community members have a say in the activities that affect them and that their perspectives and experiences are taken into account.
Deliberative processes aim at making decisions on an issue by weighing the reasons for and against a course of action [34]. Participation focuses on empowering citizens to take action, whereas deliberation focuses on discussion and debate between citizens and other stakeholders [35,36]. The process involves community members in a structured and informed discussion to identify and evaluate options and make collective decisions [25,37]. These processes allow community members to express their views, consider different perspectives, and make informed decisions. Deliberative processes can include public meetings, community forums, and other forms of consultation and dialogue [22,24,38].
Given the importance of participatory and deliberative processes within the global scope of the Academic Third Mission through community engagement, the current research provides valuable insights into the current practices and challenges of European universities. The study involves a research methodology that uses quantitative tools, focusing on specific practices and strategies that universities use to engage with their communities and the impact of these practices on the community. It also examines the barriers and challenges that universities face in engaging with their communities and the strategies they use to overcome these barriers. Additionally, it assesses the effectiveness of participatory and deliberative processes in promoting community engagement and the alignment of Third Mission activities with community needs and priorities.

2. Research Methodology

The current study was carried out under the TENACITY European project funded by Erasmus Plus through grant agreement no. 2021-1-IT02-KA220-HED-000032042. The project focuses on the Academic Third Mission and, specifically, on supporting universities to develop participatory and deliberative practices. In this context, the main objective of the research was to detect the needs, gaps and opportunities for designing a framework for the Higher Education Third Mission by collecting information from nine different European countries. This was conducted by applying an online questionnaire aimed at investigating universities’ commitment to public engagement activities. Specifically, the investigation focused on the university experience with participatory and deliberative processes. The questionnaire was targeted at university staff/professors/researchers involved in managing/delivering relevant activities.
The research was conducted on a sample of 44 respondents from 35 universities in 9 different European countries (Table 1).
The 35 universities were selected randomly amongst European institutions. The sample consisted of 31 professors, 4 researchers, 4 doctoral students, and 5 administrative staff members (1 rector, 1 chancellor, 1 public engagement officer, and 2 other administrative staff). This distribution of the positions held in the institutions by the survey participants is not a limitation of the research and does not significantly influence its results. Within the TENACITY project, a letter of consent was created at the consortium level, outlining the purpose and ethical considerations of the research, including issues such as anonymity, voluntary participation, and confidentiality. The initial version of the questionnaire was specifically designed to target the university experience in participatory and deliberative processes, taking into account the characteristics of the target audience.
The research process was carried out in two stages. The first stage involved the completion and validation of the questionnaire. The initial English version of the questionnaire was reviewed by experts from each partner institution to ensure that the questions were clear and easily understood by survey participants. The final English version of the questionnaire was implemented in Google Sheets and distributed by e-mail to the target group for participation in the research. The data collection process was carried out in approximately two months. Quantitative analysis was used to assess public engagement using a 7-point Likert scale, where value 1 corresponds to “totally disagree” and value 7 corresponds to “totally agree”. The scale provided two moderate opinions along with two extremes, two intermediate, and one neutral opinion to the respondents. This scale provides better accuracy of results and more data points for statistical analysis. The survey was constructed with 20 items (Table 2) that used the same response scale in order to allow the application of an Analysis of Variance (ANOVA) to the data set. This approach was preferred in order to improve the consistency of information from a large number of participants, such as university staff, community members, and researchers, on their perceptions and experiences of participatory processes of public engagement, as well as to facilitate the use of statistical analysis on the numerical data.
ANOVA was selected as an appropriate validation method due to the overall goal of the study and the necessary prerequisites being met. The main goal of the research was to detect the needs, gaps, and opportunities for designing a framework for the Higher Education Third Mission by collecting information from different HEIs in European countries. ANOVA was a useful tool in this research context for comparing responses across different target groups and analyzing aggregated scores from the Likert scale survey. The method helped in assessing whether perceptions and needs vary significantly from one European country to another. The survey was constructed to investigate different aspects of the Third Mission of Higher Education (commitment, implementation, investments, incentives, training, educational paths, and community engagement). ANOVA was used to analyze these aspects simultaneously, providing insights into which aspects differ significantly across different groups. Although in line with the research’s main goal, ANOVA was deployed only after validation of its prerequisites.
The first prerequisite, independence of observations, was ensured through the distribution channel and application of the questionnaire. The final English version of the questionnaire was distributed by e-mail, individually to each member of the target group. Members of the target group were selected randomly from information available online. After selection, the consortium members validated the final 44 participants, verifying that they did not have any prior collaboration and were not in contact for the completion of the survey. The questionnaire was completed without revealing personal information like name, surname, age, or gender and involved completing a Google survey on their personal computers.
Normality was the second prerequisite of ANOVA, which was analyzed before applying the method. This prerequisite entails that the data in each group should be approximately normally distributed, which is particularly important for small sample sizes (as is the case here). The Shapiro–Wilk test (best for small to moderate sample sizes) was used to calculate a statistic (W) and a p-value for each of the 20 questions in each country except Italy, Malta, and Portugal, which had fewer than 3 respondents. The test showed that the majority of questions have a normal distribution (Table A1 and Table A2, shown in Appendix B of the manuscript). To further validate the normality of the data, a Q-Q plot was put together (Figure A2, Appendix B), in which normally distributed data appear as a roughly straight line. Considering the aforementioned, the normality prerequisite was considered met.
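As an illustration of the Q-Q check described above (not the authors' actual procedure), the plot coordinates for one question's responses can be computed with the Python standard library alone; the sample scores below are hypothetical:

```python
from statistics import NormalDist, mean, stdev

def qq_points(sample):
    """Return (theoretical, observed) quantile pairs for a normal Q-Q plot.

    Uses the common plotting positions (i - 0.5) / n; if the sample is
    roughly normal, the points fall near a straight line.
    """
    xs = sorted(sample)
    n = len(xs)
    nd = NormalDist(mean(xs), stdev(xs))  # normal fitted to the sample
    theoretical = [nd.inv_cdf((i - 0.5) / n) for i in range(1, n + 1)]
    return list(zip(theoretical, xs))

# Hypothetical Likert responses for one question in one country
points = qq_points([2, 3, 4, 4, 5, 5, 6, 7])
```

Plotting these pairs (theoretical quantile on the x-axis, observed score on the y-axis) reproduces the straight-line visual check used for Figure A2.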
Homogeneity of variances is the third important ANOVA prerequisite and was verified using Levene’s test. This checks for homogeneity of variances and is less sensitive to deviations from normality, making it suitable for Likert scale data. It is performed by comparing the variance within each group (country) to the overall variance. Homogeneity of variances was considered met if Levene’s test p-value was over 0.05. The calculations presented in Table A3 (Appendix C) validate this prerequisite.
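A minimal sketch of Levene's W statistic (mean-centred version) for country groups is shown below; this is illustrative code, not the authors' implementation, and the p-value would still have to be read from an F(k−1, N−k) distribution:

```python
from statistics import mean

def levene_statistic(groups):
    """Levene's W for homogeneity of variances across groups (countries).

    groups: list of lists of scores, one inner list per group.
    W is large when the groups' spreads differ; the p-value comes from
    an F distribution with (k - 1, N - k) degrees of freedom.
    """
    k = len(groups)
    n = sum(len(g) for g in groups)
    # Absolute deviations of each score from its own group mean
    z = [[abs(x - mean(g)) for x in g] for g in groups]
    zbar_i = [mean(zi) for zi in z]            # per-group mean deviation
    zbar = sum(sum(zi) for zi in z) / n        # overall mean deviation
    between = sum(len(zi) * (zb - zbar) ** 2 for zi, zb in zip(z, zbar_i))
    within = sum((zij - zb) ** 2
                 for zi, zb in zip(z, zbar_i) for zij in zi)
    return ((n - k) / (k - 1)) * between / within
```

Equal within-group spreads drive the statistic toward zero, which corresponds to a large p-value and hence to the prerequisite being met.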
The fourth prerequisite is related to the level of measurement. This is met due to the structure of the survey. The 1 to 7 scores represent ratings, where differences are consistent and meaningful across the entire scale, for all 20 questions.
Random sampling, the fifth prerequisite, has been ensured since the early stages of the experiment design. The request for involvement in the study was sent randomly to HEIs around Europe with a timeframe of one month for receipt upon initial acceptance. With 44 respondents from 35 universities giving a positive reply in this timeframe, they were further verified for having no prior connection and validated for taking the study individually. The e-mail instructions highlighted the importance of independent responses. The responses were collected independently, ensuring anonymity and avoiding situations where participants from the same country and university discuss their responses before completing the survey.
Group independence of observations is the sixth prerequisite of ANOVA and is critical for its validity. The experiment design phase ensured group independence based on the premise that each country’s data was selected and collected independently of the others. Moreover, the Durbin-Watson test was conducted on the residuals of ANOVA to check for autocorrelation as a proxy for independence. A value of 2.42 was obtained, suggesting a small degree of negative autocorrelation. However, this value is close enough to 2 to generally not be a cause for concern regarding the independence of observations. This result is a good indicator of the independence of the responses.
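The Durbin-Watson statistic mentioned above has a simple closed form; a sketch over hypothetical ANOVA residuals:

```python
def durbin_watson(residuals):
    """Durbin-Watson statistic on a sequence of residuals.

    Values near 2 indicate no autocorrelation; values below 2 suggest
    positive autocorrelation, values above 2 (such as the 2.42 reported
    in the study) suggest negative autocorrelation.
    """
    num = sum((residuals[i] - residuals[i - 1]) ** 2
              for i in range(1, len(residuals)))
    den = sum(r ** 2 for r in residuals)
    return num / den

# Hypothetical residuals; alternating signs push the statistic above 2
dw = durbin_watson([0.4, -0.3, 0.5, -0.2, 0.1])
```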
The seventh prerequisite of applying ANOVA, related to an appropriate sample size, is the main determinant in selecting this method, as it does not impose a minimum value. Nevertheless, a very small sample size can lead to a lack of statistical power, making it difficult to detect a real effect if it exists. To counteract this limitation, Cronbach’s Alpha was used to measure the internal consistency and reliability of the set of scales used and test items.
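Cronbach's Alpha itself is straightforward to compute from the per-item response columns; the following is an illustrative sketch (hypothetical data, not the survey's), using the standard formula α = k/(k−1) · (1 − Σ item variances / variance of totals):

```python
from statistics import variance

def cronbach_alpha(item_scores):
    """Internal-consistency estimate for a multi-item scale.

    item_scores: one list per item, each holding the same respondents'
    scores in the same order. Values close to 1 indicate that the items
    measure the same underlying construct.
    """
    k = len(item_scores)
    n = len(item_scores[0])
    item_vars = [variance(item) for item in item_scores]
    # Each respondent's total score across all items
    totals = [sum(item[j] for item in item_scores) for j in range(n)]
    return k / (k - 1) * (1 - sum(item_vars) / variance(totals))
```

With 20 items answered on the same 1-7 scale, an alpha of 0.957 (as reported later in the paper) signals very high internal consistency.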
Based on all prerequisites being met and alignment with the study goal, ANOVA was the appropriate method to use in the conducted research.

3. Results Interpretation and Discussion

3.1. Quantitative Analysis

Quantitative analysis involved an Analysis of Variance (ANOVA) on the collected data set for items Q1 ÷ Q20 (Table 3). The statistical analysis was conducted to examine the differences between groups on a particular measure. The groups in the data set were the different questions (Q1, Q2, Q3, etc.), and the measures being analyzed were the responses given to each question. These responses were given as numbers, where each number represented an option on a 1–7 Likert scale (Appendix A, Figure A1). The items for public engagement must share common variance, correlate with each other, and, at the same time, each correlate with the overall score that reflects this attribute.
After conducting the two-factor ANOVA without replication, the results include the source of variation, the sum of squares (SS), the degrees of freedom (df), the mean squares (MS), the F-ratio, the p-value, and the F critical value. These indicate that there is a significant difference between the means of the groups on the measure being analyzed (the p-value is less than 0.05), and the source of variation was broken down into three main parts: Rows, Columns, and Error.
The Rows source of variation demonstrates that there is a significant difference between the means of the groups that were formed by rows. The Rows source of variation in the ANOVA results refers to the variation in the responses between the different respondents. The calculated value of SS of 2102.727, df of 43, MS of 48.90063, F of 23.51994, p-value of 3.6·10−114, and F crit of 1.394538 are all indicators of the statistical significance of the variation between the questions. The results suggest that there is a significant difference in the responses given to the 20 questions, with a large F-ratio and a very small p-value. Thus, all values are significant, indicating that there is a difference in means among the groups. The relevance of these values is that they can be used to identify which questions are most important to the participants, which questions are not well understood, and which questions are measuring different aspects of public engagement. The Columns source of variation shows that there is a significant difference between the means of the groups that were formed by columns. The SS is 113.1636, df is 19, MS is 5.955981, F is 2.864672, p-value is 4.31·10−5, and F crit is 1.599272. The calculated values are significant, indicating again that there is a difference in means among the groups. The Columns source of variation in this analysis refers to the variation in responses between the different questions. The relevance of the calculated values in terms of the questions can be determined by looking at the p-value and the F-value for each question. A low p-value (typically below 0.05) and a high F-value indicate that there is a significant difference in the responses between the different questions, meaning that the question is measuring a different aspect of public engagement.
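The Rows/Columns/Error decomposition above can be sketched in a few lines of Python; this is an illustrative re-implementation of the standard "Two-Factor Without Replication" layout (as in spreadsheet ANOVA tools), applied here to a tiny hypothetical table rather than the study's 44 × 20 data set:

```python
def anova_two_factor_no_rep(table):
    """table[i][j]: respondent i's score on question j.

    Returns SS, df, MS and F for Rows (respondents) and Columns
    (questions), plus the residual Error term.
    """
    r, c = len(table), len(table[0])
    grand = sum(sum(row) for row in table) / (r * c)
    row_means = [sum(row) / c for row in table]
    col_means = [sum(table[i][j] for i in range(r)) / r for j in range(c)]
    ss_rows = c * sum((m - grand) ** 2 for m in row_means)
    ss_cols = r * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((table[i][j] - grand) ** 2
                   for i in range(r) for j in range(c))
    ss_err = ss_total - ss_rows - ss_cols       # unexplained variation
    df_rows, df_cols = r - 1, c - 1
    df_err = df_rows * df_cols
    ms_rows, ms_cols = ss_rows / df_rows, ss_cols / df_cols
    ms_err = ss_err / df_err
    return {
        "rows": (ss_rows, df_rows, ms_rows, ms_rows / ms_err),
        "cols": (ss_cols, df_cols, ms_cols, ms_cols / ms_err),
        "error": (ss_err, df_err, ms_err, None),
    }

# Hypothetical 2 respondents x 2 questions
result = anova_two_factor_no_rep([[1, 2], [2, 4]])
```

With the study's 44 respondents and 20 questions, this layout yields df = 43 for Rows, df = 19 for Columns, and df = 817 for Error, matching the degrees of freedom reported above.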
For example, if we analyze the question “Does the university offer incentives and rewards to promote public engagement?” (Q6), the p-value and F-value are both low, indicating that there is a significant difference in responses between this question and the other questions. Thus, offering incentives and rewards is an important factor in promoting public engagement [12,39]. On the other hand, if we look at the question “Does the university integrate external services into its portfolio of services to promote public engagement?” (Q8), the p-value and F-value are both relatively high, indicating that there is not a significant difference in responses between this question and the other questions. This shows that integrating external services may not be a major factor in promoting public engagement [15,18,19]. The Error Source of Variation is the variability that is not explained by the other sources of variation. It represents the random variation or noise in the data set. In terms of the questions, it represents the degree to which the responses to each question vary from the overall mean of the sample. A lower error variance corresponds to more consistent and less random responses for a given question, while more variable and less consistent responses have a higher error variance.
Focusing on the need to assess the consistency and reliability of the scale used, Cronbach’s Alpha was applied in the development and validation stages. The ANOVA undertaken for public engagement has a Cronbach’s Alpha of 0.957483, a strong indicator of the internal consistency of the questionnaire: the items on the scale are measuring the same underlying construct, and the results are reliable. The results show that there is a significant difference between the means of the groups or conditions on the measure being analyzed, and the variation comes from both Rows and Columns. Moreover, the Cronbach’s Alpha coefficient was used in the analysis of the results as the main indicator of the measurement accuracy of the test. Since F > F crit (23.51994 > 1.394538), the null hypothesis is rejected: the population means are not all equal, i.e., at least one of the means is different. Because p < 0.001, at least two means differ from each other at a highly significant level.
To further analyze the significance of each question, Table 4 was put together, containing information about the number of respondents (Count), the sum of scores (Sum), the average of scores, and the variance and standard deviation (Std. Dev.) for each item (Q1 ÷ Q20). The results show that there is a range of averages and variances among the questions. The average ranges from 3.477 to 4.795, and the variance ranges from 3.469 to 5.465, indicating that there is a significant difference between the means of the questions and the measure being analyzed. It is also worth noting that the variance is an indicator of the spread of the data; the larger the variance, the more spread out the data are, which may indicate the presence of outliers.
A low standard deviation means that most of the scores are near the mean, and a high value means that the scores are more dispersed. To identify which questions are considered more significant by the participants, the average scores were evaluated and contrasted among the questions. Questions with higher average scores are considered more significant by the participants. Furthermore, questions with a lower standard deviation imply that the responses are more consistent; hence, it is more likely that the question is considered more important by the participants. Based on the results from Table 4, in hierarchical order, starting with the most important, questions Q1, Q12, Q13, Q9, and Q10 are the most significant for the participants in terms of importance and consistency.
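The importance-and-consistency reading of Table 4 described above (higher average first, lower standard deviation breaking ties) can be sketched as a simple sort; the question data below is hypothetical:

```python
from statistics import mean, stdev

def rank_questions(responses):
    """responses: dict mapping question id -> list of Likert scores.

    Sort by average score (descending), breaking ties with the lower
    standard deviation first, i.e. more consistent responses rank higher.
    """
    return sorted(responses,
                  key=lambda q: (-mean(responses[q]), stdev(responses[q])))

# Hypothetical per-question scores
order = rank_questions({"Q1": [6, 7, 6], "Q6": [3, 4, 4], "Q12": [6, 6, 6]})
```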
To determine which questions are not well understood, apart from the standard deviation, the distribution of responses was calculated and analyzed. The distribution of scores is a measure of how the scores are distributed across the range for each question. It can be visualized for all 20 questions using the histogram and the frequency distribution presented in Figure 1.
For example, for question Q1, the frequency of scores is given by {1:4, 2:1, 3:8, 4:5, 5:5, 6:11, 7:10}. Four respondents gave a score of 1, one respondent gave a score of 2, eight respondents gave a score of 3, and so on. Questions with a wide range of responses and a high standard deviation are generally not well understood. For all 20 questions, the calculated range was 6. Although the standard deviation for all questions is low, the study requires further clarifications for question Q15. The average values for the questions range from 3.477 to 4.795, with Q1 having the highest average value of 4.795. The participants generally agreed that the universities’ commitment to public engagement is clearly defined. However, it is worth noting that the average for Q1 is only slightly above the midpoint of the scale (4), which means that the results are not overwhelmingly in favor of the statement. There were some participants who disagreed or were uncertain about the statement; thus, there is a need for further investigation [18].
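The per-question frequency distribution and range used above can be computed directly from the raw scores; a short illustrative sketch (the score list below reconstructs the Q1 counts given as an example):

```python
from collections import Counter

def likert_frequency(scores):
    """Frequency of each 1-7 Likert option plus the range of responses."""
    freq = {v: 0 for v in range(1, 8)}   # ensure all 7 options appear
    freq.update(Counter(scores))
    return freq, max(scores) - min(scores)

# Scores matching the Q1 frequencies quoted in the text
q1_scores = [1]*4 + [2]*1 + [3]*8 + [4]*5 + [5]*5 + [6]*11 + [7]*10
frequency, score_range = likert_frequency(q1_scores)
```

Feeding each question's 44 responses through this function yields exactly the histogram data shown in Figure 1.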
Regarding the documentation of public commitment (Q2), the lowest results were recorded in Greece (with an average of 3.85) and the best results were recorded in Germany with an average of 6.33, indicating that German universities have the best practices for documentation of public engagement activities. The results suggest that the commitment to public engagement is well documented, but there may be room for improvement in terms of clarity and dissemination of information. As other research shows, confusion on the subject can be due to a lack of consistency in the channels of information and the diversity of tools [11,34]. In order to further investigate this issue, Q3 was analyzed.
According to the respondents, most universities make efforts so that their documented commitment to public engagement is known and understood; there are no significant differences between the partner countries. The conclusion aligns with several other findings at a European level and can be explained mainly due to cultural and societal similarities but also due to strategic collaboration paths between institutions [6,7,9,22,24]. Based on the results, it can be inferred that the universities may need to improve their efforts to ensure that their documented commitment to public engagement is also publicly known and understood. Such strategies are implemented and actively promoted by universities and institutions worldwide, but with notable differences in the effectiveness of the tools [26,33]. Depending on the cultural approach, universities need to establish the most effective methods for undertaking public engagement documentation.
When asked if people from different levels of the university are responsible for the implementation of the public involvement agenda (Q4), the respondents appreciated the efforts of the university staff, suggesting that there is a fair level of responsibility among people at different levels of the university for implementing the public engagement agenda. European universities tend to assume a high level of responsibility in undertaking academic third-mission actions, endeavors sustained by a variety of common efforts and initiatives [6,7,12,22]. However, there is still room for improvement as the mean score is not the highest, indicating that there may be some lack of clarity or understanding of the responsibilities related to public engagement across different levels of the university. Several studies found that lack of clarity can be due to improper communication throughout the universities’ management and organizational hierarchies [17,19].
Surveyed universities are concerned with investments to encourage public involvement (average = 4.159 for Q5), but they are less involved in offering incentives and rewards to promote audience involvement (average = 3.773 for Q6). Some universities have been known to strongly encourage public engagement through student involvement, which has proven beneficial in the long-term development of third mission strategies [37]. The EU has promoted continuous development of public engagement through the academic third mission of universities [6], so as to counteract the gap between academia and entrepreneurs. The average score for Q6 is 3.773, which is relatively low compared to the other questions. For this question, the respondents generally disagree with or are neutral in their opinion that their universities offer incentives and rewards to promote public engagement. The standard deviation of 1.975 also indicates that there is a significant amount of variation in the responses, with some respondents strongly disagreeing while others are more neutral or slightly disagree. There is definitely room for improvement in this area for the universities in terms of offering incentives and rewards to promote public engagement. This is mainly performed through structural funds [8,9], but also through local initiatives [13,15].
The results for questions Q7, Q8, and Q9 were very close to the central tendency (average: Q7 = 3.818, Q8 = 3.477, Q9 = 4.295). Training activities to support public involvement are not sufficient, and services to promote public involvement are less satisfactory in surveyed universities. A fair interpretation of the obtained results could be that the respondents do not believe that the university is effectively integrating external services into its portfolio to promote public engagement. This was also the case for several other institutions outside of the study [15,20,21,30]. Thus, this is a clear area for improvement for the university in terms of its public engagement efforts and is in correlation with other literature findings [32,39].
For questions Q10 and Q11, there are no significant differences between the results collected from different countries. These results reflect, in the opinion of the respondents, the satisfactory preoccupation of universities with using updated methods and approaches to develop public engagement skills among students and with the integration of public engagement practices in study programs [23]. At the same time, the general opinion of the respondents is that they do not believe that the university is effectively integrating public engagement practices into its degree programs. Respondents stated that there are universities where the public is involved to some extent in the study programs. The justification for this statement is based, in the opinion of the respondents, on the fact that the universities consider the opinion of the public based on the feedback received from them, especially during internships and volunteering. It could be beneficial to follow up with strategies that have proven successful over one common framework [18,22,24].
By identifying the needs of external stakeholders (Q13 = 4.636), the universities are, according to the surveyed professors, involved in promoting interdisciplinary educational paths (Q12 = 4.705). Most participants think their university promotes interdisciplinary educational paths effectively, and this is perceived positively by the respondents, a result that aligns with most of the literature [20,21,32].
Regarding the evaluation of public engagement activities and results (Q15 = 4.023) and the indicators used (Q14 = 3.636), the best results were recorded in the universities of Romania and Lithuania, and the lowest in Greece. These results could be explained by the fact that the Romanian respondents are teaching staff directly involved in evaluation activities, whereas in Greece doctoral students completed the survey. This context also explains the average obtained for Q16 = 3.795, regarding the communication of evaluation results on the impact of the institutions' activities. This issue is particularly important for standardization, and universities should address their challenges using proven strategies [16]. The results suggest that respondents feel universities do not effectively use indicators to measure their activities and public engagement results, so it may be beneficial for universities to review and improve their methods for measuring and evaluating the effectiveness of public engagement activities. Insight into these processes is offered by the literature and by practitioners [11,14,20]. The low average score and the large variation in responses suggest that this is an area where universities could improve their public engagement efforts [2], and that they need to better integrate the results of public engagement activities into future planning and organizational development [2,4]. The standard deviation of 2.103 for Q17 means that the responses to this question are relatively spread out, which is also supported by the distribution of scores. The ANOVA table reveals a significant difference between the means of the different rows, indicating that the responses to this question vary between groups.
Regarding the influence of universities at the local and regional level in Q17, the lowest average was obtained for universities in Greece; for the other countries, the results were approximately equal.
Social impact from public involvement activities and the definition at the university level are not fully satisfactory for respondents from all countries (Q18, Q19), with the averages obtained being close to the recorded central tendency. This satisfactory result was also recorded for question Q20 regarding the integration of interested parties in the management of the institution. Based on the obtained results, it can be concluded that the universities are generally successful in setting and communicating the goals and objectives of their public engagement activities and have a clear sense of direction in terms of how they want to create impact. This is a positive indication and hints at the fact that the universities effectively communicate their purpose and objectives with regard to public engagement with their communities and stakeholders [13,15]. Relationships with various stakeholders are crucial for universities in order to train students for real-life case scenarios and offer a smooth transition to the job market. Integration initiatives include joint labs, entrepreneurship accelerators, spin-off communities, and many others, for the mutual benefit of universities and companies alike [13,20,21,36,39].
To assess the dependence between pairs of quantitative variables in the data collected through the questionnaire, Pearson's correlation coefficient (r) was determined. The coefficients take values between −1 (perfect negative correlation) and 1 (perfect positive correlation). The sign of the coefficient indicates the direction of the correlation: a positive value corresponds to variations in the same direction and a negative value to variations in opposite directions. The absolute values of the correlation coefficients, presented in Table 5, express the strength of the association between items. Thus, for α < 0.05, coefficients from −0.25 to 0.25 were interpreted as weak or zero correlation, from 0.25 to 0.50 (or −0.25 to −0.50) as an acceptable degree of association, from 0.50 to 0.75 (or −0.50 to −0.75) as moderate to good correlation, and from 0.75 to 1 (or −0.75 to −1) as very good correlation.
Among all the survey items in the first part of the questionnaire, only positive coefficients were recorded, corresponding to variations in the same direction. There are some moderate-to-strong positive relationships between questions; for example, Q2 and Q3 have a correlation coefficient of 0.84, indicating a strong positive relationship.
Q4 and Q5 have a correlation coefficient of 0.66, and Q5 and Q6 a coefficient of 0.74, both indicating moderately positive relationships. The strongest associations were recorded between items Q18 and Q19 (0.87), Q2 and Q3 (0.84), and Q15 and Q19 (0.80). However, there are also weaker relationships between certain questions: Q10 and Q14 have a coefficient of 0.35 and Q8 and Q17 a coefficient of 0.41, both falling only in the acceptable band of association.
The weakest correlations were recorded between items Q12 and Q1 (0.12), Q12 and Q5 (0.13), and Q12 and Q3 (0.17). These results suggest moderate to strong positive relationships between some questions, whose answers may therefore be related, but also weak or negligible relationships between others. It is important to keep in mind that correlation does not imply causation, and further analysis would be needed to understand the underlying relationships between the variables.
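The banding scheme described above can be sketched in a few lines of Python. This is an illustration only: the `pearson_r` and `correlation_band` helpers and the Likert-style responses are invented for the example, not the study's data or tooling.

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson's correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def correlation_band(r):
    """Map |r| onto the interpretation bands used in the study (Table 5)."""
    a = abs(r)
    if a < 0.25:
        return "weak or zero"
    if a < 0.50:
        return "acceptable"
    if a < 0.75:
        return "moderate to good"
    return "very good"

# Invented Likert-scale responses (1-7) for two questions -- not the survey data
q2 = [5, 6, 4, 7, 5, 6, 3, 5]
q3 = [5, 7, 4, 6, 5, 6, 4, 5]
r = pearson_r(q2, q3)
print(round(r, 2), correlation_band(r))
```

With the full 20-question matrix, the same banding function can be applied to every pairwise coefficient to reproduce the classification reported for Table 5.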

3.2. Relative Importance of Community Engagement

The questionnaire was designed so that each answer reflects a different facet of community engagement in European universities. Questions do not overlap in information but offer a complementary vision of how universities integrate community engagement practices into their academic third missions. Thus, each question is viewed both as a separate entity, with its own value in relation to the overall objective of the questionnaire, and as a puzzle piece in the development of transformative actions.
In this context, results obtained by ANOVA and Pearson’s correlation showed that further analysis is necessary to substantiate the construction of a cohesive framework that could impact the decision-making process regarding community engagement in European universities.
Given the complexity of the analyzed issue, the Analytic Hierarchy Process (AHP) was applied to define the importance of each of the 20 questions as an underlying component of community engagement. The authors identified AHP as the most suitable method, attributing its effectiveness to its ability to minimize biases in the decision-making process [40,41]. This approach necessitated a total of 190 pairwise comparisons among the 20 questions. In AHP, a consistency ratio below 10% is considered acceptable for maintaining result accuracy [42]. Goepel's AHP Online System facilitated the analysis [43].
A decision matrix was put together, evaluating the importance of each question relative to all others and the degree of that importance. The AHP scale used was: 1—Equal Importance, 3—Moderate Importance, 5—Strong Importance, 7—Very Strong Importance, 9—Extreme Importance (2, 4, 6, 8 for in-between values). The calculated standard deviations (Table 4) were used to set the value for each pair of questions.
There are two important steps in putting the matrix together: 1. deciding which question of a pair is more important; 2. deciding how much more important it is on the AHP scale. The first step is straightforward, as the question with the lower standard deviation is the more important of the two being compared.
The second step involves weighting the differences in standard deviation and spreading them across the 9-point scale. A square matrix is used to calculate the standard deviation differences (1).
$$
X = \begin{bmatrix}
x_{1,1} & x_{1,2} & x_{1,3} & \cdots & x_{1,j} & \cdots & x_{1,20} \\
x_{2,1} & x_{2,2} & x_{2,3} & \cdots & x_{2,j} & \cdots & x_{2,20} \\
x_{3,1} & x_{3,2} & x_{3,3} & \cdots & x_{3,j} & \cdots & x_{3,20} \\
\vdots & \vdots & \vdots & & \vdots & & \vdots \\
x_{i,1} & x_{i,2} & x_{i,3} & \cdots & x_{i,j} & \cdots & x_{i,20} \\
\vdots & \vdots & \vdots & & \vdots & & \vdots \\
x_{20,1} & x_{20,2} & x_{20,3} & \cdots & x_{20,j} & \cdots & x_{20,20}
\end{bmatrix} \quad (1)
$$
where xij is the difference between the standard deviation of question Qi and that of question Qj. If xij is negative, question Qi is more important than question Qj. Based on the maximum absolute value among these differences, each pair is assigned a point on the AHP scale, according to the procedure shown in Table 6.
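The two-step construction can be sketched as follows. Since Table 6 is not reproduced here, the mapping of standard-deviation differences onto the 1–9 scale below is a plausible linear stand-in, not the authors' exact rule; the `ahp_matrix` helper and the example standard deviations are illustrative assumptions.

```python
def ahp_matrix(sds):
    """Build a reciprocal AHP pairwise-comparison matrix from standard deviations.

    Step 1: the question with the lower standard deviation is the more
    important of a pair. Step 2: the magnitude of the difference, scaled
    against the largest difference, is mapped onto the 1-9 AHP scale
    (a linear stand-in for Table 6, which is not reproduced here).
    """
    n = len(sds)
    diffs = [[sds[i] - sds[j] for j in range(n)] for i in range(n)]
    max_abs = max(abs(d) for row in diffs for d in row) or 1.0
    M = [[1.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if diffs[i][j] < 0:  # Qi has the lower SD, so Qi is more important
                score = 1 + round(8 * abs(diffs[i][j]) / max_abs)
                M[i][j] = float(score)
                M[j][i] = 1.0 / score  # reciprocal entry required by AHP
    return M

# Three illustrative standard deviations (not the Table 4 values)
M = ahp_matrix([0.5, 1.0, 2.0])
print(M[0][2], M[2][0])
```

Running the same construction over the 20 standard deviations of Table 4 yields the 20 × 20 decision matrix with 190 independent upper-triangle entries.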
Using the criteria given in Table 6, 190 comparisons were made in pairs and an AHP decision matrix was put together (Figure 2a). The relative importance of each question was calculated based on the decision matrix, using the principal eigenvector solution with five iterations and a delta value of 4.7 × 10−8. Each question’s weight was assigned based on the priority in the AHP Ranking, as shown in Figure 2b.
The consolidated results of the AHP reveal a consistency ratio of 3.5% (Figure 3), significantly lower than the predetermined threshold. Consequently, the model’s inconsistencies are within an acceptable range, allowing the derived importance coefficients to be reliably utilized in subsequent decisions.
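The consistency check can be reproduced in outline. The sketch below approximates the principal eigenvalue by power iteration and derives CR = CI/RI; the random-index (RI) values are the commonly cited Saaty constants and the function is an assumption of this illustration — the paper itself used Goepel's AHP Online System, whose eigenvector computation may differ in detail.

```python
def consistency_ratio(M):
    """Approximate the AHP consistency ratio CR = CI / RI for a pairwise matrix.

    CI = (lambda_max - n) / (n - 1); RI values are Saaty's random indices
    (listed here up to n = 10; a 20x20 matrix, as in the study, needs the
    extended RI table, which this sketch omits).
    """
    n = len(M)
    RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32,
          8: 1.41, 9: 1.45, 10: 1.49}
    w = [1.0 / n] * n
    for _ in range(50):  # power iteration toward the principal eigenvector
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    # lambda_max estimated as the mean of (Mw)_i / w_i
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return ci / RI[n]

# A perfectly consistent 3x3 matrix gives CR ~ 0
print(consistency_ratio([[1, 2, 4], [0.5, 1, 2], [0.25, 0.5, 1]]))
```

A CR below 0.10 (10%), as with the 3.5% reported in Figure 3, indicates that the pairwise judgments are sufficiently consistent for the derived weights to be used.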
AHP shows that the most important questions relate to the promotion of interdisciplinary educational paths (Q12), the clarity of the public engagement definition (Q1), the integration of external services into universities' portfolios of services to promote public engagement (Q8), and the offer of incentives and rewards to promote public engagement (Q6). Q12, although the most important for the surveyed universities, has the lowest correlation coefficients of all the questions, implying that this is a mandatory area of improvement and further investigation for all universities.
It is interesting to note that ANOVA identified Q1 as having the highest average value amongst the group, and according to AHP, it is the second most important component for universities. In this regard, there is a balance between value and importance, and further steps might involve improving functionality rather than value.
The ANOVA on Q8 showed that European universities do not effectively integrate external services into their portfolios to promote public engagement, corroborating the question's importance. AHP indicates that universities should implement a more efficient framework targeting practical solutions for external service integration. Q6 correlates positively with all other questions, showing its grounded connection to the rest of the survey and making its importance valuable for further analysis and improvement. Based on the AHP and ANOVA results, the authors put together a set of recommendations and identified the limitations of the current study.

3.3. Recommendations and Study Limitations

The Academic Third Mission refers to the engagement of universities with their local communities through activities such as research, education, and services [5,23]. Public engagement, or the involvement of citizens in these activities, is crucial for the success of the Third Mission [35]. However, the results of the current study indicate a number of challenges to effective public engagement in tertiary education: a lack of awareness and understanding of the Third Mission among citizens, difficulty in involving citizens in decision-making processes, and conflicts of interest that arise in the participatory process. In light of these challenges, it is essential to develop strategies for improving public engagement in tertiary education through the Academic Third Mission [18,19,22]. Possible strategies include increasing awareness and understanding of the Third Mission among citizens, involving citizens in decision-making processes and providing them with the tools and resources to participate effectively, and addressing conflicts of interest in the participatory process. Based on the obtained results, the authors propose nine strategies (S1–S9) for further development.
Improving public engagement in tertiary education requires a multifaceted approach, emphasizing transparency, early involvement, and a culture of participation. A key strategy is enhancing transparency and communication between universities and the community (S1). This can be effectively achieved by regularly publishing the results of participatory activities on the university’s website and establishing a dedicated online channel to listen to and implement citizens’ recommendations. Involvement of citizens should begin at the initial stages (S2), including the collection and processing of context data, identification of priorities, and planning and programming of interventions. Such early engagement ensures that their needs and perspectives are integral to decision-making processes. Additionally, fostering a culture of participation within the university is crucial (S3). This involves providing training and support to staff and students in participatory methods and encouraging active participation in decision-making processes. The formation of interest groups and coalitions during debates ensures diverse perspectives in decision-making (S4). Equally important is the regular evaluation and monitoring of the participation process (S5) to identify areas for improvement, ensuring inclusivity and fairness. Diverse participatory methods, such as town meetings, deliberative surveys, and design workshops, are essential to represent varied viewpoints (S6). Collaboration with other organizations and experts is another key aspect (S7), providing access to a broad range of perspectives and expertise in decision-making. It is also important to consider the available resources and the level of conflict (S8) related to the intervention area and the local community before implementing any strategy. Finally, supporting citizens to understand their needs and make informed decisions is paramount (S9). 
This includes informing them of the outcomes of the participatory process, the work conducted by researchers and experts, and collecting feedback for potential interventions and improvements. A specific online channel for listening to and implementing citizens’ recommendations further supports this strategy, making for a more robust and inclusive approach to public engagement in tertiary education.
In order to facilitate the implementation of the above strategies, the study showed that there are still several areas in which universities can improve their engagement with citizens through the Academic Third Mission [1,4]. In order to effectively involve citizens in the decision-making process and ensure that their needs are being met, universities should consider implementing a variety of good practices. First, universities should prioritize transparency and communication throughout the participatory process. This includes clearly communicating the goals and objectives of the participatory process to citizens, as well as providing regular updates on the progress of the process and the outcomes achieved [2]. Universities should also make an effort to ensure that the results of the participatory process are widely shared and easily accessible to citizens, such as through a dedicated section on the university website. Second, universities should actively involve citizens in the planning and implementation of the Third Mission activities. This can be achieved through a variety of methods, such as working groups, town meetings, and participatory budgeting [20]. By involving citizens in the planning process, universities can ensure that their needs and priorities are taken into account and that the resulting interventions are more effective. Third, universities should consider providing support to citizens to understand their needs and make informed decisions. This can be achieved through a variety of methods, such as information desks, listening points, and providing information about the final result produced by the participatory process and the work conducted by researchers and experts [21,23,30]. Fourth, in order to prevent conflicts of interest, universities should have a clear policy in place to identify and address such situations. 
This can include the establishment of a conflict-of-interest committee, the implementation of a code of conduct, and the provision of training to staff and stakeholders on how to handle conflicts of interest [33,35]. Finally, universities should conduct regular evaluations of the participatory process to identify areas for improvement and ensure that the needs and priorities of citizens are being met. This can include conducting surveys or focus groups to gather feedback from citizens, as well as conducting internal evaluations of the process [37].
The study revealed the main areas of improvement for the involved European universities, and important recommendations were proposed for further development. Based on these, an initial framework is proposed in Figure 4.
To substantiate the framework and apply the identified sustainable strategies, the project consortium developed an online platform which enables stakeholders to get involved, participate, and decide on sustainable academic contexts. The platform is available at www.tenacityplatform.com (accessed on 15 November 2023) and allows the sustainable implementation of academic deliberative arenas for open science and innovation, as well as the delivery of an e-learning platform for academic deliberative practitioners. In accordance with the study findings, the platform allows six main categories of stakeholders to participate in the creation of sustainable academic practices, namely: citizen, policy maker, professor, researcher, student, and teacher.
An important feature of this interactive tool is the iterative feedback loop, which allows participants in the deliberative process to improve any subroutine, enhancing the overall sustainability and the probability of use in future applications. This approach also lowers the impact of the identified limitations, potentially eliminating some of them entirely. Multifunctionality was also promoted, and the organic development of novel avenues was permitted, all leading to sustainable product development in academic settings.
Nevertheless, the study brings with it limitations which should be considered when assimilating the presented information and conclusions. One potential limitation of this study is the small sample size of the survey participants. With only 44 participants, it is difficult to generalize the findings to the larger population of citizens and universities. Small samples may have limited representativeness and statistical power, and assumptions such as normality can be more challenging to meet. Nonetheless, even a small quantitative study can establish baseline data on a topic, providing a starting point for future research and comparisons.
Additionally, the survey responses were self-reported and may not accurately reflect the true experiences and perspectives of the participants. The study also relies on the assumption that the participants have a clear understanding of the term “participatory practices” and have had similar experiences in their participation in university activities. There could also be a bias in the survey responses, as the participants may have had a vested interest in presenting their experiences in a certain way. Another limitation is that the study does not consider other factors that may influence the implementation of participatory practices in universities. For example, the survey does not take into account the specific political, economic, and cultural context of each university or the level of resources available to support participatory practices.
A further limitation is that the study does not consider how the COVID-19 pandemic may have affected the ability of citizens and universities to engage in participatory practices, such as the shift to online engagement or the reduced availability of resources. The small sample size and self-reported nature of the survey responses, along with the assumptions made about the participants' understanding and experiences, may limit the generalizability of the findings. The study also does not account for other factors that may influence the implementation of participatory practices in universities. To overcome these limitations, further quantitative research on larger samples is recommended. Future actions include using the current study as a pilot to inform a larger, more comprehensive research project. Additional qualitative methods, such as focus groups or case studies, will also supplement the survey data to provide a richer, more nuanced understanding of the third mission in different European HEIs, further developing the proposed framework.
The advantages of using ANOVA in our design analysis also counteract some of the study limitations. It allowed us to quantify trends and patterns for community engagement, even with the small sample size. This provided initial insights and identified potential areas of interest for further qualitative analysis. The quantitative data collection involved standardized instruments; the survey used Likert scales, allowing for consistency in data collection and facilitating comparisons across respondents and institutions.
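The one-way ANOVA used throughout the quantitative analysis reduces to a short computation of between-group versus within-group variance. The sketch below uses invented Likert responses grouped by country; the `one_way_anova_f` helper and the data are illustrative assumptions, not the study's responses.

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic: between-group vs. within-group variance."""
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n_total
    means = [sum(g) / len(g) for g in groups]
    # Sum of squares between groups (weighted by group size) and within groups
    ss_between = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    df_between, df_within = k - 1, n_total - k
    return (ss_between / df_between) / (ss_within / df_within)

# Invented Likert responses (1-7) grouped by country -- not the survey data
romania = [5, 6, 5, 7, 6]
greece = [3, 4, 3, 5, 4]
sweden = [5, 5, 6, 5, 6]
print(round(one_way_anova_f([romania, greece, sweden]), 3))
```

A large F relative to the critical value for (k − 1, N − k) degrees of freedom indicates that at least one country's mean response differs significantly, which is the comparison reported in the ANOVA tables above.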

4. Conclusions

This study provides valuable insights into the state of public engagement in tertiary education through the Academic Third Mission in European universities. The results of the survey can be used to identify gaps and areas for improvement in the development of strategies for promoting public engagement. The study also leads to the conclusion that European universities need a general framework for promoting and improving public engagement through the Academic Third Mission. Furthermore, the findings can enrich a repository of good practices in Europe, to be showcased in a handbook and on the TENACITY project website, serving as a valuable resource for universities looking to improve their public engagement strategies. The obtained results can also help identify the needs of universities in order to improve their deliberative practices. A survey was designed and applied to collect data from 44 respondents, representing 35 universities from nine European countries. Quantitative analyses (ANOVA and AHP) were undertaken to examine various aspects of public engagement, such as university commitment, documentation, public awareness, investments, incentives, training, and stakeholder engagement.
The ANOVA results showed that while the respondents generally hold a neutral opinion on the statements regarding public engagement at their universities, there are areas where they feel more positive or negative. The higher scores for Q1, Q2, and Q9 suggest that respondents feel the university's commitment to public engagement is clearly defined and well documented, and that it has well-structured target groups for its community engagement activities. Lower scores for Q3, Q4, and Q5 show that respondents feel the documented commitment to public engagement is not publicly known and understood, that people at different levels of the university are not responsible for implementing the public engagement agenda, and that the university does not currently make adequate investments to encourage public engagement. Higher scores for Q6 and Q7 imply that respondents feel the university offers incentives and rewards to promote public engagement and offers training activities to support it. The smaller values obtained for Q8, Q10, and Q11 indicate that respondents feel the university does not integrate external services into its portfolio of services to promote public engagement, does not use up-to-date methods and approaches to develop public engagement skills among students, and does not integrate public engagement practices into degree programs. Results for Q12, Q13, and Q19 fell in the upper part of the evaluation scale, signifying that respondents think the university promotes interdisciplinary educational paths, compares and identifies the needs of its external stakeholders, and has defined the kind of impact it aims to create through public engagement.
On the other hand, lower scores for Q14, Q15 and Q16 suggest that the respondents feel that the university does not use indicators to measure its activities and public engagement results, does not ensure that the results of the impact assessment of public engagement activities are used for future planning and organizational development, and does not communicate the results of the assessment on the impact of its public engagement activities inside and outside the institution. Higher scores for Q17, Q18, and Q20 entail that the university influences community engagement at local and regional levels, creates a social impact from public engagement activities, and integrates community stakeholders into the institution’s leadership.
AHP was used to add value to the current study by prioritizing the questions based on their relative importance, thus offering a comprehensive view that is beneficial for both analytical and decision-making purposes. The analysis identified four key survey areas: promoting interdisciplinary paths (Q12), defining public engagement (Q1), integrating external services (Q8), and incentivizing public engagement (Q6). Q12, crucial but with the lowest correlation, highlighted a significant improvement area. Q1’s high average in ANOVA aligned with its AHP importance, suggesting a need to focus on functionality. Q8’s poor integration of external services in universities, as per ANOVA, combined with its AHP significance, called for more efficient external service integration strategies. Q6’s strong correlations indicated its vital role in research and improvement.
The current study is an important contribution to the field of public engagement in tertiary education through the Academic Third Mission by providing valuable insights and recommendations that can be used to improve the development of strategies and enhance public engagement in European universities.

Author Contributions

Conceptualization: M.-E.U. and C.-V.D.; Methodology: P.S.; Validation: C.-V.D., P.S. and M.-E.U.; Formal analysis: C.-V.D.; Investigation: M.-E.U.; Resources: P.S.; Data curation: P.S., M.-E.U. and C.-V.D.; Writing—original draft preparation: M.-E.U.; Writing—review and editing: P.S. and C.-V.D.; Visualization: M.-E.U.; Supervision: C.-V.D.; Project administration: P.S.; Funding acquisition: P.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the European Community’s ERASMUS+ PROGRAMME under grant agreement no. 2021-1-IT02-KA220-HED-000032042—ACADEMIC THIRD MISSION: comuniTy Engagement for a kNowledge bAsed soCIeTY (TENACITY). This work was supported by a grant from the National Program for Research of the National Association of Technical Universities—GNAC ARUT 2023, Grant No. 14/06.10.2023.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available in Appendix A.

Acknowledgments

The authors would like to acknowledge the contribution of University POLITEHNICA of Bucharest for the support given in the form of infrastructure and other resources not covered through the funded projects.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data, in the writing of the manuscript, or in the decision to publish the results.

Appendix A

Figure A1. Results for the 20 items within TENACITY project.

Appendix B

Table A1. Shapiro–Wilk test applied to calculate the statistic (W) and the p-value for each of the 20 questions from the survey in Germany, Greece and Lithuania.
| Question | Germany W | Germany p-value | Normality | Greece W | Greece p-value | Normality | Lithuania W | Lithuania p-value | Normality |
|---|---|---|---|---|---|---|---|---|---|
| Q1 | 0.629776 | 0.001241 | No | 0.94817 | 0.532673 | Yes | 0.860379 | 0.261574 | Yes |
| Q2 | 0.944664 | 0.682961 | Yes | 0.958862 | 0.704304 | Yes | 0.91099 | 0.487663 | Yes |
| Q3 | 0.849402 | 0.224231 | Yes | 0.899812 | 0.112078 | Yes | 0.971374 | 0.849971 | Yes |
| Q4 | 0.790653 | 0.086487 | Yes | 0.881089 | 0.060231 | Yes | 0.848079 | 0.219999 | Yes |
| Q5 | 0.944664 | 0.682961 | Yes | 0.819258 | 0.008724 | No | 0.894945 | 0.406387 | Yes |
| Q6 | 0.91099 | 0.487662 | Yes | 0.881597 | 0.061244 | Yes | 0.839702 | 0.194534 | Yes |
| Q7 | 0.863369 | 0.272453 | Yes | 0.859002 | 0.029495 | No | 0.963072 | 0.798227 | Yes |
| Q8 | 0.849402 | 0.224231 | Yes | 0.909098 | 0.152901 | Yes | 0.839702 | 0.194534 | Yes |
| Q9 | 0.992912 | 0.971877 | Yes | 0.845529 | 0.019323 | No | 0.992912 | 0.971878 | Yes |
| Q10 | 0.827427 | 0.161191 | Yes | 0.876281 | 0.051458 | Yes | 0.743573 | 0.033567 | No |
| Q11 | 0.629776 | 0.001241 | No | 0.934432 | 0.35164 | Yes | 0.863369 | 0.272453 | Yes |
| Q12 | 0.800563 | 0.103233 | Yes | 0.760175 | 0.001673 | No | 0.629776 | 0.001241 | No |
| Q13 | 0.93927 | 0.649878 | Yes | 0.904935 | 0.133024 | Yes | 0.848079 | 0.219999 | Yes |
| Q14 | 0.949706 | 0.714281 | Yes | 0.844588 | 0.018768 | No | 0.772907 | 0.061847 | Yes |
| Q15 | 0.827427 | 0.161191 | Yes | 0.857627 | 0.028237 | No | 0.763479 | 0.051229 | Yes |
| Q16 | 0.998396 | 0.995064 | Yes | 0.832679 | 0.013032 | No | 0.886912 | 0.369 | Yes |
| Q17 | 0.863369 | 0.272453 | Yes | 0.853856 | 0.025066 | No | 0.949706 | 0.714281 | Yes |
| Q18 | 0.944664 | 0.682961 | Yes | 0.900759 | 0.11568 | Yes | 0.949706 | 0.714281 | Yes |
| Q19 | 0.894945 | 0.406388 | Yes | 0.877539 | 0.053617 | Yes | 0.927082 | 0.577355 | Yes |
| Q20 | 0.927082 | 0.577355 | Yes | 0.856535 | 0.027278 | No | 0.629776 | 0.001241 | No |
Table A2. Shapiro–Wilk test applied to calculate the statistic (W) and the p-value for each of the 20 questions from the survey in Romania, Spain, Sweden.
Question | Romania | Spain | Sweden
— | W | p-Value | Normality | W | p-Value | Normality | W | p-Value | Normality
Q1 | 0.858486 | 0.146728 | Yes | 0.774708 | 0.022823 | No | 0.971374 | 0.849971 | Yes
Q2 | 0.858486 | 0.146728 | Yes | 0.813434 | 0.055481 | Yes | 0.949706 | 0.714281 | Yes
Q3 | 0.867412 | 0.176171 | Yes | 0.932528 | 0.572603 | Yes | 0.910990 | 0.487662 | Yes
Q4 | 0.846302 | 0.113659 | Yes | 0.784353 | 0.028585 | No | 0.894945 | 0.406387 | Yes
Q5 | 0.853883 | 0.133334 | Yes | 0.909711 | 0.393876 | Yes | 0.763479 | 0.051229 | Yes
Q6 | 0.929357 | 0.545445 | Yes | 0.926057 | 0.517886 | Yes | 0.949706 | 0.714281 | Yes
Q7 | 0.921579 | 0.481756 | Yes | 0.835710 | 0.090587 | Yes | 0.800563 | 0.103233 | Yes
Q8 | 0.910662 | 0.400475 | Yes | 0.879977 | 0.226348 | Yes | 0.728634 | 0.023857 | No
Q9 | 0.670536 | 0.001752 | No | 0.911128 | 0.403738 | Yes | 0.971374 | 0.849971 | Yes
Q10 | 0.719758 | 0.006067 | No | 0.955536 | 0.779650 | Yes | 0.882072 | 0.347560 | Yes
Q11 | 0.863961 | 0.164219 | Yes | 0.846302 | 0.113659 | Yes | 0.963072 | 0.798227 | Yes
Q12 | 0.840044 | 0.099451 | Yes | 0.907051 | 0.375833 | Yes | 0.882072 | 0.347560 | Yes
Q13 | 0.856091 | 0.139616 | Yes | 0.862486 | 0.159333 | Yes | 0.827427 | 0.161190 | Yes
Q14 | 0.871193 | 0.190135 | Yes | 0.874451 | 0.202933 | Yes | 0.743573 | 0.033567 | No
Q15 | 0.870328 | 0.186858 | Yes | 0.863961 | 0.164219 | Yes | 0.798526 | 0.099603 | Yes
Q16 | 0.863225 | 0.161763 | Yes | 0.812736 | 0.054621 | Yes | 0.882072 | 0.347560 | Yes
Q17 | 0.934584 | 0.590524 | Yes | 0.909030 | 0.389195 | Yes | 0.963072 | 0.798227 | Yes
Q18 | 0.834969 | 0.089147 | Yes | 0.945253 | 0.686389 | Yes | 0.882072 | 0.347560 | Yes
Q19 | 0.824948 | 0.071632 | Yes | 0.931918 | 0.567328 | Yes | 0.863369 | 0.272453 | Yes
Q20 | 0.791718 | 0.033888 | No | 0.965365 | 0.863218 | Yes | 0.839702 | 0.194534 | Yes
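The per-country normality checks reported in Tables A1 and A2 can be reproduced with SciPy's Shapiro–Wilk implementation. A minimal sketch on hypothetical scores (the published survey responses are not reproduced here):

```python
from scipy.stats import shapiro

# Hypothetical Likert-style scores for one question in one country;
# the actual survey responses are not reproduced here.
scores = [4, 6, 7, 5, 8, 6, 5, 7, 6, 9]

w_stat, p_value = shapiro(scores)

# Decision rule used in Tables A1 and A2: normality is accepted
# when the p-value exceeds the 0.05 significance level.
normality = "Yes" if p_value > 0.05 else "No"
print(f"W = {w_stat:.6f}, p = {p_value:.6f}, normality: {normality}")
```

The same test is run separately per question and per country, which is why the tables report one (W, p-value) pair for each cell.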
Figure A2. Q-Q plot of residuals.

Appendix C

Table A3. Levene’s test for validation of homogeneity of variances for 20 questions of the survey (p-value > 0.05).
Question | Spain | Romania | Italy | Sweden | Greece | Germany | Lithuania | Overall | Levene’s Test Statistic | Levene’s Test p-Value | Homogeneity
Q1 | 2.952381 | 6.619048 | 0.500000 | 2.916667 | 2.131868 | 0.250000 | 6.000000 | 3.721254 | 1.640097 | 0.165415 | Yes
Q2 | 2.571429 | 6.619048 | 0.000000 | 3.333333 | 3.362637 | 0.666667 | 5.666667 | 3.942509 | 1.725253 | 0.144158 | Yes
Q3 | 2.666667 | 6.238095 | 0.000000 | 5.666667 | 3.412088 | 2.250000 | 2.916667 | 3.997677 | 1.058790 | 0.405442 | Yes
Q4 | 8.333333 | 6.666667 | 0.500000 | 6.333333 | 3.494505 | 3.583333 | 5.583333 | 5.027294 | 0.754531 | 0.610160 | Yes
Q5 | 5.904762 | 4.476190 | 0.500000 | 5.666667 | 3.758242 | 0.666667 | 1.583333 | 4.840883 | 0.829477 | 0.555211 | Yes
Q6 | 5.238095 | 2.904762 | 0.500000 | 3.333333 | 2.835165 | 5.666667 | 3.000000 | 3.865273 | 0.768821 | 0.599509 | Yes
Q7 | 4.238095 | 4.952381 | 2.000000 | 4.916667 | 2.527473 | 0.916667 | 4.916667 | 3.930314 | 0.367364 | 0.894622 | Yes
Q8 | 4.571429 | 5.285714 | 2.000000 | 8.333333 | 1.346154 | 2.250000 | 3.000000 | 3.816492 | 2.286003 | 0.057563 | Yes
Q9 | 3.619048 | 2.904762 | 0.000000 | 2.916667 | 3.609890 | 1.666667 | 6.666667 | 3.983740 | 1.494975 | 0.208598 | Yes
Q10 | 4.000000 | 1.810000 | 0.500000 | 8.667000 | 2.951000 | 2.000000 | 8.250000 | 4.063000 | 1.023930 | 0.426171 | Yes
Q11 | 6.670000 | 5.570000 | 0.500000 | 4.920000 | 3.450000 | 4.000000 | 8.250000 | 4.320000 | 0.614851 | 0.716864 | Yes
Q12 | 3.905000 | 1.905000 | 0.500000 | 8.667000 | 2.374000 | 4.917000 | 6.250000 | 3.503000 | 0.533000 | 0.884000 | Yes
Q13 | 3.619000 | 2.905000 | 2.000000 | 2.000000 | 3.346000 | 7.583000 | 5.583000 | 4.007000 | 0.604000 | 0.725000 | Yes
Q14 | 6.238000 | 6.952000 | 0.000000 | 4.667000 | 6.527000 | 2.333000 | 8.333000 | 5.928000 | 0.559000 | 0.784000 | Yes
Q15 | 5.571429 | 5.285714 | 0.500000 | 10.250000 | 4.686813 | 2.000000 | 5.666667 | 5.292102 | 0.781004 | 0.590487 | Yes
Q16 | 5.238100 | 6.904800 | 8.000000 | 8.666700 | 4.131900 | 4.333300 | 6.916700 | 5.356600 | 0.403400 | 0.871700 | Yes
Q17 | 5.619048 | 2.238095 | 2.000000 | 4.916667 | 3.456044 | 0.916667 | 3.333333 | 4.192799 | 0.622752 | 0.710774 | Yes
Q18 | 4.570000 | 5.810000 | 2.000000 | 8.670000 | 4.070000 | 0.670000 | 3.330000 | 4.670000 | 0.864975 | 0.530078 | Yes
Q19 | 3.571429 | 5.238095 | 2.000000 | 8.250000 | 3.719780 | 1.583333 | 4.666667 | 4.527294 | 0.395607 | 0.876791 | Yes
Q20 | 4.476190 | 7.476190 | 0.000000 | 3.000000 | 3.719780 | 4.666667 | 2.250000 | 4.987224 | 1.096671 | 0.383788 | Yes
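The homogeneity-of-variances check behind Table A3 corresponds to Levene's test over the per-country response groups for each question. A minimal sketch with hypothetical group data (the actual responses are not reproduced here):

```python
from scipy.stats import levene

# Hypothetical per-country score groups for one survey question;
# the real responses behind Table A3 are not reproduced here.
spain = [4, 6, 7, 5, 8, 6]
romania = [3, 7, 6, 5, 9, 4, 6]
sweden = [5, 6, 7, 6]

stat, p_value = levene(spain, romania, sweden)

# Homogeneity of variances is accepted when p > 0.05,
# a prerequisite for the ANOVA reported in Table 3.
homogeneity = "Yes" if p_value > 0.05 else "No"
print(f"statistic = {stat:.6f}, p = {p_value:.6f}, homogeneity: {homogeneity}")
```

Group sizes may differ across countries, as in the study; `scipy.stats.levene` handles unbalanced groups directly.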

References

  1. Taliento, M. The Triple Mission of the Modern University: Component Interplay and Performance Analysis from Italy. World 2022, 3, 489–512.
  2. Compagnucci, L.; Spigarelli, F. The Third Mission of the university: A systematic literature review on potentials and constraints. Technol. Forecast. Soc. Chang. 2020, 161, 120284.
  3. Stolze, A.; Sailer, K. Advancing HEIs’ Third-Mission through Dynamic Capabilities: The Role of Leadership and Agreement on Vision and Goals. J. Technol. Transf. 2022, 47, 580–604.
  4. Kesten, A. Analysis of the Missions of Higher Education Institutions within the Scope of Third Mission Understanding. Int. J. Educ. Methodol. 2019, 5, 387–400.
  5. Brundenius, C.; Göransson, B. The Three Missions of Universities: A Synthesis of UniDev Project Findings. In Universities in Transition: Insight and Innovation in International Development; Göransson, B., Brundenius, C., Eds.; Springer: Berlin/Heidelberg, Germany, 2011; pp. 329–352.
  6. European Union. Communication on a European Strategy for Universities; Publications Office: Luxembourg, 2022; pp. 1–20. Available online: https://education.ec.europa.eu/sites/default/files/2022-01/communication-european-strategy-for-universities-graphic-version.pdf (accessed on 21 January 2023).
  7. European Commission. Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee, and the Committee of the Regions on a European Strategy for Universities. COM(2022) 16 Final, 2022; pp. 1–16. Available online: https://education.ec.europa.eu/document/commission-communication-on-a-european-strategy-for-universities (accessed on 17 February 2023).
  8. Husiev, O.; Arrien, O.U.; Enciso-Santocildes, M. What does Horizon 2020 contribute to? Analysing and visualising the community practices of Europe’s largest research and innovation programme. Energy Res. Soc. Sci. 2023, 95, 102879.
  9. Salomaa, M.; Charles, D. The University Third Mission and the European Structural Funds in Peripheral Regions: Insights from Finland. Sci. Public Policy 2021, 48, 352–363.
  10. UNESCO Institute for Lifelong Learning. Policy Brief: The Contribution of Higher Education Institutions to Lifelong Learning; UNESCO: Hamburg, Germany, 2022; p. 12. Available online: https://unesdoc.unesco.org/ark:/48223/pf0000381924 (accessed on 5 February 2023).
  11. Kuipers-Dirven, R.; Janssen, M.; Hoekman, J. Assessing University Policies for Enhancing Societal Impact of Academic Research: A Multicriteria Mapping Approach. Res. Eval. 2022, 31, 136–148.
  12. OECD. Benchmarking Higher Education System Performance; Engagement with the Wider World; OECD Publishing: Paris, France, 2019; Chapter 7.
  13. Boffo, S.; Cocorullo, A. University Fourth Mission, Spin-Offs and Academic Entrepreneurship: Connecting Public Policies with New Missions and Management Issues of Universities. High. Educ. Forum 2019, 16, 125–142. Available online: https://eric.ed.gov/?id=EJ1308034 (accessed on 11 November 2023).
  14. Blasi, B.; Romagnosi, S.; Ancaiani, A.; Malgarini, M.; Momigliano, S. A New Method for Evaluating Universities’ Third Mission Activities in Italy: Case Study Contribution to the OECD TIP Knowledge Transfer and Policies Project. 2019. Available online: https://stip.oecd.org/assets/TKKT/CaseStudies/11.pdf (accessed on 29 January 2023).
  15. Healy, A.; Perkmann, M.; Goddard, J.; Kempton, L. Measuring the Impact of University Business Cooperation; Final Report; Publications Office of the European Union: Luxembourg, 2014. Available online: https://eprints.ncl.ac.uk/file_store/production/210569/222589CF-A500-4014-83FA-96561181E783.pdf (accessed on 8 February 2023).
  16. Peake, W.O.; Potter, P.W. University Strategy, Accreditation Standards, and the Applied Education Mission: Using the Student Consulting to Build Bridges. Small Bus. Inst. J. 2022, 18, 10–22.
  17. Heaton, S.; Siegel, D.S.; Teece, D.J. Universities and innovation ecosystems: A dynamic capabilities perspective. Ind. Corp. Chang. 2019, 28, 921–939.
  18. Vidicki, P.; Vrgović, P.; Stevanov, B.; Medić, N. Framework for Measuring Innovation Performance in Higher Education Institutions. Teh. Vjesn. 2023, 30, 68–79.
  19. Cosenz, F. Adopting a Dynamic Performance Governance Approach to Frame Interorganizational Value Generation Processes into a University Third Mission Setting. In Governance and Performance Management in Public Universities; Caperchione, E., Bianchi, C., Eds.; SIDREA Series in Accounting and Business Administration; Springer: Cham, Switzerland, 2022; pp. 87–108.
  20. Salomaa, M. Third Mission and Regional Context: Assessing Universities’ Entrepreneurial Architecture in Rural Regions. Reg. Stud. Reg. Sci. 2019, 6, 233–249.
  21. Sánchez-Barrioluengo, M.; Benneworth, P. Is the Entrepreneurial University Also Regionally Engaged? Analysing the Influence of University’s Structural Configuration on Third Mission Performance. Technol. Forecast. Soc. Chang. 2019, 141, 206–218.
  22. Farnell, T.; Ilić, B.Ć. Towards a European Framework for Community Engagement in Higher Education. In Socially Responsible Higher Education: International Perspectives on Knowledge Democracy; Hall, B., Tandon, R., Eds.; Brill: Leiden, The Netherlands, 2021; pp. 253–264.
  23. Petersen, I.; Kruss, G.; van Rheede, N. Strengthening the University Third Mission through Building Community Capabilities alongside University Capabilities. Sci. Public Policy 2022, 49, 890–904.
  24. O’Brien, E.; Ćulum Ilić, B.; Veidemane, A.; Dusi, D.; Farnell, T.; Šćukanec Schmidt, N. Towards a European Framework for Community Engagement in Higher Education: A Case Study Analysis of European Universities. Int. J. Sustain. High. Educ. 2022, 23, 815–830.
  25. Gruber, A.M. Community Engagement in Higher Education: Online Information Sources. Coll. Res. Libr. News 2017, 78, 563.
  26. López, B. How Higher Education Promotes the Integration of Sustainable Development Goals—An Experience in the Postgraduate Curricula. Sustainability 2022, 14, 2271.
  27. Petersen, I.; Kruss, G. Universities as Change Agents in Resource-Poor Local Settings: An Empirically Grounded Typology of Engagement Models. Technol. Forecast. Soc. Chang. 2021, 167, 120693.
  28. Preradović, M.N.; Čalić, M.; Van Overbeeke, P.S.M. Rural 3.0: A Case Study of University-Community Engagement Through Rural Service-Learning in Croatia. J. High. Educ. Outreach Engag. 2022, 26, 117. Available online: https://files.eric.ed.gov/fulltext/EJ1342742.pdf (accessed on 7 November 2023).
  29. Bell, K.; Reed, M. The Tree of Participation: A New Model for Inclusive Decision-Making. Community Dev. J. 2022, 57, 595–614.
  30. O’Farrell, L.; Hassan, S.; Hoole, C. The University as a Just Anchor: Universities, Anchor Networks and Participatory Research. Stud. High. Educ. 2022, 47, 2405–2416.
  31. France-Harris, A.; Burton, C.; Mooney, M. Putting Theory into Practice: Incorporating a Community Engagement Model into Online Pre-Professional Courses in Legal Studies and Human Resources Management. Online Learn. 2019, 23, 21–39.
  32. Tijsma, G.; Horn, A.; Urias, E.; Zweekhorst, M.B.M. Training Students in Inter- and Transdisciplinary Sustainability Education: Nurturing Cross-Faculty Staff Commitment and Continuous Community Collaboration. Int. J. Sustain. High. Educ. 2022, 24, 765–787.
  33. Iniesto, F.; Charitonos, K.; Littlejohn, A. A Review of Research with Co-Design Methods in Health Education. Open Educ. Stud. 2022, 4, 273–295.
  34. Museus, S.D.; Shiroma, K. Understanding the Relationship between Culturally Engaging Campus Environments and College Students’ Academic Motivation. Educ. Sci. 2022, 12, 785.
  35. Mourad, R. Deliberative Democracy in Higher Education: The Role of Critical Spaces Across Universities. J. Deliberative Democr. 2022, 1, 27–41.
  36. Bacevic, J. Beyond the Third Mission: Toward an Actor-Based Account of Universities’ Relationship with Society. In Universities in the Neoliberal Era; Ergül, H., Coşar, S., Eds.; Palgrave Critical University Studies: London, UK, 2017; pp. 21–39.
  37. Kennedy, J.; Pek, S. Mini-Publics, Student Participation, and Universities’ Deliberative Capacity. Stud. High. Educ. 2023, 48, 63–82.
  38. Fremstad, E.; Bergh, A.; Solbrekke, T.D.; Fossland, T. Deliberative academic development: The potential and challenge of agency. Int. J. Acad. Dev. 2020, 25, 107–120.
  39. Lopes, J.M.; Oliveira, M.; Oliveira, J.; Sousa, M.; Santos, T.; Gomes, S. Determinants of the Entrepreneurial Influence on Academic Entrepreneurship—Lessons Learned from Higher Education Students in Portugal. Educ. Sci. 2021, 11, 771.
  40. Kovbasiuk, I.; Lowe, W.; Ericsson, M.; Wingkvist, A. Quick Decide—A tool to aid the analytic hierarchy process for group decisions. In Perspectives in Business Informatics Research; Matulevicius, R., Dumas, M., Eds.; Springer: Cham, Switzerland, 2015; pp. 179–196.
  41. Galarza-Molina, S.L.; Torres, A.; Moura, P.; Lara-Borrero, J. CRIDE: A Case Study in Multi-Criteria Analysis for Decision-Making Support in Rainwater Harvesting. Int. J. Inf. Technol. Decis. Mak. 2015, 14, 43–67.
  42. Lane, E.F.; Verdini, W.A. A consistency test for AHP decision makers. Decis. Sci. 1989, 20, 575–590.
  43. Goepel, K.D. BPMSG’s AHP Online System—Rational Decision Making Made Easy, Business Performance Management Singapore. Available online: https://bpmsg.com/ahp/docs/BPMSG-AHP-OS.pdf (accessed on 24 November 2023).
Figure 1. Distributions of scores for the public engagement data set.
Figure 2. AHP for 20 questions on community engagement in European universities: (a) AHP Decision matrix; (b) AHP Ranking.
Figure 3. AHP consolidated result for all 20 questions on community engagement in European universities.
Figure 4. General framework for promoting academic community engagement.
Table 1. European universities which participated in the conducted study.
Country | No. of Universities | Universities
Germany | 4 | University of Stuttgart; Münster University of Applied Sciences—FH Münster; Deggendorf Institute of Technology; Martin Luther University Halle-Wittenberg
Greece | 7 | University of Thessaly; Harokopio University; Panteion University; Aristotle University of Thessaloniki; University of the Aegean; University of Patras; National Technical University of Athens
Italy | 2 | University of Bolzano; University of Firenze
Lithuania | 3 | Vilnius University, Faculty of Communication; SMK University of Applied Sciences; Kazimieras Simonavičius University
Malta | 1 | University of Malta
Portugal | 1 | University of Minho, Institute of Education
Romania | 7 | University of Bucharest, Faculty of Foreign Languages and Literatures; Carol Davila University of Medicine and Pharmacy, Faculty of Dentistry; Transylvania University of Brașov, Faculty of Materials Science and Engineering; Bucharest University of Economic Studies, Faculty of Management; Ferdinand I Military Technical Academy; Craiova University, Faculty of Engineering and Management of Technological Systems; Constantin Brancusi University of Targu Jiu, Faculty of Engineering
Spain | 6 | Santiago de Compostela University; University of Jaen; University of Valladolid; Universidad Autónoma de Madrid; University of Seville, Department of Developmental and Educational Psychology; Pablo de Olavide University
Sweden | 4 | Södertörn University; KTH Royal Institute of Technology; University West; Umeå University
Table 2. Question set used for survey in public engagement.
ID | Question
Q1 | Is the university’s commitment to public engagement clearly defined?
Q2 | Is the commitment to public engagement well documented?
Q3 | Does the university ensure that the documented commitment to public engagement is also publicly known and understood?
Q4 | Are people at different levels of the university responsible for implementing the public engagement agenda?
Q5 | Does the university currently make adequate investments to encourage public engagement?
Q6 | Does the university offer incentives and rewards to promote public engagement?
Q7 | Does the university offer training activities to support public engagement?
Q8 | Does the university integrate external services into its portfolio of services to promote public engagement?
Q9 | Does the university have clearly defined target groups for its (community) public engagement activities?
Q10 | Does the university use up-to-date (e.g., didactic) methods and approaches to develop public engagement skills among students?
Q11 | Does the university integrate public engagement practices into degree programs?
Q12 | Does the university promote interdisciplinary educational paths?
Q13 | Does the university compare and identify the needs of its external stakeholders?
Q14 | Does the university use indicators to measure its activities and public engagement results (of the community)?
Q15 | Does the university ensure that the results of the impact assessment of public engagement activities are used for future planning and organizational development?
Q16 | Does the university communicate the results of the assessment on the impact of its public engagement activities inside and outside the institution?
Q17 | Does the university influence (community) engagement at local and regional levels?
Q18 | Does the university create a social impact from public engagement activities?
Q19 | Has the university defined the kind of impact it aims to create through public engagement?
Q20 | Does the university integrate (community) stakeholders into the institution’s leadership?
Table 3. ANOVA on public engagement data set.
Source of Variation | SS | df | MS | F | p-Value | F Crit
Rows | 2102.727 | 43 | 48.90063 | 23.51994 | 3.6 × 10⁻¹¹⁴ | 1.394538
Columns | 113.1636 | 19 | 5.955981 | 2.864672 | 4.31 × 10⁻⁵ | 1.599272
Error | 1698.636 | 817 | 2.079114 | | |
Total | 3914.527 | 879 | | | |
Cronbach’s Alpha = 0.957483
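Table 3 is a two-factor ANOVA without replication (rows = 44 respondents, columns = 20 questions, hence df = 43, 19, and 43 × 19 = 817 for the error term), together with Cronbach's Alpha for internal consistency. Both can be computed from the respondent-by-question score matrix; a sketch on synthetic data, since the raw responses are not reproduced here:

```python
import numpy as np

# Hypothetical respondents x questions score matrix; the real 44 x 20
# data set behind Table 3 is not reproduced here.
rng = np.random.default_rng(0)
data = rng.integers(1, 11, size=(44, 20)).astype(float)
n_rows, n_cols = data.shape

# Two-factor ANOVA without replication, as in Table 3.
grand_mean = data.mean()
ss_rows = n_cols * ((data.mean(axis=1) - grand_mean) ** 2).sum()
ss_cols = n_rows * ((data.mean(axis=0) - grand_mean) ** 2).sum()
ss_total = ((data - grand_mean) ** 2).sum()
ss_error = ss_total - ss_rows - ss_cols

df_rows, df_cols = n_rows - 1, n_cols - 1
df_error = df_rows * df_cols
ms_error = ss_error / df_error
f_rows = (ss_rows / df_rows) / ms_error
f_cols = (ss_cols / df_cols) / ms_error

# Cronbach's Alpha for internal consistency (0.957483 in the study).
item_vars = data.var(axis=0, ddof=1)
total_var = data.sum(axis=1).var(ddof=1)
alpha = n_cols / (n_cols - 1) * (1 - item_vars.sum() / total_var)
print(f"F(rows) = {f_rows:.3f}, F(cols) = {f_cols:.3f}, alpha = {alpha:.3f}")
```

On the synthetic matrix the F statistics and alpha will of course differ from the published values; the point is the structure of the computation, not the numbers.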
Table 4. Standard deviation and variance for the 20-question data set regarding public engagement.
Question ID | Count | Sum | Average | Variance | Std. Dev.
Q1 | 44 | 211 | 4.795 | 3.701 | 1.924
Q2 | 44 | 202 | 4.591 | 4.108 | 2.027
Q3 | 44 | 176 | 4.000 | 4.047 | 2.012
Q4 | 44 | 186 | 4.227 | 5.110 | 2.261
Q5 | 44 | 183 | 4.159 | 4.928 | 2.220
Q6 | 44 | 166 | 3.773 | 3.901 | 1.975
Q7 | 44 | 168 | 3.818 | 3.966 | 1.992
Q8 | 44 | 153 | 3.477 | 3.790 | 1.947
Q9 | 44 | 189 | 4.295 | 4.120 | 2.030
Q10 | 44 | 188 | 4.273 | 4.296 | 2.073
Q11 | 44 | 176 | 4.000 | 4.419 | 2.102
Q12 | 44 | 207 | 4.705 | 3.469 | 1.862
Q13 | 44 | 204 | 4.636 | 3.958 | 1.989
Q14 | 44 | 160 | 3.636 | 5.027 | 2.242
Q15 | 44 | 177 | 4.023 | 5.465 | 2.338
Q16 | 44 | 167 | 3.795 | 5.236 | 2.288
Q17 | 44 | 192 | 4.364 | 4.423 | 2.103
Q18 | 44 | 194 | 4.409 | 4.619 | 2.149
Q19 | 44 | 171 | 3.886 | 4.615 | 2.148
Q20 | 44 | 174 | 3.955 | 5.207 | 2.282
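The columns of Table 4 are mutually redundant (Average = Sum / Count, Std. Dev. = √Variance), which allows a quick internal consistency check of any row. A sketch using the published figures for Q1:

```python
import math

# Published Table 4 figures for Q1: count, sum, average, variance, std. dev.
count, total, average, variance, std_dev = 44, 211, 4.795, 3.701, 1.924

# Average should equal sum/count, and the standard deviation should be
# the square root of the variance, up to rounding to three decimals.
assert abs(total / count - average) < 1e-3
assert abs(math.sqrt(variance) - std_dev) < 1e-3
print("Q1 row is internally consistent")
```

The same two checks can be looped over all twenty rows to validate a transcription of the table.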
Table 5. Correlation of coefficients.
     Q1   Q2   Q3   Q4   Q5   Q6   Q7   Q8   Q9   Q10  Q11  Q12  Q13  Q14  Q15  Q16  Q17  Q18  Q19  Q20
Q1   1.00
Q2   0.73 1.00
Q3   0.77 0.84 1.00
Q4   0.39 0.46 0.42 1.00
Q5   0.49 0.55 0.61 0.66 1.00
Q6   0.47 0.53 0.64 0.65 0.74 1.00
Q7   0.46 0.53 0.56 0.53 0.56 0.66 1.00
Q8   0.43 0.50 0.49 0.61 0.54 0.56 0.36 1.00
Q9   0.46 0.50 0.59 0.48 0.67 0.72 0.55 0.62 1.00
Q10  0.46 0.53 0.63 0.55 0.59 0.62 0.46 0.49 0.71 1.00
Q11  0.43 0.57 0.60 0.60 0.47 0.61 0.69 0.38 0.47 0.54 1.00
Q12  0.12 0.24 0.17 0.27 0.13 0.35 0.33 0.31 0.37 0.35 0.52 1.00
Q13  0.22 0.37 0.30 0.40 0.46 0.56 0.57 0.36 0.56 0.37 0.59 0.65 1.00
Q14  0.31 0.54 0.44 0.51 0.49 0.55 0.52 0.51 0.47 0.35 0.62 0.60 0.67 1.00
Q15  0.32 0.55 0.52 0.51 0.58 0.54 0.66 0.37 0.54 0.55 0.66 0.52 0.70 0.85 1.00
Q16  0.47 0.55 0.61 0.53 0.58 0.64 0.66 0.47 0.53 0.46 0.72 0.45 0.67 0.74 0.77 1.00
Q17  0.48 0.35 0.40 0.44 0.50 0.61 0.57 0.41 0.47 0.35 0.41 0.26 0.48 0.40 0.48 0.53 1.00
Q18  0.59 0.51 0.60 0.53 0.64 0.64 0.71 0.41 0.58 0.58 0.64 0.34 0.54 0.62 0.76 0.74 0.69 1.00
Q19  0.52 0.55 0.62 0.63 0.61 0.69 0.76 0.54 0.64 0.64 0.72 0.46 0.56 0.70 0.80 0.74 0.57 0.87 1.00
Q20  0.41 0.29 0.36 0.45 0.54 0.63 0.41 0.37 0.42 0.31 0.45 0.37 0.55 0.51 0.50 0.59 0.50 0.61 0.54 1.00
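Table 5 shows only the lower triangle of the question-by-question Pearson correlation matrix, since the matrix is symmetric with a unit diagonal. Such a matrix can be computed in one call; a sketch on synthetic data, as the raw 44 × 20 response matrix is not reproduced here:

```python
import numpy as np

# Hypothetical respondents x questions matrix; Table 5 reports the
# pairwise Pearson coefficients of the real 44 x 20 data set.
rng = np.random.default_rng(1)
data = rng.integers(1, 11, size=(44, 20)).astype(float)

# np.corrcoef treats rows as variables, so transpose to correlate questions.
corr = np.corrcoef(data.T)

# Only the lower triangle carries information, as in Table 5.
lower = np.tril(corr)
print(np.round(lower[:3, :3], 2))
```

Reporting only the lower triangle is purely a presentation choice; the full matrix is recoverable by mirroring it across the diagonal.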
Table 6. Criteria to assign points on the AHP scale for each pairwise comparison.
Points on the AHP Scale | Interval Range for x_ij When Assigning Points on the AHP Scale *
1 | x_ij = 0
2 | ( 0 , max(x_ij) · (n − 8)/(n − 1) + 1/2 ]
3 | ( max(x_ij) · (n − 8)/(n − 1) + 1/2 , max(x_ij) · (n − 7)/(n − 1) + 1/2 ]
4 | ( max(x_ij) · (n − 7)/(n − 1) + 1/2 , max(x_ij) · (n − 6)/(n − 1) + 1/2 ]
5 | ( max(x_ij) · (n − 6)/(n − 1) + 1/2 , max(x_ij) · (n − 5)/(n − 1) + 1/2 ]
6 | ( max(x_ij) · (n − 5)/(n − 1) + 1/2 , max(x_ij) · (n − 4)/(n − 1) + 1/2 ]
7 | ( max(x_ij) · (n − 4)/(n − 1) + 1/2 , max(x_ij) · (n − 3)/(n − 1) + 1/2 ]
8 | ( max(x_ij) · (n − 3)/(n − 1) + 1/2 , max(x_ij) · (n − 2)/(n − 1) + 1/2 ]
9 | ( max(x_ij) · (n − 2)/(n − 1) + 1/2 , max(x_ij) · (n − 1)/(n − 1) + 1/2 ]
* n = 9, the maximum value on the AHP scale.
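One reading of the interval scheme in Table 6 is that the 1–9 AHP scale is a uniform binning of the pairwise differences: a zero difference maps to 1, and each subsequent point covers one further step of width max(x_ij)/(n − 1), offset by 1/2. The printed bounds are partially garbled in the source, so the exact lower bounds below are an assumption; a sketch under that reading:

```python
def ahp_points(x, x_max, n=9):
    """Map a pairwise difference x onto the 1-9 AHP scale, following one
    reading of the interval scheme in Table 6. Treating each lower bound
    as the previous point's upper bound is an assumption."""
    if x == 0:
        return 1
    step = x_max / (n - 1)
    for k in range(2, n + 1):
        # Upper bound for point k: max(x_ij) * (k - 1)/(n - 1) + 1/2.
        if x <= step * (k - 1) + 0.5:
            return k
    return n

# Example: with a maximum observed difference of 8, small differences map
# to low scale points and the largest difference maps to 9.
print([ahp_points(d, x_max=8) for d in [0, 1, 4, 8]])
```

Applying this mapping to every pair of question scores yields the pairwise comparison matrix that feeds the AHP decision matrix and ranking shown in Figure 2.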
