Article

Project Management Information Systems (PMISs): A Statistical-Based Analysis for the Evaluation of Software Packages Features

Department of Engineering, University of Palermo, Viale delle Scienze Building 8, 90128 Palermo, Italy
* Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(23), 11233; https://doi.org/10.3390/app112311233
Submission received: 28 October 2021 / Revised: 23 November 2021 / Accepted: 24 November 2021 / Published: 26 November 2021
(This article belongs to the Collection Human Factors in the Digital Society)

Abstract

Project Managers (PMs) working in competitive markets find Project Management Information Systems (PMISs) useful for planning, organizing and controlling projects of varying complexity. A wide variety of PMIS software is available, suitable for projects differing in scope and user needs. This paper identifies the most useful features found in PMISs. An extensive literature review and an analysis of commercial software packages are first carried out to identify the main features of PMISs. The resulting list is then reduced by a panel of project management experts, and a statistical analysis is performed on data acquired by means of two different surveys. The relative importance of the listed features is computed, and the interactions between respondents’ profiles and PMIS features are investigated by cluster and respondents’ analyses. The paper provides information for researchers and practitioners interested in PMIS packages and their applications. Furthermore, the analyses may help practitioners when choosing a PMIS, as well as developers of PMIS software in understanding user needs.

1. Introduction

Nowadays, Project Managers (PMs) deal with increasingly complex projects from both organizational and technological perspectives [1]. This complexity is even higher when a multiplicity of conflicting goals has to be achieved [2] and/or limited resources are available [3,4]. PMs often have to make fast decisions—rapidly adapting to changes in the project management field—and to satisfy all specifications with the aim of delivering projects on time and within the planned budget [5,6]. In this context, Project Management Information Systems (PMISs) are useful tools and techniques to gather, integrate and disseminate the outputs of project management processes [7], supporting PMs in planning, organizing and controlling projects of different complexity. As a result, a wide range of PMIS software is available on the market. Unlike customized solutions, commercial PMIS packages are ready-to-use mass products aimed at generic users [8,9,10], and choosing among them is not a trivial task. Aiming to obtain an exhaustive list of features commonly offered by PMIS software, literature contributions in the field were reviewed, and the main commercial PMIS software packages were analyzed. Afterwards, the collected features were examined by the authors, on the one hand, and by a group of experts, on the other hand, in order to exclude redundant functionalities and/or to merge similar ones. In this way, a reduced list of features was obtained, and a two-level hierarchical structure—comprising features and sub-features—was developed to offer a synthetic and easier overall view of the problem. To evaluate the relative importance of the selected features/sub-features, two surveys were designed—the first based on the ranking scale method and the second on Design of Experiments (DoE) and conjoint analysis—and distributed to a representative panel of experts in the field. Finally, a statistical-based analysis was performed to weigh the selected features, and the interaction between respondents’ profiles and PMIS features was also investigated by cluster and respondents’ analyses, for the first and second surveys, respectively.
Although several literature contributions deal with the application of PMISs and the evaluation of their usefulness and performance [11,12,13,14,15,16,17,18,19,20,21,22,23,24], few of them are concerned with determining the fundamental features that a PMIS should have and assessing their relative importance. In the authors’ opinion, the analysis of both the main commercial PMIS software and the scientific contributions in the field provides a structured framework offering an overall view of the state-of-the-art. In addition, the outcomes of the performed quantitative analysis may be used by researchers and practitioners to make proper decisions when choosing and/or developing PMIS software. Therefore, the purpose of the present work is threefold and may be summarized through the following Research Questions (RQs):
RQ1. What are the main features that PMIS software should contain?
RQ2. What is the relative importance of selected features?
RQ3. Is there any interaction between the relative importance of the identified features and the respondent’s profile?
The remainder of the paper is organized as follows. The literature review is presented in Section 2, and the methodological approach is described in Section 3. Results and discussion are synthesized in Section 4. Conclusions close the work in Section 5.

2. Literature Review

Nowadays, PMISs are increasingly used by PMs as decision-aiding support tools. Accordingly, several contributions deal with the application of PMISs and the evaluation of their usefulness and performance, also focusing on the determination of the basic features required by PMIS users. In [11], the impact of PMISs on PMs and project performance is assessed. In particular, responses to questionnaires administered to thirty-nine PMs are analyzed to highlight the positive influence of PMISs on planning, scheduling, monitoring and controlling. In addition, the authors emphasize how important the reporting feature of PMISs is in providing project information and progress over time. To support the design of PMIS software, a REFerence information MODel for enterprise-wide Project Management (Ref-ModPM) is proposed by [12]. The model includes all project management processes (e.g., planning, controlling and coordinating), and it is suitable for both single- and multi-project management, helping to identify the basic requirements that commercial software should have. In [13], the author demonstrates the successful contribution of PMIS use in every phase of the project life cycle through a conceptual model tested on data acquired from 170 experienced respondents. Referring to a multi-project environment, Caniels and Bakens [14] report PMIS features that facilitate the decision-making process, and point out relationships between PMIS information quality and PMs’ satisfaction with PMISs. Results arising from a survey are used to test the hypotheses by means of a Partial Least Squares (PLS) regression analysis. Referring to construction projects, Lee and Yu [15] propose a model that considers different aspects such as system, information and service quality, intention of PMIS use, and user satisfaction to assess the impact of PMISs on project management. In [16], the authors claim that the choice of a PMIS depends on both the sector in which the company operates and the type of project to be managed. With reference to the shipbuilding industry, the paper proposes an integrated approach consisting of a PMIS to plan and schedule project activities, Project Data Management (PDM) to handle documents and share know-how, and a Control Tower (CT) to monitor and manage events and broadcast messages among the different actors. Referring to an Indian construction project, Wale et al. [17] compare results obtained with Microsoft Excel to those arising from the use of Microsoft Project, thus highlighting the greater time and cost involved in using Excel. By distributing questionnaires to contractors, consultants, engineers and academic researchers, Obodoh et al. [18] analyze the impact of PMISs on project failure rates in the Nigerian construction industry. Frequency analysis and the Chi-Squared test are used on the collected data to demonstrate the positive contribution of PMISs to project success. With regard to ERP projects, Nguyen et al. [19] perform Exploratory Factor Analysis (EFA) and regression analysis to demonstrate that users’ satisfaction is strongly influenced by friendliness, functional information, and support–service quality.
On the other hand, few contributions specifically focus on determining the basic features required by users of PMIS software. In [20], the definition and the conceptual development of a Smart Project Management Information System (SPMIS) are given. The authors provide an overview of the principal commercial PMIS software available worldwide and describe the 25 main features it should have (e.g., initiation, planning and scheduling, resource management, control, risk and documents management). In their opinion, most of the analyzed software does not meet the requirements of today’s PMs. Referring to the construction sector, Liberatore et al. [21] highlight that the interviewed professionals are mainly interested in activity/resource scheduling, resource levelling and net present value tools. In particular, the authors synthesize the analytical techniques commonly used during project planning and/or monitoring and show that the most applied technique is critical path analysis, followed by resource scheduling/levelling and Earned Value (EV). In the same field, Nitithamyong and Skibniewski [22] identify a list of features and analyze their impact on the success/failure of web-based construction project management systems with the use of Application Service Providers (ASPs). Liberatore and Pollack-Johnson [23] analyze and evaluate the factors that influence the use, the type of use (planning versus planning and control) and the choice of PMIS software. In [24], the main functionalities of PMISs are extracted from the analysis of 18 commercial software packages. The authors identify eight criteria, 28 sub-criteria and 44 sub-sub-criteria, whose relative importance is assessed by means of the Analytic Hierarchy Process (AHP). The analysis confirms the importance of planning and control, whereas a final ranking among the software packages is obtained by a direct evaluation.

3. Methodological Approach

The multi-step methodological approach implemented to answer the RQs is represented in Figure 1. Aiming to answer RQ1, both a literature review of the scientific contributions in the field (Section 2) and the analysis of the main commercial PMIS software were carried out (Section 3.1). Afterwards, a panel of experts was involved to obtain the final list of PMIS features to be weighted. In this regard, the methodological approaches implemented to answer RQ2 and RQ3 are reported in Section 3.2.1, Section 3.2.2, Section 4.2 and Section 4.3 respectively, along with the discussion of the obtained results.

3.1. PMIS Features and Sub-Features

Besides the literature review of the scientific contributions in the field, the main commercial PMIS software packages available on the market were selected and analyzed, on the basis of the related guidelines, in order to identify and list the features they provide. In this paper, 8 software packages were examined, namely Trello (http://trello.com, 23 November 2021), Basecamp (http://basecamp.com, 23 November 2021), Asana (http://asana.com, 23 November 2021), Microsoft Project (https://support.microsoft.com, 23 November 2021), Huddle (http://www.huddle.com/product, 23 November 2021), Podio (http://podio.com, 23 November 2021), LiquidPlanner (http://www.liquidplanner.com, 23 November 2021) and Wrike (http://www.wrike.com, 23 November 2021). The initial list of PMIS features was then reduced by the authors by removing redundancies and/or grouping similar ones. The surviving functionalities were hierarchically organized to offer a synthetic and easier overall view of the problem, namely they were divided into features and sub-features. Afterwards, a panel of experts in the field (i.e., a full professor, the PM of an international company and a member of the Project Management Institute Southern Italy Chapter—PMI SIC) was involved to analyze the initial hierarchical structure, aiming to further reduce the number of PMIS features to be considered in the subsequent analysis.

3.2. Survey Design and Analysis of Data

In order to evaluate the relative importance of the finally selected features and sub-features, two different surveys were designed. To verify their consistency, simplicity and readability, the surveys were first administered to the panel of involved experts through the web-based application Google Forms. On the basis of the received feedback, the final surveys were designed. In detail, both surveys included a first section to acquire some general information about respondents (e.g., gender, age, hours of project management experience, and work sector), while they differed in the second section, specifically addressed to the computation of the relative importance (i.e., weight) of features and/or sub-features. The two surveys were distributed via the internet to allow faster responses and reduced research costs [11]. Respondents were selected by the panel of involved experts with the aim of covering different aspects such as organization sector, experience, country and so on.
The following two sections detail how the second section of both surveys was designed and the analyses performed on the collected data.

3.2.1. First Survey: Design and Analysis of Data

In the first survey, the evaluation of features and sub-features was performed by the ranking scale method, preferred to the rating one since it forces the respondent to pay more attention to one item over another before deciding which functionality he/she is willing to give up [25,26,27]. Therefore, every respondent was asked to rank features and sub-features from the worst (i.e., to be excluded) to the best ones (i.e., deemed to be absolutely necessary in PMIS software).
In addition, the agglomerative hierarchical clustering algorithm was implemented to group similar respondents according to their responses with regard to every feature/sub-feature. At the beginning of the hierarchical clustering algorithm, every respondent was considered a singleton cluster. Afterwards, a new cluster was built step-by-step by merging the pair of nearest clusters, according to a similarity measure. The procedure was iterated until all respondents were merged into a single cluster or the desired number of clusters was reached [28]. In the present paper, a similarity measure based on the Euclidean distance among responses was used, and the complete linkage approach was applied. The obtained results can be easily visualized using a dendrogram. For every dendrogram, the number of clusters was identified by cutting it at a particular height, that is, level of similarity [29]. In general, the cutting level of a dendrogram may be evaluated by observing the heights of the jumps, from the top to the bottom. A large jump means that the two clusters separated at that level are highly dissimilar, that is, a great gain in within-cluster similarity is obtained by keeping them apart. Usually, the highest jump is chosen to obtain a small number of clusters. The obtained number of clusters was then used as input for the k-means method, subsequently implemented to obtain a better clustering of respondents. Starting from the specified number of clusters and setting a centroid for every cluster, the k-means algorithm was used to associate every respondent with the nearest centroid. When all respondents were associated with a cluster, this step was completed and an initial clustering was obtained. New k centroids were then calculated and the association procedure was reiterated. The algorithm was stopped when no more changes to the centroids were observed [30].
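A minimal sketch of this two-stage clustering procedure is shown below. The respondents’ rankings are assumed to be stored row-wise in a NumPy array; the synthetic data and variable names are illustrative only, not taken from the study, which used Minitab© for the dendrograms.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage
from sklearn.cluster import KMeans

# Illustrative data: one row per respondent, one column per ranked feature
# (the study used the 67 collected responses instead)
rng = np.random.default_rng(0)
rankings = np.argsort(rng.random((67, 8)), axis=1) + 1  # ranks 1..8 per row

# Step 1: agglomerative clustering with Euclidean distance and complete linkage
Z = linkage(rankings, method="complete", metric="euclidean")

# Cut the dendrogram where the largest jump between successive merge heights occurs
heights = Z[:, 2]
jumps = np.diff(heights)
n_clusters = len(rankings) - (np.argmax(jumps) + 1)  # clusters remaining before that merge

# Step 2: refine the partition with k-means, using the number of clusters found above
labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(rankings)
```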
The weight of every feature and sub-feature was calculated as the median value of the respondents’ evaluations, the latter being the ranking position of the feature/sub-feature under investigation, both over the whole dataset and per cluster. Normalizing these values with respect to the sum per column, the relative importance of the features/sub-features was obtained. Finally, the interactions between respondents’ profiles and the PMIS functionality weights were investigated using the χ2 statistic.
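A sketch of this weighting and interaction analysis is given below; the rankings, cluster labels and respondent attribute are synthetic placeholders standing in for the survey data, and the variable names are assumptions made for illustration.

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency

features = ["Activity Planning", "Resource Planning", "Control", "Risk Analysis",
            "Reporting", "Communication Management", "Utility", "Access Permits"]

# Placeholder inputs: per-respondent rankings (1 = least important, 8 = most important)
# and the cluster labels produced by the clustering step
rng = np.random.default_rng(1)
rankings = np.argsort(rng.random((67, 8)), axis=1) + 1
labels = rng.integers(1, 5, size=67)

df = pd.DataFrame(rankings, columns=features)

# Feature weight = median ranking position over all respondents, normalized to sum to one
medians = df.median(axis=0)
weights = medians / medians.sum()

# The same computation per cluster (normalized within each cluster)
per_cluster = df.groupby(labels).median()
per_cluster_weights = per_cluster.div(per_cluster.sum(axis=1), axis=0)

# Interaction between a respondent attribute (e.g., country) and cluster membership
profile = rng.choice(["Italy", "Abroad"], size=67)  # placeholder attribute
chi2, p_value, dof, _ = chi2_contingency(pd.crosstab(labels, profile))
```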

3.2.2. Second Survey: Design and Analysis of Data

In the second survey, Design of Experiments (DoE) was used to develop full functionality packages—called prototypes—to be submitted to respondents. In particular, only features were involved in the design of the prototypes, whereas sub-features were disregarded because of their high number. In order to allow respondents to evaluate the prototypes, a conjoint analysis [31,32,33] was performed. Conjoint analysis allows the determination of whether the presence of a functionality significantly influences the appreciation level of PMIS software or not, also taking into account the interactions that could arise from the simultaneous presence of several functionalities [34]. With regard to DoE, a fractional factorial design of experiments was developed. It provided 2^(k−p) runs (i.e., prototypes), two being the number of levels (i.e., presence or absence of a feature, identified by +1 and −1, respectively), k the number of factors (i.e., the eight features) and p the rate of reduction. A well-known design with eight factors foresees 16 runs, so that the resulting reduction rate p is equal to four. This means that the fractional design only comprised 2^4 = 16 runs rather than the 2^8 = 256 of the full design. The chosen fraction can be selected in many ways, 2^4 ways in this particular case. In this paper, the fractional design was obtained with the Minitab© package (Table 1). It is characterized by the presence of four factors (i.e., level +1) and the absence of the other four (i.e., level −1) for 14 prototypes (from 2 to 15), while the first and the last prototypes (i.e., all features absent or present, respectively) represent the zero and the full scale.
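For illustration, a 2^(8−4) design of this kind can be built from a full 2^4 factorial in four base factors plus four generator columns; the generators E = BCD, F = ACD, G = ABC and H = ABD, with the letter-to-feature assignment indicated in the comments, yield the 16 runs of Table 1. The sketch below is not part of the original study (which used Minitab©); it only reproduces the design matrix.

```python
from itertools import product
import numpy as np

# Base factors (levels -1/+1): A = Activity Planning, B = Resource Planning,
# C = Control, D = Risk Analysis (assignment inferred from Table 1)
base = np.array(list(product([-1, 1], repeat=4)))[:, ::-1]  # A varies fastest
A, B, C, D = base.T

# Generator columns for the remaining four features
E = B * C * D   # Reporting
F = A * C * D   # Communication Management
G = A * B * C   # Utility
H = A * B * D   # Access Permits

design = np.column_stack([A, B, C, D, E, F, G, H])
print(design)   # 16 runs x 8 features, matching Table 1
```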
The 16 runs of Table 1 were the software prototypes considered to evaluate features. In every prototype, the included features were highlighted in yellow whereas the absent ones were in white. An example of a prototype is reported in Figure 2.
Every respondent was asked to rate every prototype on a seven-point scale, also allowing decimal scores. Scores equal to one and seven were given to the first and last prototypes, respectively.
Aiming to single out the features’ weights on the basis of the prototypes’ ratings, a regression model was fitted [35,36,37,38,39]. The input data of this model were the dichotomous variables representing the presence (i.e., level +1) or the absence (i.e., level −1) of a functionality within every prototype. The response variable of the model was the rating of every prototype given by every respondent, and the regression coefficients of the estimated model represented the features’ weights. Only the significant terms were retained after applying a stepwise regression, which selects the significant factors (i.e., features) with respect to their contribution to the total variability of the responses. Finally, on the basis of the methodology proposed by Barone et al. [37,40], the normalized weights of the features were calculated from the estimated regression model coefficients.
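A sketch of how such a conjoint regression could be set up is shown below. The data are placeholders (each record holds one respondent’s rating of one prototype together with the ±1 coding of the eight features), and the formula includes only one illustrative feature-by-respondent interaction; the actual study applied stepwise selection to retain the significant terms.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

features = ["ActivityPlanning", "ResourcePlanning", "Control", "RiskAnalysis",
            "Reporting", "CommunicationManagement", "Utility", "AccessPermits"]

# Long-format placeholder data: 12 respondents x 16 prototypes = 192 ratings;
# 'design' stands in for the 16 x 8 matrix of -1/+1 levels reported in Table 1.
rng = np.random.default_rng(2)
design = rng.choice([-1, 1], size=(16, 8))
records = [{"respondent": f"R{r}", "rating": rng.uniform(1, 7),
            **dict(zip(features, design[p]))}
           for r in range(1, 13) for p in range(16)]
data = pd.DataFrame(records)

# Main effects of the eight features, the respondent factor and one illustrative interaction
formula = ("rating ~ " + " + ".join(features)
           + " + C(respondent) + ActivityPlanning:C(respondent)")
model = smf.ols(formula, data=data).fit()
print(model.summary())

# Feature weights: main-effect coefficients normalized so that they sum to one
coefs = model.params[features]
weights = coefs / coefs.sum()
```

With the real ratings in place of the placeholders, the normalized main-effect coefficients reproduce the weights reported in Table 7.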

4. Results and Discussion

4.1. List of PMIS Features and Sub-Features

One hundred and fifty-two PMIS features were initially identified through the literature review and the analysis of PMIS packages. The hierarchical representation of this initial list is reported in Appendix A, where the description of every feature/sub-feature is also given along with the main references and the software in which they are implemented. With the involvement of the panel of experts, the number of features/sub-features was reduced to 82 by removing redundancies and/or grouping similar ones. In this regard, some features and sub-features were discarded (e.g., “Interdependencies Management” was considered negligible) or merged and/or renamed (e.g., the “Budget Control” and “Time Control” sub-features). The resulting hierarchical structure is reported in Figure 3, whereas detailed and revised descriptions of the features/sub-features are included in Appendix B.

4.2. First Survey

The first survey was administered to 75 professionals, and a total of 67 responses were received. As reported in Section 3.2.1, the number of clusters was identified by cutting the dendrogram obtained for every feature/sub-feature at a particular level of similarity. As an example, the dendrogram of “Resource Planning” obtained with Minitab© is shown in Figure 4. Four clusters were chosen, the cutting level being the one represented by the orange dashed line; the four clusters are differently colored.
Table 2 shows the number of obtained clusters and the number of respondents per cluster. The relative importance of every feature—total and per cluster—is shown in Table 3.
On the basis of the last column of Table 3, “Activity Planning” and “Access Permits” were found to have the highest and lowest importance, respectively. On the other hand, different opinions were observed when comparing the feature weights related to every cluster. Therefore, the interactions between respondents’ profiles (see Appendix B) and the PMIS functionality weights were verified using the χ2 statistic. In particular, taking into account all 67 responses, the distribution of respondents in clusters 3 and 4 was observed to be significantly different from the others with respect to software usage and best software (Figure 5a,b). This means that these clusters contain a large share of respondents who do not use PMIS software or for whom a best software does not exist. Figure 5c shows that the relative frequency of foreign respondents in clusters 3 and 4 is reversed; in detail, cluster 3 was characterized by a higher proportion of foreign respondents.
Weights obtained for “Activity Planning”, “Resource Planning”, “Control”, “Risk Analysis”, “Reporting”, “Communication Management” and “Utility” are reported in Table 4.
Considering the χ2 statistic for the sub-features of “Activity Planning”, “Resource Planning”, “Control”, “Reporting”, “Communication Management” and “Utility”, no significant interactions between weights and clusters were observed. As a consequence, the corresponding histograms are not reported. On the other hand, a significant interaction between the clusters of “Risk Analysis” and respondents’ education was observed (Figure 6), owing to the higher relative frequency of educated respondents in cluster 2 than in the other clusters. The corresponding weights per cluster are given in Table 5.

4.3. Second Survey

The second survey was administered to 15 experts to rate the 16 prototypes reported in Appendix B. Twelve responses were received, for a total of 192 evaluations. As detailed in Section 3.2.2, a regression model was fitted. After using a stepwise regression to select the significant factors (i.e., features) with respect to their contribution to the total variability of the responses, only the significant terms of the regression model are reported in Table 6. The stepwise procedure highlighted the significance of all main factors, while the interactions among them were eliminated. In addition, the significance of some interactions between features and respondents was demonstrated.
The residual analysis (Figure 7)—performed on the 192 ratings—confirmed the acceptability of the normality assumption on which the regression model is based. The goodness of fit was quite high (R² = 83.91%; adjusted R² = 76%).
Figure 8 only shows the scenarios in which significant interactions between features and respondents were observed. For instance, respondents 4 and 12 gave higher ratings to prototypes when “Activity Planning” was included. In detail, respondents 4 and 12 are both male, use PMIS software and do not have any certification. With regard to “Control”, respondent 5 gave a rating opposite to the others, while providing a marked increase with regard to “Resource Planning”. Respondent 5 works in an IT company, uses software and is trained in project management. Regarding “Communication Management”, respondents 6 and 12—having more than 10,000 h of project management experience—provided a rating opposite to that of the other respondents.
As suggested by Barone et al. [37,40], the normalized weights of features (Table 7) were calculated from the previously estimated regression model coefficients.
Since the input factors are characterized by the same dispersion (every column of Table 1 contains eight +1 and eight −1 values), the regression coefficients can be combined directly, that is, without any standardization. Therefore, the weights were calculated by normalizing the coefficients so that they sum to one.
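In formula form, the weight of feature i is its estimated main-effect coefficient divided by the sum of all eight coefficients; using the values reported in Table 7 as a worked example:

```latex
w_i = \frac{\hat{\beta}_i}{\sum_{j=1}^{8}\hat{\beta}_j}, \qquad
w_{\text{Activity Planning}}
  = \frac{0.750}{0.750+1.250+1.750+1.224+0.339+1.250+0.328+0.286}
  = \frac{0.750}{7.177} \approx 0.104 .
```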
Differently from the first survey, the second one returned the highest importance for “Control”, while “Activity Planning” took the fifth position. Both methods assigned low weights to “Utility” and “Access Permits”, even though “Reporting” also had a low weight in this second analysis.

5. Conclusions

Today’s markets are highly competitive, dynamic and uncertain. As a consequence, Project Managers (PMs) increasingly need the support of tools specifically developed to assist them during all phases of the project life cycle. In this regard, the market offers a wide variety of Project Management Information Systems (PMISs), suitable for projects differing in scope and user needs. Aiming to obtain an exhaustive list of features commonly shared by PMISs and to answer RQ1, a literature review and an analysis of the main commercial PMIS software were performed. A list of 82 features was identified and then analyzed by a panel of involved experts to further reduce it. Finally, eight features and 24 sub-features were chosen because they were deemed fundamental to every PMIS. To obtain the relative importance of the listed features and sub-features, and to answer RQ2, a statistical analysis was performed on data acquired by way of two different surveys, properly designed and administered online to a selected number of experts. To answer RQ3, the interaction between respondents’ profiles and the relative importance of the listed PMIS features was also investigated by cluster and respondents’ analyses, for the first and second surveys, respectively. The analysis performed highlights that “Utility” and “Access Permits” have a lower importance than the other features. On the other hand, “Activity Planning” and “Control” are the most important features according to the first and second surveys, respectively. In the first survey, “Reporting” and “Communication Management” have higher and lower weights, respectively, than in the second one. Sub-features were analyzed only in the first survey, and their relative importance was properly computed.
In the authors’ opinion, the analysis of both the main PMIS packages and the scientific contributions in the field may represent a structured framework offering an overall view of the state-of-the-art. On the other hand, the outcomes of the performed quantitative analysis may be used by researchers and practitioners to make proper decisions when choosing and/or developing PMIS software. In this regard, a possible future development may concern the evaluation of different PMIS software packages, based on the list of features/sub-features and the related weights arising from the presented analysis. However, the choice among PMIS software could also rely on some general characteristics such as cost, usability, and scalability. As a consequence, a further research line may be addressed to comparing PMIS software from this perspective.

Author Contributions

Conceptualization, R.M. and C.M.L.F.; methodology, A.L.; validation, R.M., G.L.S. and C.M.L.F.; formal analysis, R.M., G.L.S. and C.M.L.F.; investigation, A.L.; data curation, A.L., G.L.S.; writing—original draft preparation, R.M. and C.M.L.F.; writing—review and editing, R.M. and C.M.L.F.; visualization, R.M. and C.M.L.F.; supervision, R.M., G.L.S. and C.M.L.F.; project administration, R.M., G.L.S. and C.M.L.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Initial List of Features and Sub-Features

Feature | Sub-Feature | Description/Aim | References | Software
1. Activity planning
It includes all those tools for project activities planning and scheduling
1.1 Critical Path Method (CPM) | Project planning and scheduling | [11,12,13,16,17,18,19,24,41] | Asana; MS Project; Podio
1.2 WBS | Hierarchical decomposition of the project | [11,12,13,24,41,42] | Asana; MS Project; Wrike
1.3 Gantt chart | Project planning and identification of critical paths | [11,13,16,18,20,24,41] | MS Project; Wrike
1.4 Milestones | Representation of projects by means of milestones | [12,20,24,41,42] | Basecamp; Asana; MS Project
1.5 Intelligent Programs | Update of activities and projects completion times when priorities and resources change | | Liquidplanner
2. Resource planning
(all features for project resources management)
2.1 Allocation of Resources | Planning of project resources | [12,16,17,18,24,41,42,43] | Asana; MS Project; Liquidplanner
2.2 Balancing of Resources | Balancing of resources, i.e., overload elimination | [14,16,17,24,41] | MS Project; Liquidplanner; Wrike
2.3 Critical Chain Project Method (CCPM) | Planning and scheduling of activities considering the resources’ availability | [17] | MS Project (ProChain)
2.4 Cost Management | Planning and managing of project costs | [11,12,13,16,17,19,42,43] | MS Project; Podio; Liquidplanner
2.5 Calendar | Development and customization of projects and resources calendars | [12,22,24,41,43] | Trello; Asana; MS Project; Podio
2.6 Resource Chart | Visualization of resources workload | [12,16,17,18,24,41,42] | Asana; MS Project; Podio
2.7 Resource Breakdown Structure (RBS) | Hierarchical structure of resources by category and type | [24] | Asana; MS Project; Huddle; Podio
2.8 Stakeholders Directory | Recording all information on team members and software users (e.g., telephone number, address, email, etc.) in a single directory | [22,24] |
3. Control
(all features needed to control budget, work and project results)
3.1 Performance Tracking | Project monitoring by comparing the current performance with the planned one | [16,17,20,42] | Trello; MS Project; Podio
3.2 Budget Control | Comparing the actual cost values with the planned ones | [20,24,42] | MS Project; Liquidplanner
3.3 Time Control | Comparing the actual work values with the planned ones | [16,20,24] | Trello; Asana; MS Project; Podio; Liquidplanner; Wrike
3.4 Travel Cost | Monitoring of travel expenses | [24] |
3.5 Quality Management and Validate | Inclusion of procedures for checking project results (e.g., check whether all requirements are met) | [43] |
4. Risk Analysis
(all features for identifying, evaluating, monitoring and managing risks and issues)
4.1 PERT | Performing the PERT analysis to compute the project time to completion when activities durations are random variables | [11,13,18,20] |
4.2 Simulation | Allowing to compare costs and timescales of different project scenarios | [24] | MS Project (RiskyProject)
4.3 Risk Management | Supporting the project risks management (e.g., SWOT analysis, creation of risk register, etc.) | [11,12,13,14,16,19,20,24,41,42] |
4.4 Issue Management | Supporting the project issues management (e.g., creation of issue register, etc.) | [24,42] |
4.5 Incident Management | Association of a problem with one or more persons | [24] |
4.6 Claim Management | Allowing the claim management | [12] |
5. Reporting
(all features for reporting i.e., report, version tracking)
5.1 Document Management | Managing and/or archiving documents | [11,12,13,19,22,43] |
5.2 Report | Development of standard and/or customized reports | [11,14,20,42,43] | Asana; MS Project; Liquidplanner; Wrike
5.3 Import/Export Data | Import/export of data in different formats | [24] | Asana; MS Project
5.4 Version Tracking | Documents tracking and recording | [22,24] | MS Project; Huddle; Wrike
5.5 Archiving Project Information | Archiving of project and team information | [22,24] | Asana
6. Communication management
(all features for project communication management)
6.1 Communication | Facilitating communication, collaboration and information sharing among members | [11,12,13,16,18,43] | Trello; Basecamp; Asana; MS Project; Huddle; Podio
6.2 E-mail | Stakeholders’ communication by emails to stay up to date | [16,24] | Trello
6.3 Chat | Facilitating synchronous communication among team members | [24] | Trello; Basecamp; Asana; Podio
6.4 Communication Group | Structured platform to facilitate stakeholders communication | [24] |
6.5 Forum | Facilitating asynchronous communication among team members | [22,24] | Podio
6.6 Messages Outside the System | Sending information to a participant who is not connected to the internet | [22,24] |
6.7 Video & Audio | Allowing to make video and audio calls | [20] | Podio
6.8 RSS Feed | Access to online contents in a standardized and computer-readable format | [24] | Asana; MS Project; Podio
6.9 WIKI | Access to a free website where information can be inserted, updated or modified | [24] | Asana
6.10 Automatic Check-in | Allowing the team to ask periodic questions (e.g., progress of an activity) | | Basecamp
6.11 Activity Comments | Allowing to add notes/comments to activities | | Trello; Asana; Podio; MS Project; Liquidplanner
6.12 Mention | Allowing to mention colleagues needed to complete the work through instantly displayed warning messages | | Asana; Liquidplanner; Wrike
6.13 Guests | Allowing to communicate with the external stakeholders of a project (e.g., suppliers, contractors, partners, etc.) | | Asana
6.14 Followers | Allowing to add team members as followers | | Asana
6.15 Calendars and Emails Synchronizing | Allowing to automatically transform the content of e-mails into activities to be carried out | | Wrike
7. Utility
(all utility features i.e., to do list, filters and customized fields)
7.1 To Do List | Visualizing the list of things to be done on a specific day | [24,41] | Trello; Basecamp; MS Project; Podio
7.2 Filters | Performing advanced searches on project documents (i.e., filters, sorting, grouping) | [22,24,43] | Asana; MS Project; Podio; Wrike
7.3 Customized Fields | Allowing to customize different fields (e.g., calendars, views, tables, filters, etc.) | [22,24] | Trello; Asana; MS Project; Podio; Wrike
7.4 Contacts List | Visualizing external (i.e., clients, suppliers, etc.) and internal (i.e., team members, etc.) contacts lists | [42] |
7.5 Procurement Management | Visualizing the updated list of procurements | [22] |
7.6 External Tools Integrations | Allowing to add software features | [20,21] | Trello; Basecamp; Asana; MS Project; Huddle; Podio; Liquidplanner; Wrike
7.7 Guide and Technical Support | Providing help and support to users | [24] | Asana; MS Project; Wrike
7.8 Mathematical Calculations | Allowing recording, reporting and computing of numerical values | | MS Project; Podio
7.9 Reminder | Displaying reminder messages (e.g., list of things to be done or commitments on a specific day) | [24] | MS Project
7.10 Multi-language Support | Installing and using different languages | [24] |
7.11 Evaluation Sheets | Filling evaluation sheets on suppliers, clients, project members, etc. | [12,20] |
7.12 Rules | Setting rules to automate important actions | | Asana
7.13 Project Copy | Copying projects | | Asana
7.14 PDF and/or XPS | Saving electronic files in PDF and/or XPS formats | | MS Project
7.15 Customized Brand | Customizing the project format by inserting the brand, personalized colours, etc. | | MS Project
7.16 Automatic Completion | Allowing to obtain suggestions on activities/resources names, dependencies, etc. | | MS Project
7.17 Main Projects | Grouping projects into a single master project | | MS Project
7.18 Multi-level Elimination | Allowing to delete multiple commands | | MS Project
7.19 Text | Allowing to add text | | Podio
7.20 Preview | Visualizing previews | | Huddle
8. Customer information
(all features for record and manage customer information)
8.1 Customer Information | Saving customer information and classifying it into categories, thus personalizing the messages to be sent | [24] |
9. Access permits
(all features that allow establishing rules for accessing the database and project documents according to the user’s role)
9.1 Access Permits | Establishing rules for having access to the projects database and documents | [24,42] | Trello; Asana; MS Project; Huddle; Wrike
9.2 Central Registers and Audit Control | Allowing to know who had access to project files, from where and when, which files were downloaded and when, what changes were made to the files, etc. It can also include the use of digital signatures | [24] | Huddle
9.3 Integrated Protection | Allowing to remotely delete project data if a device is lost or stolen or if the revocation of user access is needed | | Huddle
9.4 Mobile PIN | Allowing to add a PIN for mobile devices | | Huddle; Wrike
9.5 Offline Access | Offline access to software contents | [22,24] |
10. Type of software
(all features that refer to the software installation mode)
10.1 WEB based | Web access via intranet or internet networks | [16,18,24,43] | Trello; Basecamp; Asana; MS Project; Huddle; Podio; Liquidplanner; Wrike
10.2 Desktop | Access via a personal computer | [24] | Asana; MS Project; Huddle
10.3 Mobile | Mobile access via apps | [24] | Trello; Basecamp; Asana; MS Project; Huddle; Podio; Liquidplanner; Wrike
11. Type of license
(all features that refer to the software type of license)
11.1 Proprietary | Use of the PMIS software upon payment of a license, which generally allows installation of the software on the user’s workstation | [24] |
11.2 Software as a Service (SaaS) | Use of the PMIS software upon payment of a monthly fee, which allows the user to use the software via an internet connection | [24] |
11.3 Open Source | The software house provides the source code, which can be modified by the user | [24] |
12. User interface
(all features that allow to store information on the project’s users)
12.1 Project Dashboard | Displaying the project’s performance and progress at a given time/period | [24,43] | Trello; Basecamp; Asana; MS Project; Wrike
12.2 Personal Dashboard | Customization of the project’s dashboard | [24] | Huddle; Liquidplanner; Wrike
12.3 Multiple Dashboards | Developing and visualizing multiple dashboards | [24] |
13. Interdependencies management
(all features that allow to manage the interdependencies between activities of different projects that share the same resources)
13.1 Interdependencies Among Resources | Verification of interdependencies among resources | [24] | MS Project
13.2 Interdependencies Among Activities | Verification of interdependencies among activities | [24] | Asana
13.3 Cross-project Visibility | Instantaneous visualization of projects’ progress, risks and budgets | | Asana; Liquidplanner

Appendix B

GENERAL INFORMATION
  • Gender: (Male; Female)
  • How old are you? (≤30; 31–40; 41–50; >50)
  • What country do you work in?
  • Do you use or have you ever used project management software? (YES; NO) (If yes, what was the best performing software?)
  • How many hours have you been involved in project management activities? (0; 1–1000; 1001–3000; 3001–10,000; >10,001)
  • Do you have a project management certification? (YES; NO) (If yes, which one? If you have more than one certification, list all of them)
  • Have you ever taught in this field? (YES; NO) (If yes, approximately, how many hours?)
  • Have you trained in this field? (YES; NO) (If yes, approximately, how many hours?)
  • Do you work in the public or private sector? (Public sector; Private sector)
  • Specify the sector: (Health care, Services, Civil construction, IT, Manufacturing, Finance, Energy, Others) (If other, which one?)
  • Approximately, what is the company’s turnover?
CRITERIA DESCRIPTION
  • Activity Planning (features for project activities planning and scheduling)
  • Resource Planning (features for project resources management)
  • Control (features needed to control budget, work and project results)
  • Risk Analysis (features for identifying, evaluating, monitoring and managing risks and issues)
  • Reporting (features for reporting, i.e., report, version tracking and dashboard)
  • Communication Management (features for project communication management)
  • Utility (utility features, i.e., to do list, filters and customized fields)
  • Access Permits (it allows to establish rules for having access to projects database and documents)
SURVEY 1
Please order the following criteria/sub-criteria from the least important (i.e., the criterion/sub-criterion you are most easily willing to give up) to the most important (i.e., the criterion/sub-criterion you are not willing to give up).
FEATURE
  • Activity Planning
  • Resource Planning
  • Control
  • Risk Analysis
  • Reporting
  • Communication Management
  • Utility
  • Access Permits
ACTIVITY PLANNING SUB-FEATURE
  • CPM—This function allows performing the Critical Path Method (i.e., calculation of characteristic times, delays, critical path, etc.)
  • WBS—This function allows the hierarchical decomposition of the project
  • Gantt chart—This function allows to develop the bar chart of the project
  • Milestones—This function allows to represent the project through its milestones
RESOURCE PLANNING SUB-FEATURE
  • Allocation of resources—This function allows planning the project resources
  • Scheduling with balancing of resources—This function allows balancing resources (i.e., eliminating overloads)
  • CCPM—This function allows the development of the Critical Chain Project Management (i.e., scheduling of activities by identifying critical chains with feed and project buffers)
  • Cost Management—This function allows to plan and manage all project costs (i.e., resources, raw materials, etc.)
  • Calendar—This function includes the design and customization of project and resources calendars
  • Resources Chart—This function allows the visualization of the workload of each resource
CONTROL SUB-FEATURE
  • Progress Control—This feature allows comparing the actual cost and work values with the planned ones
  • Quality Management and Validate—This feature allows to include procedures for checking project results (i.e., view inspection results or check whether all requirements are met)
RISK MANAGEMENT SUB-FEATURE
  • PERT—This feature allows to perform the PERT analysis to get the project time to completion when activities durations are random variables
  • Simulation—In the presence of random costs and times, this function allows to determine the project time to completion through simulation (i.e., Monte Carlo simulation)
  • Risk/Issue management—This function supports the management of the project risks/issues (i.e., SWOT analysis, creation of risk/problem registers, etc..)
REPORTING SUB-FEATURE
  • Report—This function permits the creation of standard and/or customized reports of the project
  • Version Tracking—This function allows to track and record all documents related to the project
  • Dashboard—This feature allows team members to access all project-related information through a graphical, concise and customizable representation.
COMMUNICATION MANAGEMENT SUB-FEATURE
  • E-mail—This feature permits the exchange of emails among team members
  • Chat/Forum—This feature facilitates synchronous/asynchronous communication among team members
  • Video & Audio—This feature allows to make audio and video calls
UTILITY SUB-FEATURE
  • To do list—This feature allows to visualize the list of things to be done on a specific day
  • Filters—This feature allows to perform advanced searches on project documents (i.e., filters, sorting, grouping)
  • Customized fields—This feature allows to customize different fields (i.e., calendars, views, tables, filters, etc.)
SURVEY 2
Please, assign a score from 1 to 7 (also decimal) in accordance with your preference for every prototype.
For every prototype, the eight features (Activity Planning, Resource Planning, Control, Risk Analysis, Reporting, Communication Management, Utility, Access Permits) were listed, with the included ones highlighted as in Figure 2.
1° PROTOTYPE—Score: 7
2° PROTOTYPE—Score: 1
3° PROTOTYPE—Score: ________
4° PROTOTYPE—Score: ________
5° PROTOTYPE—Score: ________
6° PROTOTYPE—Score: ________
7° PROTOTYPE—Score: ________
8° PROTOTYPE—Score: ________
9° PROTOTYPE—Score: ________
10° PROTOTYPE—Score: ________
11° PROTOTYPE—Score: ________
12° PROTOTYPE—Score: ________
13° PROTOTYPE—Score: ________
14° PROTOTYPE—Score: ________
15° PROTOTYPE—Score: ________
16° PROTOTYPE—Score: ________

References

  1. Baccarini, D. The concept of project complexity—A review. Int. J. Proj. Manag. 1996, 14, 201–204. [Google Scholar] [CrossRef] [Green Version]
  2. Williams, T.M. The need for new paradigms for complex projects. Int. J. Proj. Manag. 1999, 17, 269–273. [Google Scholar] [CrossRef] [Green Version]
  3. Maylor, H.; Brady, T.; Cooke-Davies, T.; Hodgson, D. From projectification to programmification. Int. J. Proj. Manag. 2006, 24, 663–674. [Google Scholar] [CrossRef]
  4. Elonen, S.; Artto, K.A. Problems in managing internal development projects in multi-project environments. Int. J. Proj. Manag. 2003, 21, 395–402. [Google Scholar] [CrossRef] [Green Version]
  5. Awe, O.A.; Church, E.M. Project flexibility and creativity: The moderating role of training utility. Manag. Decis. 2020, in press. [CrossRef]
  6. Annosi, M.C.; Marchegiani, L.; Vicentini, F. Knowledge translation in project portfolio decision-making: The role of organizational alignment and information support system in selecting innovative ideas. Manag. Decis. 2020, 58, 1929–1951. [Google Scholar] [CrossRef]
  7. A Guide to the Project Management Body of Knowledge; PMBOK Guide Sixth Edition; Project Management Institute (PMI), Inc.: Newtown Square, PA, USA, 2017.
  8. van Fenema, P.C.; Koppius, O.R.; van Baalen, P.J. Implementing packaged enterprise software in multi-site firms: Intensification of organizing and learning. Eur. J. Inf. Syst. 2007, 16, 584–598. [Google Scholar] [CrossRef] [Green Version]
  9. Mitlöhner, J.; Koch, S. Software project effort estimation with voting rules. Decis. Support Syst. 2009, 46, 895–901. [Google Scholar]
  10. Sen, R.; Singh, S.S.; Borle, S. Open source software success: Measures and analysis. Decis. Support Syst. 2012, 52, 364–372. [Google Scholar] [CrossRef]
  11. Raymond, L.; Bergeron, F. Project Management information systems: An empirical study of their impact on project manager and project success. Int. J. Proj. Manag. 2008, 26, 213–220. [Google Scholar] [CrossRef] [Green Version]
  12. Ahlemann, F. Towards a conceptual reference model for project management information systems. Int. J. Proj. Manag. 2009, 27, 19–30. [Google Scholar] [CrossRef]
  13. Karim, A.J. Project Management Information Systems (PMIS) factors: An empirical study of their impact on Project Management Decision Making (PMDM) performance. Res. J. Econ. Bus. ICT 2011, 2, 22–27. [Google Scholar]
  14. Caniels, M.C.J.; Bakens, R.J.J.M. The effects of Project Management Information Systems on decision making in a multi project environment. Int. J. Proj. Manag. 2012, 30, 162–175. [Google Scholar] [CrossRef]
  15. Lee, S.K.; Yu, J.H. Success model of project management information system in construction. Autom. Constr. 2012, 25, 82–93. [Google Scholar] [CrossRef]
  16. Braglia, M.; Frosolini, M. An integrated approach to implement Project Management Information System within the Extended Enterprise. Int. J. Proj. Manag. 2014, 32, 18–29. [Google Scholar] [CrossRef]
  17. Wale, P.M.; Jain, N.D.; Godhani, N.R.; Beniwal, S.R.; Mir, A.A. Planning and Scheduling of Project using Microsoft Project (case study of a building in India). IOSR J. Mech. Civ. Eng. 2015, 5, 57–63. [Google Scholar]
  18. Obodoh, D.A.; Mbanusi, E.C.; Obodoh, C.M. Impact of Project Management Software on the Project Failure Rates in Nigerian Construction Industry. Int. J. Sci. Eng. Appl. Sci. 2016, 2, 358–367. [Google Scholar]
  19. Nguyen, T.D.; Nguyen, D.T.; Nguyen, T.M. Information System Success: The Project Management Information System for ERP Projects. In Lecture Notes of the Institute for Computer Sciences; Springer: Cham, Switzerland, 2016. [Google Scholar]
  20. Jaafari, A.; Manivong, K. Towards a smart project management information system. Int. J. Proj. Manag. 1998, 16, 249–265. [Google Scholar]
  21. Liberatore, M.J.; Pollack-Johnson, B.; Smith, A.C. Project Management in construction: Software use and research directions. J. Constr. Eng. Manag. 2001, 127, 101–107. [Google Scholar] [CrossRef]
  22. Nitithamyong, P.; Skibniewski, M.J. Web-based construction project management systems: How to make them successful? Autom. Constr. 2004, 13, 491–506. [Google Scholar] [CrossRef]
  23. Liberatore, M.J.; Pollack-Johnson, B. Factors influencing the usage and selection of project management software. IEEE Trans. Eng. Manag. 2003, 50, 164–174. [Google Scholar] [CrossRef]
  24. Enea, M.; Muriana, C. An AHP-based approach to PMISs assessment. Int. J. Bus. Environ. 2014, 7, 32–60. [Google Scholar] [CrossRef]
  25. Hino, A.; Imai, R. Ranking and Rating: Neglected Biases in Factor Analysis of Postmaterialist Values. Int. J. Public Opin. Res. 2019, 31, 368–381. [Google Scholar] [CrossRef] [Green Version]
  26. Reynolds, T.J.; Jolly, J.P. Measuring personal values: An evaluation of alternative methods. J. Mark. Res. 1980, 17, 531–536. [Google Scholar] [CrossRef]
  27. Krosnick, J.A. Maximizing questionnaire quality. In Measures of Political Attitudes: Volume 2 of Measures of Social Psychological Attitudes; Robinson, J.P., Shaver, P.R., Wrightsman, L.S., Eds.; Academic Press: San Diego, CA, USA, 1999; pp. 37–57. [Google Scholar]
  28. Roux, M. A Comparative Study of Divisive and Agglomerative Hierarchical Clustering Algorithms. J. Classif. 2018, 35, 345–366. [Google Scholar] [CrossRef] [Green Version]
  29. Akalin, A. Computational Genomics with r; Chapman and Hall/CRC: New York, NY, USA, 2020. [Google Scholar]
  30. Bora, D.J.; Gupta, A.K. Effect of Different Distance Measures on the Performance of K-Means Algorithm: An Experimental Study in Matlab. Int. J. Comput. Sci. Inf. Technol. 2014, 5, 2501–2506. [Google Scholar]
  31. Acito, F.; Jain, A.K. Evaluation of Conjoint Analysis Results: A Comparison of Methods. J. Mark. Res. 1980, 17, 106–112. [Google Scholar] [CrossRef]
  32. Saaty, T.L. Axiomatic foundation of the analytic hierarchy process. Manag. Sci. 1986, 32, 841–855. [Google Scholar] [CrossRef]
  33. Fischer, G.W.; Luce, M.F.; Jia, J.; Frances, M.; Jianmin, L.; Carolina, N. Attribute conflict and preference uncertainty: Effects on judgment time and error. Manag. Sci. 2000, 46, 88–103. [Google Scholar] [CrossRef]
  34. Souza, J.P.E.; Alves, J.M.; Damiani, J.H.S.; Silva, M.B. Design of Experiments: Its importance in the efficient Project Management. In Proceedings of the 22nd International Conference on Production Research, Iguassu Falls, Brazil, 28 July–1 August 2013. [Google Scholar]
  35. Barron, F.H.; Barrett, B.E. Decision quality attribute using weights ranked. Manag. Sci. 1996, 42, 1515–1523. [Google Scholar] [CrossRef]
  36. Kwong, C.K.; Bai, H. Determining the importance weights for the customer requirements in QFD using a fuzzy AHP with an extent analysis approach. IIE Trans. 2003, 35, 619–626. [Google Scholar] [CrossRef]
  37. Barone, S.; Lombardo, A.; Tarantino, P. A weighted logistic regression for conjoint analysis and kansei engineering. Qual. Reliab. Eng. Int. 2007, 23, 689–706. [Google Scholar] [CrossRef]
  38. Barone, S.; Lombardo, A.; Tarantino, P. A heuristic method for estimating attribute importance by measuring choice time in a ranking task. Risk Decis. Anal. 2012, 3, 225–237. [Google Scholar] [CrossRef]
  39. Barone, S.; Errore, A.; Lombardo, A. Prioritization of Alternatives with AHP Plus Response Latency and Web Surveys. Qual. Total Qual. Manag. Bus. Excell. 2013, 25, 953–965. [Google Scholar] [CrossRef]
  40. Barone, S.; Lombardo, A. Service quality design through a smart use of conjoint analysis. Asian J. Qual. 2004, 5, 34–42. [Google Scholar] [CrossRef]
  41. Chowdeswari, C.; Satish Chandra, D.; Asadi, S. Optimal planning and scheduling of high rise buildings. Int. J. Civ. Eng. Technol. 2017, 8, 312–324. [Google Scholar]
  42. Taniguchi, A.; Onosato, M. Use of Project Management Information System to Initiate the Quality Gate Process for ERP Implementation. Int. J. Inf. Technol. Comput. Sci. 2017, 12, 1–10. [Google Scholar] [CrossRef] [Green Version]
  43. Zambare, P.; Dhawale, A. Project Management Information System in construction industry. Int. J. Eng. Sci. Res. Technol. 2017, 6, 54–60. [Google Scholar]
Figure 1. Methodological approach.
Figure 2. Prototype example.
Figure 3. Hierarchical structure.
Figure 4. Dendrogram related to “Resource Planning”.
Figure 5. Histograms related to features.
Figure 6. Histograms related to “Risk Analysis” sub-features.
Figure 7. Residual analysis.
Figure 8. Interactions’ plots between features and respondents.
Table 1. Fractional factorial design of experiments for features.
Prototype | Activity Planning | Resource Planning | Control | Risk Analysis | Reporting | Comm. Manag. | Utility | Access Permits
1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1
2 | +1 | −1 | −1 | −1 | −1 | +1 | +1 | +1
3 | −1 | +1 | −1 | −1 | +1 | −1 | +1 | +1
4 | +1 | +1 | −1 | −1 | +1 | +1 | −1 | −1
5 | −1 | −1 | +1 | −1 | +1 | +1 | +1 | −1
6 | +1 | −1 | +1 | −1 | +1 | −1 | −1 | +1
7 | −1 | +1 | +1 | −1 | −1 | +1 | −1 | +1
8 | +1 | +1 | +1 | −1 | −1 | −1 | +1 | −1
9 | −1 | −1 | −1 | +1 | +1 | +1 | −1 | +1
10 | +1 | −1 | −1 | +1 | +1 | −1 | +1 | −1
11 | −1 | +1 | −1 | +1 | −1 | +1 | +1 | −1
12 | +1 | +1 | −1 | +1 | −1 | −1 | −1 | +1
13 | −1 | −1 | +1 | +1 | −1 | −1 | +1 | +1
14 | +1 | −1 | +1 | +1 | −1 | +1 | −1 | −1
15 | −1 | +1 | +1 | +1 | +1 | −1 | −1 | −1
16 | +1 | +1 | +1 | +1 | +1 | +1 | +1 | +1
Table 2. Obtained clusters and respondents.
Feature/Sub-Feature | Number of Respondents per Cluster
Features level | Cluster 1 = 37; Cluster 2 = 21; Cluster 3 = 5; Cluster 4 = 4
Activity Planning sub-feature | Cluster 1 = 17; Cluster 2 = 17; Cluster 3 = 16; Cluster 4 = 17
Resource Planning sub-feature | Cluster 1 = 31; Cluster 2 = 15; Cluster 3 = 12; Cluster 4 = 9
Control sub-feature | Cluster 1 = 48; Cluster 2 = 19
Risk Management sub-feature | Cluster 1 = 27; Cluster 2 = 33; Cluster 3 = 7
Reporting sub-feature | Cluster 1 = 24; Cluster 2 = 36; Cluster 3 = 18
Communication Management sub-feature | Cluster 1 = 37; Cluster 2 = 12; Cluster 3 = 18
Utility sub-feature | Cluster 1 = 46; Cluster 2 = 13; Cluster 3 = 8
Table 3. Total features weights and per cluster.
Feature | Cluster 1 | Cluster 2 | Cluster 3 | Cluster 4 | TOT
Activity Planning | 0.229 | 0.189 | 0.235 | 0.141 | 0.221
Resource Planning | 0.171 | 0.163 | 0.206 | 0.051 | 0.167
Control | 0.171 | 0.135 | 0.059 | 0.115 | 0.167
Risk Analysis | 0.143 | 0.135 | 0.088 | 0.090 | 0.139
Reporting | 0.114 | 0.108 | 0.118 | 0.116 | 0.111
Communication Management | 0.086 | 0.189 | 0.088 | 0.167 | 0.111
Utility | 0.057 | 0.054 | 0.029 | 0.141 | 0.056
Access Permits | 0.029 | 0.027 | 0.177 | 0.179 | 0.028
Table 4. Weights of sub-features.
Feature | Sub-Feature | Weight
Activity Planning | CPM | 0.181
Activity Planning | WBS | 0.273
Activity Planning | Gantt Chart | 0.273
Activity Planning | Milestones | 0.273
Resource Planning | Allocation of resources | 0.239
Resource Planning | Scheduling with balancing of resources | 0.190
Resource Planning | CCPM | 0.143
Resource Planning | Cost Management | 0.190
Resource Planning | Calendar | 0.143
Resource Planning | Resource chart | 0.095
Control | Progress control | 0.667
Control | Quality Management and Validate | 0.333
Risk Analysis | PERT | 0.285
Risk Analysis | Simulation | 0.285
Risk Analysis | Risk/Issue Management | 0.430
Reporting | Report | 0.285
Reporting | Version Tracking | 0.285
Reporting | Dashboard | 0.430
Communication Management | E-mail | 0.500
Communication Management | Chat/Forum | 0.333
Communication Management | Video & Audio | 0.167
Utility | To do list | 0.500
Utility | Filters | 0.333
Utility | Customized Fields | 0.167
Table 5. Total and relative weights of “Risk Analysis” sub-features.
Risk Analysis | Cluster 1 | Cluster 2 | Cluster 3 | TOT
PERT | 0.333 | 0.167 | 0.333 | 0.285
Simulation | 0.167 | 0.333 | 0.500 | 0.285
Risk/Issue Management | 0.500 | 0.500 | 0.167 | 0.430
Table 6. Regression model.
Source | DF | Adj SS | Adj MS | F-Value | p-Value
Regression | 63 | 478.624 | 7.5972 | 10.60 | 0.000
Activity Planning | 1 | 2.250 | 2.2500 | 3.14 | 0.079
Resource Planning | 1 | 6.250 | 6.2500 | 8.72 | 0.004
Control | 1 | 12.250 | 12.2500 | 17.09 | 0.000
Risk Analysis | 1 | 71.908 | 71.9076 | 100.32 | 0.000
Reporting | 1 | 5.501 | 5.5013 | 7.67 | 0.006
Communication Management | 1 | 6.250 | 6.2500 | 8.72 | 0.004
Utility | 1 | 5.168 | 5.1680 | 7.21 | 0.008
Access Permits | 1 | 3.939 | 3.9388 | 5.50 | 0.021
Interviewed | 11 | 32.715 | 2.9741 | 4.15 | 0.000
Activity Planning × Interviewed | 11 | 17.514 | 1.5922 | 2.22 | 0.017
Resource Planning × Interviewed | 11 | 15.077 | 1.3706 | 1.91 | 0.043
Control × Interviewed | 11 | 18.764 | 1.7058 | 2.38 | 0.010
Communication Management × Interviewed | 11 | 14.566 | 1.3242 | 1.85 | 0.053
Error | 128 | 91.750 | 0.7168 | |
Total | 191 | 570.374 | | |
Model Summary: S = 0.846639; R² = 83.91%; adjusted R² = 76%
Table 7. Features weights.
Feature | Regression Coefficient | Weight
Activity Planning | 0.750 | 0.104
Resource Planning | 1.250 | 0.175
Control | 1.750 | 0.243
Risk Analysis | 1.224 | 0.171
Reporting | 0.339 | 0.047
Communication Management | 1.250 | 0.174
Utility | 0.328 | 0.046
Access Permits | 0.286 | 0.040
