Article

A Conceptual Framework for Data Sensemaking in Product Development—A Case Study

Faculty of Technology, Natural Sciences and Maritime Sciences, Campus Kongsberg, University of South-Eastern Norway, 3616 Kongsberg, Norway
*
Author to whom correspondence should be addressed.
Technologies 2023, 11(1), 4; https://doi.org/10.3390/technologies11010004
Submission received: 24 October 2022 / Revised: 15 December 2022 / Accepted: 17 December 2022 / Published: 22 December 2022
(This article belongs to the Special Issue Human-Centered Cyber-Physical Systems)

Abstract:
The industry acknowledges the value of using data and digitalization approaches to improve their systems. However, companies struggle to use data effectively in product development. This paper presents a conceptual framework for Data Sensemaking in Product Development, exemplified through a case study of an Automated Parking System. The work is grounded in systems engineering, human-centered design, and data science theory. The resulting framework applies to practitioners and researchers in the early phase of product development. The framework combines conceptual models and data analytics, spanning the range from human judgment and decision-making to verification. The case study and feedback from several industrial actors suggest that the framework is valuable, usable, and feasible for more effective use of data in product development.

1. Introduction

Large-scale, highly complex systems being developed in telecom, space, transportation, and energy industries depend on systems with connections that trigger emergent behaviors. High-tech industries must adapt quickly to shifting market needs. Customers and users desire new, integrated systems that live up to their increased quality expectations [1,2,3].
Engineers find it challenging to comprehend and visualize the behavior of highly complex systems, especially before the production phase [4,5]. Such systems tend to be linked to numerous internal and external sources of data, information, and knowledge. Internal sources are what the company possesses and are readily available in its organization. External sources are those owned and generated by external actors outside the product development team’s boundaries. Examples of external sources are environmental, supplier, and social media data. In large companies, boundaries can arise as early as the department level, where data, information, and knowledge are possessed by other groups within the organization. For instance, if a team in the test department needs failure data from customers, the team must contact the aftermarket department. These data are considered external to the test department. Such data rarely reach the designers, developers, and engineers, even though they have the potential to accelerate the quality and speed of product development [6]. Commercial actors find it difficult to derive business and customer value from the myriad of digital data, initiatives, and possibilities [6,7,8].
Figure 1 illustrates the product development bridge that utilizes various sources of data, information, and knowledge to develop a product. The internal resources include stored data and information, as well as explicit and tacit knowledge. The external sources are data, information, and knowledge accessible outside the immediate organizational systems. We propose to support the product development process by utilizing internal and external sources with two methods, conceptual modeling and data analytics, shown with dashed lines. The iterative and recursive use of conceptual modeling and data analysis is the foundation of the conceptual framework for data sensemaking in product development.

Research and Case Study Context

The new conceptual framework for data sensemaking is founded on the research project H-SEIF 2. The first stage (H-SEIF 1) paved the way for a Human Systems Engineering Innovation Framework (H-SEIF). H-SEIF 2 builds on that by emphasizing the value of utilizing (big) data and digitalization [9]. The collaborative approach aids designers and engineers in developing cutting-edge technology usable by and for people. The project is a collaboration initiative between nine industrial companies and two universities researching how to enable data-supported early decisions. The project accomplishes this by obtaining unspoken information and turning it into practical knowledge early in the design and construction of systems. The vision is to lower costs and risks during product development, improve performance, and create more appealing products. Artifacts and findings from this research are verified and validated by the industry partners, who seek quicker product development and shorter time-to-market.
This paper uses one of the nine partners to exemplify the conceptual framework. The Company is a small business that provides installation, maintenance, and management of Automated Parking Systems (APS). We selected this company and its product because of its small, controllable organization with a familiar system that other practitioners can relate to. The Company desires to create a condition-based monitoring and predictive maintenance system by collecting data from the APS [10]. Further, the Company needs to deploy more sensors, install infrastructure, and have a method for using the data to achieve its goal. Former research has shown that this Company and the other H-SEIF partners seek a framework to understand the data and information they need to collect and how to utilize it [6,8].
We use best practices for data sensemaking, analytics, and conceptual modeling to elicit a conceptual framework. Additionally, we investigate how such a framework could bring value to the industry. In other words, this study’s contribution aims to develop a framework to aid practitioners in using conceptual modeling and data analysis to improve product development.
The following section introduces the literature on product development, conceptual modeling, and data sensemaking. The subsequent section presents the research method used for this study. Section 4 starts with a list of criteria for a data sensemaking framework and evaluates existing solutions. It then presents the conceptual framework for data sensemaking based on the criteria, followed by a step-by-step walkthrough of a case study from the industry. The paper then discusses the research findings against the criteria for a data sensemaking framework. Finally, the conclusions and future research are presented.

2. Literature

2.1. Early Phase Product Development

Product development transforms a “market opportunity into a product available for sale” [11]. In the early stages of product development, a company can significantly influence the outcome of the final product and its operational phase. In these stages, concepts and components have a low cost [12]. However, the number of unknowns and uncertainties is high [13].
The early stages of product development serve as the foundation for this article. The INCOSE Systems Engineering Handbook summarizes the generic life cycle stages [14]. We define the “early product development phase” as the “Study Period” in typical high-tech commercial systems integrators and manufacturers and the “Project Planning Period” in the US Department of Energy. The early phase aims to define the problem and need spaces, explore the environment, and validate solutions. The main activities in this phase are defining key requirements, stakeholder needs, conceptual design, and specification of the systems. These activities can be translated to understanding key drivers for the business and customer, the system context, and the System of Interest (SOI).

2.2. Conceptual Modeling

We define a conceptual model as a platform consisting of static and dynamic representations [15,16,17] drawn by a person’s observation and realization of the real world [18,19] for others to share the beholders’ understanding [20,21]. We define the art of conceptual modeling as a process that reduces the complexity of systems to a level of abstraction [22] that makes an individual or a group able to understand, explore and validate the System of Interest [17,20,23]. In short, conceptual modeling is the process of creating one or more conceptual models and their related activities concerning the SOI’s context.
Conceptual modeling supports product development in the industry [24,25,26]. Models’ foremost benefits are communication, coordination, design assistance, and exploration [27]. Conceptual models are helpful in the early phase of product development when there is a high level of unknowns and uncertainties, as it is a way to communicate, understand, explore, and validate through iterations with stakeholders.
Numerous conceptual models with different purposes and capabilities are derived from various fields, such as systems engineering, computer science, business, management, and innovation. Examples of conceptual models are timelines, workflows, Systemigram [28], Gigamapping [29], Illustrative ConOps [30], and storytelling [4].

2.3. Data Sensemaking

There are various definitions for (big) data. Several authors and practitioners define big data related to the V’s. The term 3V’s of Big Data stands for Volume, Velocity, and Variety [31,32,33]. “Volume” refers to a vast amount of data. “Velocity” stands for the speed at which new data is generated. “Variety” represents different data types. Later, a fourth V was added, standing for Value [34,35,36]. “Value” accounts for how we can benefit from big data by turning it into value. Veracity has been suggested to be the fifth V of Big Data [37]. “Veracity” incorporates the bias and encompasses the data’s integrity level.
Apart from the five V-dimensions, the Method for an Integrated Knowledge Environment (MIKE2.0) project introduces a seemingly contradictory notion of big data: “Big Data can be very small, and not all large datasets are big” [38]. This definition attributes complexity and not size as the dominant factor. In other words, data or big data is a collection of both structured and unstructured data we collect and analyze to turn it into value. This value consists of transferring data into information and further into knowledge and wisdom. One framework that illustrates the importance of exploiting the data is the data, information, knowledge, and wisdom (DIKW) hierarchy [39,40].
A paper that investigated data usage in product development emphasized using internal and external data to aid decision-making during the early phase of product development [41]. Understanding the data, also called data sensemaking, is a conscious effort to understand data and the event(s) within its context [42]. Data sensemaking can be seen as a two-way process, aiming to fit data into a frame (mental model) and a frame around the data [43]. This process is iterative until data and frame unite to aid decision-makers in making more data-driven decisions by increasing the understanding of both data and context. Data sensemaking must be conducted during several iterations to improve the understanding and avoid oversimplifications [43]. Sensemaking is a process including different functions such as prediction (projecting the future), forming an explanation, seeing relationships or correlations (connecting the dots), anticipatory thinking, problem identification, and detection [42].
Research shows that the industry struggles with turning data and digitalization into value for product development [4,5,6,7,8]. Transforming data into information, knowledge, and wisdom, and vice versa, is a process illustrated in the DIKW pyramid in Figure 2. In the DIKW hierarchy, data transforms into information and further into knowledge and wisdom, and vice versa. Data Analytics focuses on the lifecycle between data gathered from the real-world environment, transforming it into information, and using it for knowledge [38,39,40]; see the dashed line in Figure 2. Conceptual modeling uses the information to develop knowledge, which triggers understanding and reasoning [20,21]; we use this understanding and reasoning as a foundation for wisdom (see the dotted lines in Figure 2). Wisdom is the tip of the hierarchy and supports the “when” and “why” of using data and further implementing it. In that sense, Data Analytics and Conceptual Modeling overlap and have synergy in the DIKW hierarchy and sensemaking.
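The data-to-information-to-knowledge transformation described above can be illustrated with a minimal sketch. All event names, dates, and thresholds below are hypothetical and invented for illustration; they are not drawn from the case study:

```python
# Hedged sketch of the DIKW transformation: raw records (data) are
# structured into counts (information) and interpreted against a
# hypothetical threshold (knowledge). All values are invented.

raw_events = [  # Data: raw failure log entries (date, subsystem)
    ("2022-01-03", "gate"), ("2022-01-05", "lift"),
    ("2022-01-09", "gate"), ("2022-01-12", "gate"),
]

# Information: structure the data by counting failures per subsystem.
failures_per_subsystem = {}
for _, subsystem in raw_events:
    failures_per_subsystem[subsystem] = failures_per_subsystem.get(subsystem, 0) + 1

# Knowledge: interpret the information against a (hypothetical) threshold
# to identify critical subsystems worth modeling further.
critical = [s for s, n in failures_per_subsystem.items() if n >= 3]

print(failures_per_subsystem)  # {'gate': 3, 'lift': 1}
print(critical)                # ['gate']
```

The "wisdom" layer, i.e., deciding when and why to act on such a finding, remains a human judgment supported by the conceptual models.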

3. Methods

Figure 3 shows the research methodology, which is tailored for socio-technical research projects in industry-academia collaboration [44]. The first step contains a literature review, expert input, and industry analysis. In the second step, we acquired knowledge to draft a version of the suggested framework with its process, methods, and tools. The third step is the testing of the framework in the industry through a case study. This third step includes embedded units of analysis, where we did numerous iterations from the problem domain to the solution domain, inspired by the Industry-as-Laboratory [45] research methodology. This was part of our planned adapted action research cycle: Design/Plan, Test/Act, Observe, and Analyze/Reflect. Our cases consist of socio-technical systems; thus, our research must adapt to both the technical and social aspects. The first four steps in each case study are an iteration of Design, Test, Observe, and Analyze, and cover the technical aspects. The Act and Reflect steps cover the social aspects. The final step is to update the framework based on the results from the case study. We iterated the complete process three times. The framework was implemented and tested by researchers, with the support of industry representatives.

4. Results

4.1. Frameworks for Data Sensemaking

Conceptual Modeling frameworks have been a well-researched topic [46]. However, there is no “one agreed way” of executing a conceptual modeling process. Such an aim might be fruitless, as engineers are integrated into many fields and application domains. Therefore, developing general conceptual frameworks based on domain, application, and activities might be more appropriate. One such area is model simplification with data sensemaking support in the product development of complex systems.
Developing strategies for assisting the conceptual modeling activity could be more straightforward if one concentrates on a particular domain. The investigation of model simplification techniques is a potentially fruitful sub-theme of conceptual modeling frameworks.

4.2. Criteria for Conceptual Data Sensemaking Framework

We have established criteria for utilizing (big) data and digitalization in different industries with complex socio-technical systems. We discovered these criteria through observations, interviews, and workshops with industry partners in the H-SEIF 2 consortium [6,8]. Additionally, the criteria are elicited from the literature and subject matter expert input from industry and academia. The seven criteria are:
  • Stepwise process
  • Iterative process
  • Top-down and Bottom-up friendly
  • Abstraction capabilities
  • Multi-view approach
  • Data-centric
  • Soft-aspect approach

4.2.1. Stepwise Process

By stepwise process, we mean that the framework supports the developer or practitioner with steps to follow. These steps should include a description of what needs to be done, how, and when. A stepwise process makes it easier and quicker for practitioners to integrate it into their daily job activities. Explaining the steps eases communication among the implementers, especially when guidance is available. Considering a project or organization that is a Very Small Entity (VSE), there is a need for low-cost solutions and readily usable processes supported with guides, templates, and tools, as well as standardized communication and guidance in selection and implementation [47].

4.2.2. Iterative Process

With an iterative process, we mean that the phases and steps in the framework shall have the option to conduct several iterations. Iterations make the framework flexible in terms of time. The number of iterations is context-dependent and is based on accepting the risk of moving to the next phase, i.e., beyond early phase product development. It tends to be associated with when the practitioners feel satisfied with their knowledge of the subject. Iterations increase the incremental understanding and learning process; thus, successive iterations give higher quality but cost additional time. Conducting an iterative process increases the understanding of the data and context and aids in avoiding oversimplification [43].

4.2.3. Top-Down & Bottom-Up

The top-down and bottom-up criterion means that the framework can be conducted in two ways. Top-down starts with the holistic need or opportunity. Bottom-up begins with accessible data and assets. These two approaches can be balanced to achieve the desired goal or value when utilizing (big) data in the early phase of product development. For instance, a top-down approach is suitable for projects with considerable uncertainties or new technology, as it gives a structured breakdown approach. For projects with legacy systems or mature technology, a bottom-up approach can be used, as one can utilize data, information, and knowledge from previous implementations and operations.

4.2.4. Abstraction Capabilities (Vertical Views)

This criterion means that the framework allows for jumping between different levels of abstraction. These levels of abstraction, also known as different world levels, can be the whole system as the SOI, the most critical subsystem as an SOI, or the entire organization as SOI. An example is Transportation System (high-level), Parking System, Gate for the parking system, and Components for the gate (low-level).

4.2.5. Multi-View Approach (Horizontal Views)

The multi-view approach means including the different perspectives horizontally to grasp an opportunity or value. It implies including the horizontal level of the subsystem(s), part(s), or stakeholder(s). For example, to capture the value of customer service, one must include the perspectives of the maintenance personnel, the end-user, the error reporting process, and the APS. Similarly, within a Transportation System we have horizontally connected systems such as trains, cars, buses, bikes, and pedestrians.

4.2.6. Data-Centric

This criterion means that the framework includes the data aspect, also known as the hard aspect. The hard aspects include analytical and quantitative procedures best suited for dealing with clearly defined situations, reliable data, and specific objectives [48]. This aspect is one of the most central criteria, as we aim to develop and test the framework by implementing it. The framework aims to integrate (big) data into the organization and its processes, focusing on the early phase of product development. The data aspect needs to guide and support the models, i.e., the conceptual models. Simultaneously, conceptual models support and navigate data and its analysis through sensemaking. Data in itself has no value. Data must be combined with models for communication and increased understanding, which brings value.

4.2.7. Soft Aspect Approach

By soft aspect approach criteria, we mean the human aspect. The soft approaches are more suited to tackle unstructured problems involving incomplete data, unclear goals, and open questions [48]. This criterion enhances the understanding, communication, and decision-making process as it complements the data or hard aspect. For instance, workshops and interviews aid in articulating the critical stakeholders’ tacit knowledge into data and visualization using systems engineering and systems thinking methodology. Systemigram, workflow, Gigamapping, or other conceptual models can utilize the soft aspect.

4.3. Sensemaking Frameworks from the Literature

Table 1 shows the most relevant frameworks based on the literature review and input from subject matter experts from our research in industry and academia. “X” means the framework covers the criterion. “/” indicates partial coverage, e.g., the criterion is only covered when reading through the theory behind the framework. See Section 4.2 for the criteria. This coverage was determined based on discussions and workshops among the authors of this article. Thus, we aim to develop a framework that covers all the needed criteria to utilize (big) data and digitalization in the early phase of product development for industries that develop complex cyber-physical systems.
Table 1 includes the following framework from the literature:
  • Data-frame theory of sensemaking. The Data-frame theory of sensemaking [42] presented a sensemaking process in a natural setting. Data frame theory describes the relationship between the data or signals of an event and the cognitive frame (mental models) or explanatory structure that considers the data and guides the search for more data. As shown in Table 1, the Data-frame theory of sensemaking lacks the coverage of the following criteria: Stepwise process, Multiview approach, and soft-aspect approach. The framework lacks a straightforward stepwise process to implement it. In addition, the framework does not include the different perspectives concerning its context. Further, the framework does not include humans through, for instance, interviews, workshops, and observation to articulate their tacit knowledge.
  • Quadruple Diamond. The Hybrid model originated from joint research between academia and larger enterprises, combining Big Data and Design Thinking through mixed teaming [49]. This combination brings increased efficiency and effectiveness in the innovation process, combines deeper customer insights with deep learning from data, and generates synergies between qualitative and quantitative methods. The Quadruple Diamond is an extension of the Hybrid Model through the addition of Systems Thinking. The approach suggests working in a mixed team with iterations between the three mindsets, design thinking, data analytics, and systems thinking, to understand the problem from all perspectives. The book gives four high-level examples of jumping between mindsets, such as sequential and mixed approaches. The authors highlight that only an experienced facilitator and team should perform the mixed quadruple diamond approach. The approach revolves around many iterations; therefore, its suitability for companies that cannot iterate quickly can be questioned. Moreover, the figure itself is linear and does not show the iterative element.
  • Cognitive processes of sensemaking. The framework that depicts the cognitive processes of sensemaking [50,51] is based on being data-driven and structure-driven. Data-driven is seen as the inductive bottom-up, and structure-driven is the logical top-down approach. The framework goal is to utilize the cognitive process to get a big-picture view for knowledge creation, organization, and sharing in the sensemaking process. The stepwise process as guidance to take practitioners through the framework is missing. Additionally, it does not emphasize the co-creation and validation aspect for development in complex systems, such as verification and validation from key stakeholders.
  • CAFCR. The CAFCR model [52] offers a top-level decomposition of an architecture. CAFCR stands for Customer Objectives, Application, Functional, Conceptual, and Realization. The “why from the customer” is provided by the “Customer Objectives” and “Application” views. The “Functional” view describes the “what of the product,” which includes the non-functional requirements. The “how of the product” is described in the “Conceptual” and “Realization” views. As a framework, CAFCR lacks an explicit process for integrating the hard aspects and for utilizing the data stored in the organization. Additionally, the framework lacks a stepwise process in its figure, although one is explained in the text.

4.4. The Framework Explained

In this section, we explain the proposed Conceptual Framework for Data Sensemaking step by step, see Figure 4. The framework consists of four phases. The first phase is to explore the business and customer value proposition and define the System of Interest (SOI). The second phase uses conceptual modeling to understand and explore the SOI and the context of the systems with key drivers and qualities. The third phase, data analytics, is the acquisition and analysis of information and data. The fourth phase is verifying and validating the sensemaking with key stakeholders.
We visualize our framework as a sequential process. Yet, we recommend performing the steps in an iterative and recursive process using time-boxing. The time boxes may vary from one hour to several days.

4.4.1. Phase 1—Value Proposition Investigation

The first phase includes the following goals: exploring the business value proposition, customer value proposition, and defining the system of interest. We recommend conducting brainstorming sessions, workshops, and interviews to achieve the first phase’s goals [53]. These workshops and interviews can include, among others, internal decision-makers, subject matter experts, marketing managers, systems architects, engineers, developers, customers, and end-users. These key stakeholders have the needed knowledge regarding documents such as contracts, specifications, and business strategy.

4.4.2. Phase 2—Conceptual Modeling

The second phase is applying conceptual modeling. Users of the framework bring the value proposition and the definition of the System of Interest boundaries into the modeling phase. Phase 2 starts with transforming the insight from Phase 1 into key drivers and qualities. Phase 2 continues with creating a collection of conceptual models. Several conceptual models can be developed in this phase following this methodology. Some examples of models are illustrative ConOps, Systemigram, value network, stakeholder relationships, use case, 2D map, black-box modeling, functional models, informational model, subsystem decomposition, and 3D sketch of the system internals. The conceptual models can be used for exploration and reasoning within the context. The type of models and how to use them depend highly on the context. However, we recommend that framework users doing the conceptual modeling (1) be iterative, (2) use time-boxing, (3) use feedback, and (4) conduct multi-view perspectives.
Phase 2 ends with a decision on whether the models have adequate credibility. If these conceptual models are verifiable, the framework user proceeds by bringing the findings into the validation and implementation phase. If the models are not verified, the user uses the models as insight for identifying suitable information sources. The information sources are carried over to the Data Analytics phase.

4.4.3. Phase 3—Data Analytics

The third phase starts with collecting and analyzing data needed to verify the conceptual models developed in Phase 2. This phase starts with collecting internal data, followed by data analysis. The data analysis includes the data pre-processing and visualization of data analysis results. After the data analysis, Phase 3 has a decision gate—this decision gate questions whether the framework users have sufficient information to verify the conceptual models.
If the answer from the decision gate is yes, then the framework user conducts the second iteration of Phase 2. The second iteration includes the newly gained information based on the data analysis. The second iteration of Phase 2 may also include the development of additional conceptual models based on the data analysis insight.
If the answer from the decision gate is no, then the framework user conducts a second iteration inside Phase 3. This iteration includes collecting more data and analysis from more internal data and new external data.
We recommend time-boxing for Phases 2 and 3 to avoid using too much time on exploration and perfection. Time-boxing and iteration between Phases 2 and 3 ensure that both phases complement each other iteratively and recursively. Collecting data in Phase 3 may be time-consuming, especially when collecting external data from a supplier. Thus, it is vital to start collecting and analyzing the available data first. The framework users may conduct the second iteration of Phase 2 despite not having collected or analyzed the external data. However, we recommend having a minimum of two data sources for the data analysis process, so that one source can verify or supplement the other.
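A minimal sketch of the Phase 3 pre-processing step is aligning two data sources on a shared key so that one source can verify or supplement the other. All field names and values below are hypothetical, introduced only to illustrate the idea:

```python
# Hedged sketch: merging an internal source (failure counts per day)
# with an external source (precipitation per day) on a shared key.
# All dates and values are invented for illustration.

internal_failures = {  # internal source: failures logged per day
    "2022-02-01": 2, "2022-02-02": 0, "2022-02-03": 3,
}
external_weather = {   # external source: precipitation (mm) per day
    "2022-02-01": 12.5, "2022-02-02": 0.0, "2022-02-03": 18.1,
}

# Keep only days present in both sources, so every record in the
# merged set is backed by two independent sources.
merged = [
    (day, internal_failures[day], external_weather[day])
    for day in sorted(internal_failures)
    if day in external_weather
]

for day, failures, rain_mm in merged:
    print(day, failures, rain_mm)
```

In practice, the internal source might arrive first, and the merged view would be extended once external data (e.g., from a supplier) becomes available in a later iteration.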

4.4.4. Phase 4—Validation and Implementation

The last phase includes validation and implementation. This phase starts after having enough confidence that the models are verified towards the key drivers and qualities. The first step is to validate the insight gained, models created, and data collection and analysis with key stakeholders. The key stakeholders are mentioned in Phase 1. The following step is to evaluate if the framework user has a sufficient understanding of the SOI and its context. If sufficient, integrate the findings and decisions into the ongoing product development. However, if the framework user does not understand sufficiently, initiate a new iteration. The new iteration starts from Phase 1 to update or ensure that the value proposition and SOI are properly defined.

4.5. Framework Tested in Case Study—Automatic Parking Systems

The case study is based on a Small and Medium-sized Enterprise (SME) that delivers Automatic Parking Systems (APS) to the Norwegian market. The semi-automated parking system is the System of Interest (SOI) used for testing the framework. A fully automated parking system is a storage space for vehicles that requires no parking attendant. In contrast, the semi-automated parking system needs a parking attendant to guide the car into the machine after it arrives at the parking installation. The first iteration of this case study was published in [10] and focused on Condition Based Maintenance (CBM) as the System of Interest (SOI). The finding from the first iteration was that CBM was not an urgent need, based on data analysis results. This paper presents the second iteration of applying the conceptual framework.

4.5.1. Phase 1—Understand the System Context

Step 1.1 & 1.2—Explore Value Proposition for Business and Customer: We investigate the business value proposition and customer value proposition and define the SOI. We used workshops, interviews, and observations with the Company’s key persons for this investigation. Those key persons included company management, the head of the maintenance department, maintenance personnel, and system developers.
Step 1.3—Define System of Interest: We conducted a participatory and direct observation of the maintenance process in this step. These observations aided in defining SOI. Additionally, we participated in weekly development meetings to explore the SOI. Figure 5 shows the value propositions.

4.5.2. Phase 2: Conceptual Modeling—First Iteration

Step 2.1—Define Key Drivers and Qualities: In this step, we use the output from Phase 1 (see Figure 5) as an input to develop the (business and customer) key driver model, as seen in Figure 6.
We developed the model based on workshops and interviews with the Company’s key persons. We also developed the qualities model (Figure 7), showing reliability and availability as key drivers.
Step 2.2 & 2.3—Create models & Explore Systems Context: In the first iteration, we created the following conceptual models: value network, workflow for the SOI, functional flow model, and 2D map:
Value network: We developed a value network model. The value network provides an understanding of the critical stakeholders within the development (creation) and operation (use) phases, as illustrated in Figure 8. Suppliers generate in-system data, and maintenance personnel generate failure data. Additionally, both phases identify environmental data, such as weather and traffic density data.
Workflow: Based on the observations and interviews with the Company’s key persons, we noticed that the gate is a key subsystem of the SOI that affects the SOI’s availability and reliability. The system freezes when the gate is open or not closed properly, due to safety standards (requirements). Thus, we developed a workflow model (see Figure 9) focusing on the gate to understand its function, including identifying who is responsible, the user or the system, for opening and closing the gate at car entry and retrieval.
Functional flow model: We developed a functional flow model for the maintenance task of fixing failures in general, including the gate. This model indicates the SOI’s downtime due to unscheduled maintenance tasks, as depicted in Figure 10.
2D map: Based on the observations, we developed a 2D map showing the SOI, its context, and its parts. The 2D map can be seen in Figure 11. This conceptual model facilitated the understanding and communication of the SOI context.
The above-mentioned conceptual models aid in exploring the context of the system, especially the 2D map and the workflow. Further, these models revealed that the feedback data regarding failures could indicate the SOI’s reliability and availability, including the gate as a critical subsystem. Additionally, we believe that external data can aid in discovering hidden patterns and trends to understand the factors that affect the SOI’s reliability and availability. In other words, we can make sense of the internal data analysis by using the weather data to form an explanation and discover correlations between failure events and environmental factors. Using internal and external data provides two data sources that verify and supplement each other.
Step 2.4—Are models verified? After developing the first iteration of some conceptual models, we needed to verify the results using data. We still lacked information and knowledge about the SOI, as the models did not yet provide sufficient credibility and insight.
Step 2.5—Identify information sources: The value network facilitated identifying available data and information sources, based on understanding the critical stakeholders in the creation and use phases. The identified data sources include internal and external data. For this study, we identified failure data as internal feedback data and weather data as external data for investigating the effect of environmental factors on failures. We also identified other data and information sources, for instance, in-system and sensor data as internal data and traffic density data as external data. We considered the collected data, including failure and weather data, as (big) data due to its complexity and size [38]. The failure data is complex because it is in a free-text format, manually logged by various maintenance personnel over six years. Additionally, the weather data increases the size of the collected data.

4.5.3. Phase 3: Data Analytics

Step 3.1—Collect internal data: We collected failure data (internal feedback data) over six years (2016–2021). The Company had stored the failure data in an Excel file with 36 sheets. Each sheet belonged to one parking system and included the following parameters: date (of a maintenance event), time, telephone number (of the maintenance person who investigated the failure event), tag number (of the user, i.e., the car owner), car number, place number (the parking lot where the failure event occurred), reason (possible reasons for the failure event), conducted by (initials of the maintenance person who handled the failure or maintenance event), and finally invoiced yes/no (whether the failure event was invoiced because it was not covered by the maintenance agreement with the Company). The failure data had been manually logged and included descriptions of the failure events.
Step 3.2—Analyze internal data: We pre-processed the data prior to the data analysis. As part of pre-processing, we generated a template that merged all the Excel sheets, resulting in approximately 7000 rows × 9 columns. Pre-processing also included handling missing data entries and unifying data formats. This time-consuming process took approximately 80% of the total data analysis period. Nevertheless, manual pre-processing was necessary to increase the data quality and succeed with the data analysis.
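The cleaning steps above, unifying date formats and handling missing entries, can be sketched as follows. This is a minimal illustration rather than the actual pipeline used in the study; the field names, date formats, and values are hypothetical.

```python
from datetime import datetime

# Hypothetical raw rows, mimicking the issues described in the text:
# mixed date formats and missing entries across the merged sheets.
raw_rows = [
    {"date": "03.01.2021", "reason": "gate stuck", "conducted_by": "AB"},
    {"date": "2021-01-04", "reason": "", "conducted_by": None},
]

def normalize(row):
    """Unify the date format to ISO 8601 and mark missing entries."""
    for fmt in ("%d.%m.%Y", "%Y-%m-%d"):
        try:
            row["date"] = datetime.strptime(row["date"], fmt).date().isoformat()
            break
        except ValueError:
            continue
    # Replace empty or missing fields with an explicit marker.
    row["reason"] = row["reason"] or "UNKNOWN"
    row["conducted_by"] = row["conducted_by"] or "UNKNOWN"
    return row

clean_rows = [normalize(dict(r)) for r in raw_rows]
```

Making missing values explicit, rather than silently dropping rows, preserves the row count so that later frequency analyses are not biased by the cleaning step.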
The failure data contained a free-text field manually logged by the maintenance personnel. Thus, we analyzed the data using natural language processing (NLP), a well-known method for analyzing free-text entry fields [54]. The data analysis included determining the frequency of the most repeated words in the failure descriptions, calculating failure rates and Mean Time Between Failures (MTBF), and calculating the frequency of failures by hour of the day, day, and month of the year.
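As a sketch of the two core computations, word frequency over the free-text field and MTBF from event timestamps, consider the following. The log entries are invented for illustration; the actual analysis used the full six-year data set.

```python
from collections import Counter
from datetime import datetime

# Invented failure-log entries with a free-text description field.
failures = [
    {"date": "2021-01-03 08:15", "reason": "gate stuck after entry"},
    {"date": "2021-01-03 19:40", "reason": "gate did not close"},
    {"date": "2021-01-04 07:05", "reason": "platform motor overload"},
]

# Word frequency over the description field (a crude stand-in for NLP).
word_counts = Counter(
    word for f in failures for word in f["reason"].lower().split()
)

# MTBF: mean gap, in hours, between consecutive failure timestamps.
times = sorted(datetime.strptime(f["date"], "%Y-%m-%d %H:%M") for f in failures)
gaps_h = [(b - a).total_seconds() / 3600 for a, b in zip(times, times[1:])]
mtbf_h = sum(gaps_h) / len(gaps_h)
```

In this toy sample, "gate" dominates the word counts, mirroring the finding reported below that the gate constituted roughly half the words in the real data.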
Finally, we visualized the results of the data analysis. Figure 12 and Figure 13 show the most significant analysis results. The first bar chart (a) in Figure 12 shows the frequency, in percent, of the most repetitive words across all description field entries in the failure data for all parking systems. We manually looked through the failure events and discovered that the maintenance personnel used a word such as system to describe the gate or a segment that includes two or three gates. The gate generally constituted approximately 50% of the words.
The second bar chart (b) visualizes the MTBF for all parking systems in the years 2016–2021. The estimated average MTBF is 11 h. With more than two failures per day on average, the Company wants to improve the system’s reliability.
Step 3.3—Sufficient information?: In this step, we have a decision gate regarding whether we have sufficient information to verify the conceptual models developed in Phase 2. During a manual classification of the failure data, we observed that environmental factors such as humidity were mentioned among the failures. Thus, we decided to collect and analyze weather data to investigate the effect of environmental factors on failures.
Step 3.4—Collect external data: We collected environmental data, mainly weather data, to investigate any correlation between environmental factors and failure events. We collected weather data for the same period as the failure data (2016–2021) for the parking system locations, which are spread across eight different cities. The weather data includes, among others, the following parameters: location (city), DateTime, temperature, precipitation, snow, windspeed, visibility, sunrise, sunset, humidity, condition, and condition description. We added those parameters to the template mentioned above, which ended up being approximately 7000 rows × 20 columns.
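Adding the weather parameters to the failure template amounts to a lookup join keyed on location and date. A minimal sketch, with invented cities and values:

```python
# Invented failure events and weather observations, keyed on (city, date).
failures = [
    {"city": "Oslo", "date": "2021-01-03", "reason": "gate stuck"},
    {"city": "Bergen", "date": "2021-01-04", "reason": "platform stop"},
]
weather = {
    ("Oslo", "2021-01-03"): {"temperature": -2.0, "humidity": 92},
    ("Bergen", "2021-01-04"): {"temperature": 4.5, "humidity": 80},
}

# Left join: each failure row gains the matching weather columns, if any.
enriched = [{**f, **weather.get((f["city"], f["date"]), {})} for f in failures]
```

Using a left join keeps failure events without a matching weather record in the data set, so the internal analysis remains complete even where the external source has gaps.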
Step 3.5—Analyze external data: Figure 13a–c shows the most significant results of the weather data analysis:
The graph in (a) portrays the temperature distribution for the failure events and indicates that most failures occurred within −5 to 5 degrees Celsius. However, we also found that the temperature distribution is within the same range for those locations (cities). In other words, this temperature distribution is location biased.
The second graph (b) displays a positive correlation between humidity and failures. In other words, high values of humidity result in more failures.
The final graph (c) shows a negative correlation between precipitation (snow and rain) and failures. In other words, more rain results in fewer failures.
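The correlations in (b) and (c) can be quantified with the Pearson correlation coefficient. The sketch below uses invented daily aggregates; a coefficient near +1 corresponds to the humidity finding, and one near −1 to the precipitation finding.

```python
from math import sqrt

def pearson(xs, ys):
    """Sample Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented daily aggregates: mean humidity (%) and failure counts.
humidity = [60, 70, 80, 90, 95]
failures_per_day = [1, 2, 3, 5, 6]
r_humidity = pearson(humidity, failures_per_day)  # strongly positive

# Invented precipitation (mm) against failure counts.
precipitation = [0, 2, 5, 8, 10]
failures_precip = [6, 5, 3, 2, 1]
r_precip = pearson(precipitation, failures_precip)  # strongly negative
```

Note that, as the discussion of salting below illustrates, a strong correlation alone does not establish a causal mechanism; the conceptual models and stakeholder feedback supply that interpretation.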
After conducting the internal and external data analysis, we presented the results to the Company’s key persons. The feedback indicated that the results make sense. The head of the maintenance department mentioned that the salting of roads in Norway could be one of the reasons explaining the weather data analysis. Therefore, we investigated the authorities’ requirements for salting roads in Norwegian cities. We found that the authorities salt the roads in the following temperature ranges: between −5 and 5 Celsius, above −3 Celsius, −3 to −6 Celsius, −6 to −12 Celsius, and under −12 Celsius only in special circumstances [55]. These requirements helped explain the relationship between environmental factors and failure events. The data analysis showed a negative correlation between precipitation and failures because precipitation washes the salt away. Thus, we concluded that salt is an environmental factor affecting failures.

4.5.4. Conceptual Modeling—Second Iteration

Step 2.2—Create Models (Second iteration): We conducted a second iteration on the conceptual models developed in the first iteration of Step 2.2 (Section 4.5.2) and enhanced them based on new information from the data analysis. The new models included a black box model, a workflow model for utilizing data, and a 3D sketch showing a suggested realization of the SOI based on the data analysis results. In the following, we briefly describe these conceptual models:
Black Box: We developed a black box model (see Figure 14). The black box has the primary function, minimizing the system’s failures, in the center. On the left is the input to achieve this function, and on the right is the output. Bidirectional arrows sit on the upper and lower sides of the black box: the upper arrow indicates the interfaces, SOI, and environment, while the lower one illustrates the technology, design, and parameters to measure to achieve the primary function. The black box elicits the input, output, and constraints related to minimizing the system’s failures, which increases the system’s reliability. The system’s reliability is one of the main key drivers, as we observe from the key driver model. The overview of incoming and outgoing links is input to other conceptual models that cover, for instance, the conceptual view of the SOI.
Workflow model to utilize data: We created a functional workflow model (see Figure 15) showing the steps to produce the input for the black box, whose primary function is to minimize the system’s failures, as described above. This workflow shows the steps of collecting and analyzing internal and external data; in this context, the internal data is failure data, and the external data is weather data for investigating environmental factors concerning failures. Further, the model depicts the process of implementing the data analysis results.
3D sketch of the SOI: We show a 3D sketch of the SOI, visualizing a suggestion for the SOI based on the data analysis results (see Figure 16). The sketch depicts a car brush, placed before the gate, that cleans the car wheels. This is one of the concepts for washing the salt off the wheels to reduce the failure event frequency. Additionally, the Company substituted the gate’s metal component with a plastic one to avoid corrosion, decreasing the gate’s failure events; we include this change in the 3D sketch. We emphasize that the analysis of internal and external data guided us to the finding that the road salt applied by local authorities within a specific temperature range correlates with the gate’s failure events. The 3D sketch aided in visualizing the concepts within the solution domain. This visualization assists in validating that the concepts, including the brush concept and substituting the material of the gate’s relay from metal to plastic, can be adapted within the SOI.
Additionally, we updated the models developed in the first iteration of Phase 2. For instance, we modified the functional flow model to show the maintenance process in the Company, focusing on the gate as a use case; we show only the last version (Figure 10: functional flow model). The failure data analysis shows that the gate is the most repetitive word, telling us that the gate is the most critical subsystem within the SOI. This conclusion concurs with our initial hypothesis, which the data supported, allowing us to move beyond gut feeling and adopt a more data-driven decision-making methodology.
Further, we updated the 2D map to show where the gate is within the SOI and its context (Figure 11: 2D map). These two models aided in understanding the gate as a subsystem within its context. Additionally, they give an understanding of the duration of a maintenance task in general and for the gate in particular. We can translate this duration into maintenance costs.
Step 2.4—Are models verified? (Second iteration): With the information from the data and the knowledge from the models, we felt in the second internal iteration that we had sufficient insight to bring to the key stakeholders.

4.5.5. Validation and Integration

Step 4.1—Validation of insight: In Phase 4, we started validating the findings with the Company’s key persons. The results were presented and discussed during workshops and interviews with in-house company decision-makers, maintenance personnel, subject matter experts within the industry’s product development, and experts on the architecture of complex systems from academia.
Step 4.2 & 4.3—Sufficient understanding and Integration:
The final stage was integrating the newly acquired knowledge into the Company’s knowledge base and providing suggestions to their product development team. This feedback resulted in an update of the Failure Modes and Effects Analysis (FMEA). For instance, in future parking systems, the failing component has been changed from metal to plastic as a design change to avoid corrosion. Through a design verification step, this change has been proven to be more resistant to the environmental factors, i.e., salt, humidity, and temperature. Furthermore, we recommended continuous health monitoring of the most critical subsystem, i.e., the gate, through suggested sensors such as a microphone sensor. These changes (design and sensor) can then be applied to the second most critical subsystem, e.g., the platform. Moreover, the Company can adopt a bottom-up sensor strategy, installing suitable sensors on the most critical subsystem first and then on the next one, working towards CBM and further developing predictive maintenance through a digital twin.

5. Discussion

5.1. The Framework, According to the Criteria

This research paper explores how to use a conceptual framework for data sensemaking to enhance the product development of complex systems. We explored seven properties for our Framework and tested them in a case study. These properties are discussed below.

5.1.1. Top-Down and Bottom-Up Friendly

The Framework can be conducted in both top-down and bottom-up manners. The bottom-up manner starts with collecting and analyzing the data, while the top-down manner starts with interviews, workshops, and observations with the Company’s key persons and customers. Both should identify the business and customer value propositions and aid in understanding the context. However, we cannot exclude one approach in favor of the other: combining the two is essential to achieve value for the industry. This combination requires a balanced way of thinking, and we manage this balance through iterations.

5.1.2. Iterative Process

The Framework was conducted in an iterative and recursive process. The iterations end when the team accepts the risk of moving to the next product development phase. Each iteration serves as input for the successive ones. One of our concerns is the number of people that should be included and the time available to develop the conceptual models and conduct the data analysis. We used time-boxing for these iterations: the first iteration had a time-box of several hours, while we assigned several days to the subsequent iterations. Conducting iterations using time-boxing proved valuable in increasing knowledge and understanding through early feedback, which we used to enhance the implementation of the steps within the framework’s phases.

5.1.3. Stepwise Process

We developed the Framework with the industry in mind. Therefore, the framework is laid out as a stepwise workflow, divided into phases, each based on a different type of expertise. The Conceptual Modeling phase revolves around reasoning, while Data Analytics focuses on collecting and presenting internal and external data to support the models. At the same time, conceptual modeling guides the data analysis by, for instance, identifying available and significant data sources. The steps in the framework make it easier for multiple stakeholders to understand what phase a specific developer or researcher is in and what will come next. Each phase and stage has the potential to be implemented in, or converted into, internal product development processes.

5.1.4. Multi-View Approach

Complex systems tend to have dynamic behavior due to interactions between various systems and stakeholders. Thus, this Framework emphasizes the activity of exploration to gain insight. The use of conceptual models makes it possible to explore the System of Interest’s context, while the data analysis provides verification and discoveries. Such insight gives the framework user various points of view across stakeholders and systems.

5.1.5. Abstraction Capabilities

The framework, including its steps and phases, can be conducted at different levels of abstraction. The data analysis and conceptual modeling can take a critical subsystem as the SOI, the whole system as the SOI, or a System of Systems, such as CBM, as the SOI. The framework is conducted through several iterations, and across these iterations we can use several abstraction levels, depending on the Company’s context and key drivers. The different abstraction levels are valuable for zooming in and out in the Company’s early product development phase. Zooming in and out also aids in understanding specific design parameters, forming explanations and correlations for their root causes in the early stages, and connecting them to the whole product development process. Additionally, different levels of abstraction aid in including the different aspects within the product development process, especially in the early stages, such as soft aspects and hard aspects (i.e., data).

5.1.6. Soft Aspect and Data-Centric—Tacit and Explicit Knowledge

We moved between hard and soft aspects to cover the socio-technical nature of the system. The soft aspects include the tacit knowledge of the subject matter experts, the Company’s customers, and other key internal stakeholders. Covering this soft aspect included interviews and workshops with subject matter experts and visualizing their output. We document the data analysis results using Systems Engineering and Systems Thinking methodology and their tools, such as A3 Architecture Overviews, Systemigrams, workflows, stakeholder analysis, risk analysis, and more. In other words, we articulate tacit knowledge as explicit knowledge through documentation and models, which lets us move from the soft to the hard aspect. The documentation and models include written text and visualized models.
Additionally, we integrate the hard aspects into the framework by identifying, collecting, and analyzing the appropriate data. We also moved from explicit knowledge back to tacit knowledge through verification and validation with stakeholders. These two aspects are the foundation of our framework.

5.2. Achieving Sensemaking

Sensemaking includes different functions, such as prediction (projecting the future), forming an explanation, and seeing relationships or correlations (connecting the dots). The combination of data analysis and conceptual modeling in an iterative process aids in understanding the data and its context, which is what we call sensemaking. This understanding aids in exploring the system’s reliability and the factors affecting it, and in investigating the correlations among these factors, which include environmental, human–machine interface, mechanical, and software issues.
In the presented case study of an Automatic Parking System (APS), we developed over 15 different conceptual models, showing the ten most significant in Section 4.5. Due to the Company’s confidentiality, some models, such as the key driver model, are shown only as extracts. The models shown are examples of implementing our framework, but their core message is the same. The number of conceptual models is a function of several factors, such as the expertise and experience of the people involved in developing the models, the context of the system of interest, the data available, and more. We recommend conducting the first iteration through brainstorming sessions, workshops, and interviews with the system’s developers, system architects, and customers. Kjørstad presents a collection of types and ways to conduct such sessions, interviews, and workshops [53]. In developing the conceptual models and the data analytics, companies have limited time and allocated effort for this type of work. Therefore, we focused on keeping the conceptual framework for data sensemaking in a superlight architectural style. We used different views during conceptual modeling based on various levels of abstraction and the multi-view approach. Even though we have one critical subsystem, i.e., the gate, that we control and engineer, we need to evaluate the whole system of interest (here, the APS) and its environment (the world around the APS). This results in jumping between the various systems that influence, and are influenced by, the SOI. In other words, we alternated the focus between a subsystem, such as the gate, and the APS across the different views. We see that jumping between abstraction levels and having a multi-view approach is essential to achieve acceptable sensemaking.

5.3. Framework Implementation and Concerns

One concern is that the Company may need to generate a significant number of conceptual models and data analysis methods in the first iterations before focusing on the most meaningful models and analysis methods, those that connect to the most significant key drivers. In other words, the framework may need quantity before quality, and achieving quantity takes time. However, we believe this agile approach can achieve valid results quicker than comprehensive simulations, which have high fidelity but are usually time-consuming and costly and may not include the most significant aspects of the business and customer value propositions.
Data quality is essential in this framework, as we integrate data with conceptual modeling. Concerns include data availability, shareability, and legacy data. For instance, collecting the data took considerable time. We also aimed to use two internal data sources, i.e., in-system (sensor) data and failure data, to test our framework. Unfortunately, we could only collect failure data, as it was not technically possible to get in-system data for the same period as the failure data. Having two internal data sources could have increased the accuracy and validity of the data analysis results and could further develop our framework.
Further, we consumed 80% of the data analysis period (one year) pre-processing the data. This pre-processing included cleaning the data, such as unifying formats and dealing with empty data entries, as well as understanding the data and its metadata. We conducted workshops, observations, and interviews with the Company and subject matter experts for this purpose. In parallel, we conducted several iterations (nine) to develop our Framework. Some of these iterations resulted from the data analysis results and feedback from subject matter experts in academia and industry.

5.4. Research Limitations and Future Studies

Research in a product development environment is complex, with several factors that can influence the outcome. All organizations and people are different and interact in various ways, the products are highly contextual, and the environment around them has different influential forces. We acknowledge that the soft aspect of human relations is a limitation. We also note the need for longitudinal research over multiple case studies. However, one of the industry partners has conducted the first iterations using this framework, and the results are promising.
We have yet to test the framework in the hands of employees in various companies. This framework is mainly for engineers who develop and design products. Thus, it needs to be implemented and studied while engineers use the conceptual framework of data sensemaking.

6. Conclusions

The paper’s main contribution is a data sensemaking framework developed from industry needs and literature and tested on a real industry problem case. The Framework aids practitioners in using conceptual modeling and data analysis iteratively and recursively to enhance the product development process. We verified and validated the Framework on a real industry problem case.
The industry sees value in the use of data in product development. However, companies struggle to utilize the data. The industry needs to integrate a (big) data approach and digitalization into its organizations for data sensemaking. We explored state-of-the-art data sensemaking frameworks that can be used in the product development process and elicited seven criteria for a data sensemaking framework from the literature and industry partners. These criteria are a stepwise process, an iterative process, top-down and bottom-up approaches, abstraction capabilities, a multi-view approach, data-centricity, and a soft-aspect approach. The soft (human) and data-centric criteria mean moving between tacit and explicit knowledge. The most relevant frameworks did not fulfill all the set criteria, which resulted in the development of the conceptual framework presented in this study.
The new conceptual framework for data sensemaking consists of four phases. The first phase involves understanding the context through interviews and workshops with various stakeholders. The outputs are key drivers and targeted qualities. The framework user can apply suitable conceptual models in the second phase based on the drivers and qualities from the business and customers’ perspectives. The models function as a platform for reasoning and communication around the System of Interest. The framework user identifies, collects, and analyzes data supporting the conceptual models in the third phase. The data analysis integrates two data sources, i.e., internal and external. Based on the data analysis, the framework recommends additional iterations with the conceptual models in the second phase. Phases two and three guide each other iteratively and recursively. Data analysis can trigger the development of newly needed conceptual models. The fourth and final phase is validation and implementation. The framework suggests validating the findings from previous phases with key stakeholders. Additionally, this phase proposes integrating the newly acquired knowledge, which can be utilized in ongoing development or future products.
We tested the framework in an industry case study. The Company’s key persons validated the results. The feedback is that the framework is valuable, especially in integrating feedback data into the product development process.
For the framework user, the primary concerns can be data availability and quality. Data quality and availability are essential in this framework as we integrate data with conceptual modeling. Additionally, defining and collecting the appropriate data and conceptual modeling can be challenging as it is context-dependent and requires expertise within the two domains, i.e., data analysis and conceptual modeling from industry and academia.
In future research, we are applying the Framework to similar case studies in other companies and types of technical systems in Norwegian high-tech industry environments. The research will be collected and analyzed individually and as a cross-study.

Author Contributions

Conceptualization, H.B.A., T.L.; methodology, H.B.A., T.L.; software, H.B.A.; validation, H.B.A., T.L.; formal analysis, H.B.A., T.L.; investigation, H.B.A., T.L.; resources, H.B.A., T.L.; data curation, H.B.A., T.L.; writing—original draft preparation, H.B.A., T.L.; writing—review and editing, H.B.A., T.L., K.F.; visualization, H.B.A., T.L.; supervision, K.F.; project administration, K.F.; funding acquisition, K.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Norwegian Research Council, grant number 317862.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data is not available due to confidentiality and privacy.

Acknowledgments

The authors are grateful to the company’s people who have participated in this research.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

Figure 1. Bridging Data, Information, Knowledge, and Product development.
Figure 2. DIKW Pyramid with Conceptual Modeling and Data Analytics.
Figure 3. Research methodology for a case study with an embedded unit of analysis.
Figure 4. Data Sensemaking Framework.
Figure 5. Value Proposition for Business and Customer.
Figure 6. Business and customer Key Drivers.
Figure 7. Qualities model.
Figure 8. Value network.
Figure 9. Workflow for car entry (left) and car retrieval (right).
Figure 10. Functional flow model.
Figure 11. 2D map.
Figure 12. Internal data include (a) frequency of most repetitive words (b) MTBF per year.
Figure 13. Analysis of external data including (a) failure events versus temperature (b) humidity versus city (c) precipitation versus city.
Figure 14. Black Box.
Figure 15. Workflow model to utilize data.
Figure 16. 3D Sketch of the SOI.
Table 1. Frameworks for Data Sensemaking in Product Development.

Criteria / Framework                   Data-Frame Theory    Quadruple    Cognitive Processes    CAFCR
                                       of Sensemaking       Diamond      of Sensemaking
Stepwise process / Iterative process   X / X                X
Top-down / Bottom-up                   X                    X            X                      X
Abstraction capabilities               X                    X            X                      X
Multiview approach                                          X                                   X
Data-Centric                           X                    X            X
Soft aspect approach                                        X                                   X
Langen, T.; Ali, H.B.; Falk, K. A Conceptual Framework for Data Sensemaking in Product Development—A Case Study. Technologies 2023, 11, 4. https://doi.org/10.3390/technologies11010004