Article

Misfit? The Use of Metrics in Innovation

by
Ilse Svensson de Jong
Innovation Engineering, Department of Design Sciences, Lund University, 223 62 Lund, Sweden
J. Risk Financial Manag. 2021, 14(8), 388; https://doi.org/10.3390/jrfm14080388
Submission received: 15 July 2021 / Revised: 5 August 2021 / Accepted: 13 August 2021 / Published: 19 August 2021
(This article belongs to the Collection Business Performance)

Abstract
Measuring innovation is a challenging but essential task to improve business performance. To tackle this task, key performance indicators (KPIs) can be used to measure and monitor innovation. The objective of this study is to explore how KPIs, designed for measuring innovation, are used in practice. To achieve this objective, the author draws upon literature on business performance in accounting and innovation, yet moves away from the functional view. Instead, the author focuses explicitly on how organizational members, through their use of KPIs in innovation, make sense of conflicting interpretations and integrate them into their practices. A qualitative in-depth case study was conducted at the innovation department of an organization in the process industry that operates production sites and sales organizations worldwide. In total, 28 interviews and complementary observations were undertaken at several organizational levels (multi-level). The empirical evidence suggests that strategic change, attributed to commoditization, affects the predetermined KPIs in use. Notably, these KPIs in innovation are used, despite their poor fit to innovation subject to commoditization. From a relational perspective, this study indicates that in innovation, KPIs are usually complemented by or supplemented with other information, as stand-alone KPIs exhibit a significant degree of incompleteness. In contrast to conventional studies in innovation and management accounting, this study explores the use of key performance indicators (KPIs) in innovation from an interpretative perspective. This perspective advances our understanding of the actual use of KPIs and uncovers the complexity of accounting and innovation, which involve numerous angles and organizational levels. Practically, the findings of this study will inform managers in innovation about the use of KPIs in innovation and the challenges individual organizational members face when using them. In innovation, KPIs appear to be subjective and used in unintended ways. Thus, understanding how KPIs are used in innovation is a game of reading between the lines, and these KPIs can be regarded as misfits.

1. Introduction

“All too apparently accounting is a phenomenon which is what it isn’t and can become what it wasn’t!”
Creating innovation and capturing value from it have changed considerably over the past two decades (Haefliger and Poetz 2016). An increasing focus on innovation is one of the most prevalent changes in the current business environment (Melnyk et al. 2014). Innovation, new business models, and value-driven competition are all considered to be of high importance both now and in the future, and their impact on management control systems is considered to be significant (Barros and da Costa 2019; Chenhall and Moers 2015; Feeney and Pierce 2018).
Each specific industry context poses different demands on how innovation is measured, managed, and controlled (Bromwich and Scapens 2016; Messner 2016). The processing industry, for example, which is the case context of this study, is subject to commoditization. Commoditization is defined as a unique phenomenon of evolving market competition characterized by increasing homogeneity of products, higher price sensitivity among customers, lower switching costs, and greater industry stability (Matthyssens and Vandenbempt 2008; Reimann et al. 2010). Commoditization increases the pressure to innovate while simultaneously requiring firms to secure, with a long-term perspective, a price-competitive and high-quality raw material supply base and to develop such raw materials further (Lager 2016; Reimann et al. 2010). Commoditization, as a strategic change, affects the control systems that help managers accomplish innovation (Chenhall and Moers 2015).
The literature on innovation and accounting presents several research streams (Barros and da Costa 2019; Feeney and Pierce 2018; Major et al. 2018). One strand of research in management accounting and control (MAC) has a functionalist perspective on innovation and accounting (Davila and Oyon 2009; Moll 2015). This stream of MAC focuses on KPIs having the right “fit” (Melnyk et al. 2014), rather than what is actually used in practice (Bourne et al. 2018; Richtnér et al. 2017). Innovation, however, requires KPIs to continually adjust to complex situations in various contextual settings (Fagerlin and Lövstål 2020; Okwir et al. 2018). Thus, to advance our understanding of the actual use of KPIs, this study used a qualitative method to uncover the complexity of accounting and innovation, which involve numerous angles and organizational levels (Barros and da Costa 2019; Fagerlin and Lövstål 2020).
Despite the increased recognition in the literature of the relevance of accounting in innovation (Feeney and Pierce 2018; Major et al. 2018), studies of performance indicators (such as KPIs) in innovation show inconsistencies, emphasizing that understanding KPIs in innovation is far from complete (Fried et al. 2017; Moll 2015). The objective of this study was, therefore, to explore how KPIs, designed for measuring innovation, are used in practice. In this paper, the researcher draws upon literature on accounting and innovation, yet moves away from the functional view on the design of KPIs in innovation (Laine et al. 2016; Major et al. 2018). Instead, the author focuses explicitly on how organizational members, through their use of KPIs in innovation, make sense of conflicting interpretations and integrate them into their practices (Carlsson-Wall et al. 2016; Laine et al. 2016).
This study contributes to the literature in several ways. Firstly, it adds to the body of knowledge on accounting in innovation. Many earlier studies on accounting in innovation gave special (and sometimes almost exclusive) attention to accounting in research and development (R&D) and new product development (NPD) (Abernethy and Brownell 1997; Feeney and Pierce 2018; Moll 2015; Rockness and Shields 1984). Today, technologies, organizational structures, and enhanced awareness of the business context are acknowledged as important elements in achieving innovation (Chenhall and Moers 2015; Major et al. 2018). Overall, in MAC research, there is still a weak understanding of how management accounting and control practice has evolved in response to this shift in innovation, from a departmental duty to an organization-wide effort (Davila et al. 2009; Major et al. 2018). In response to this shift, this study extends the current body of knowledge by proposing the broader view that innovation encompasses the chain of transforming ideas into value. This view of innovation influences the way innovation is perceived and measured in practice, as the findings of this study show.
Secondly, this study focuses on the actual use of KPIs. Drawing upon previous accounting studies (Ahrens and Chapman 2007; Busco and Quattrone 2017; Mouritsen et al. 2009; Mouritsen and Kreiner 2016), this paper studies, analyzes, and interprets accounting in the contexts in which it operates (Hopwood 1972, 1974, 1983; Laine et al. 2016). For this purpose, following previous accounting studies (Ahrens and Chapman 2007; Busco and Quattrone 2017; Laine et al. 2016), this study draws upon a more interpretative and relational perspective on accounting. This interpretative and relational perspective contrasts with the traditionally functionalist perspective on innovation and accounting (Davila and Oyon 2009; Fried 2017; Moll 2015). Earlier studies in innovation and accounting used contingency theory to explain a fit or misfit (Burkert et al. 2014; Fried 2017; Gerdin and Greve 2004; Otley 2016). Inspired by Carlsson-Wall et al. (2016) and Laine et al. (2016), this study shifts its analytical lens from how KPIs are designed using contingency theory to how KPIs are understood and interpreted at several levels of the organization. This shift allows the researcher to enhance the understanding of the role KPIs play in innovation and, more specifically, how KPIs may shape the way organizational actors make sense of competing interpretations, and prioritize and integrate them into their organizational practices (Laine et al. 2016).
Another contribution of this study is uncovering the consequences of using KPIs in innovation that are incomplete accounting representations, and what is done with them in practice. The KPIs in this study appear to be a misfit with the underlying context of innovation. Changes in the internal and external operating environment of the firm require KPIs in innovation to be adapted, which, in practice, does not appear to be the case. This results in KPIs in innovation that are used supplemented with or complemented by other information, following the reasoning that KPIs should be seen as packages and not as standalone metrics (Grabner and Moers 2013). The official KPIs in this study thus appear to be interpreted and used as incomplete accounting representations (Busco and Quattrone 2017; Jordan and Messner 2012; Wouters and Wilderom 2008). In line with other MAC scholars, this study contributes to the debate regarding the incompleteness of accounting representations.

2. Theory Background

2.1. Defining Innovation

Innovation has been studied in a variety of ways in accounting (Barros and da Costa 2019; Major et al. 2018). In earlier years, the focus of this research was on a more traditional departmental view of innovation, R&D (research and development), and NPD (new product development) (Abernethy and Brownell 1997; Davila 2000; Jørgensen and Messner 2009, 2010; Rockness and Shields 1984). In recent years, innovation has been studied from a broader view, encompassing innovation from idea generation to execution and value capture (Davila et al. 2012). As a result, the accounting literature lacks an agreed definition of innovation, which creates confusion and limits the potential for researchers to compare different studies in this field (Franco-Santos et al. 2012).
Considering the research scope of this paper and to overcome this limitation, this paper adopts a broader view of innovation: innovation includes the chain of transforming ideas into value (Davila et al. 2009). This is a synthesized and inclusive view of innovation that includes the adoption of any new product, process, or administrative innovation. Both the research streams of new product development (NPD) and research and development (R&D) are included in this definition, reflecting the overall tendency towards an organizational view of innovation rather than the traditional departmental view, or a view of innovation as merely a part of commercialization or engineering (Adams et al. 2006).

2.2. Defining KPIs—Key Performance Indicators

Scholars from various functional disciplines have examined a wide range of issues related to the design, implementation, use, and review of KPIs (Goshu and Kitaw 2017; Ittner and Larcker 2003; Micheli and Mari 2014; Neely 2005). However, innovation KPIs particularly suitable for the strategic management of innovation are still largely absent from MAC research (Fried 2017; Keupp et al. 2012). Scholars have offered several explanations. One is that MAC research on KPIs has been closely tied to a positivistic epistemology, in which the emphasis has been on creating rational early-warning control systems based on leading indicators (Bititci et al. 2012). Another is the complexity of the underlying process, which makes it difficult to assess innovation quantitatively (Haldma et al. 2012). As a further explanation, Jørgensen and Messner (2010) mention a worrying lack of knowledge of what it actually means to account for and control innovation.
Due to the lack of agreement on a definition of performance measurement in management accounting research (Franco-Santos et al. 2007; Micheli and Mari 2014) and, more particularly, on KPIs (key performance indicators), a clear conceptualization is necessary. In this paper, a key performance indicator is defined as an instrument used to quantify the efficiency and/or effectiveness of an action, in a manner that is both quantifiable and verifiable (Melnyk et al. 2014).
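To make this definition more concrete, a KPI can be read as a function that turns raw activity data into a single quantifiable and verifiable number. The minimal Python sketch below illustrates one such hypothetical innovation KPI; the metric, field names, and figures are illustrative assumptions and are not taken from the case company or from Melnyk et al. (2014):

```python
# Hypothetical illustration of a KPI in the sense defined above: a quantifiable,
# verifiable instrument summarizing the effectiveness of an action.
# Here: the share of revenue coming from recently launched products.

def new_product_revenue_share(products, current_year, window_years=3):
    """Fraction of total revenue generated by products launched within the
    last `window_years` years (a common, purely illustrative innovation KPI)."""
    total = sum(p["revenue"] for p in products)
    recent = sum(
        p["revenue"]
        for p in products
        if current_year - p["launch_year"] < window_years
    )
    return recent / total if total else 0.0

# Illustrative data only.
portfolio = [
    {"name": "Legacy resin", "launch_year": 2005, "revenue": 900.0},
    {"name": "Specialty coating", "launch_year": 2016, "revenue": 250.0},
    {"name": "Service contract", "launch_year": 2017, "revenue": 100.0},
]
print(f"{new_product_revenue_share(portfolio, 2017):.0%}")  # -> 28%
```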

2.3. KPIs in Innovation

KPIs in innovation are used to boost and benchmark innovation performance (Richtnér et al. 2017). Several streams have contributed to understanding innovation and KPIs (Barros and da Costa 2019; Lövstål and Jontoft 2017; Major et al. 2018). In the MAC literature, little attention has been paid to how KPIs in innovation are used during the use stage of performance metrics and their systems (Bourne et al. 2014; Chenhall et al. 2013; Franco-Santos et al. 2012; Fried 2017; Goshu and Kitaw 2017; Micheli and Mari 2014). In MAC, there is substantial research on the contingency approach in innovation and performance measurement (Burkert et al. 2014; Chenhall 2003; Chenhall and Moers 2015; Gerdin and Greve 2004; Otley 2016). Over the years, the application of the contingency approach in MAC research has grown (Granlund and Lukka 2017; Otley 2016). Recently, however, it has been questioned whether the contingency approach suggests an appropriate fit in MAC, innovation, and performance measurement (Fried 2017; Granlund and Lukka 2017; Melnyk et al. 2014).
Although the contribution of the contingency approach to MAC is significant, in practice, the contingency approach in innovation and MAC may produce a limited picture of the complexity of real-life organizational settings, potentially leading to mixed results and problems in the development of the theory (Chapman 1997; Granlund and Lukka 2017). This is one of the reasons why other MAC researchers recommend applying a less deterministic and functional approach when studying KPIs in innovation at the use stage (Davila et al. 2009; Davila and Oyon 2009). As an alternative, several scholars have applied a more interpretative and relational perspective to accounting and innovation (Ahrens and Chapman 2007; Busco and Quattrone 2017; Mouritsen et al. 2009; Mouritsen and Kreiner 2016). This perspective can overcome shortcomings of the functional approaches in innovation and accounting, and responds to the call for greater use of qualitative research on KPIs in innovation (Barros and da Costa 2019; Lövstål and Jontoft 2017). In this manner, the interaction between context, innovation, and KPIs is studied at several organizational levels (Ahrens and Chapman 2007; Busco and Quattrone 2017; Mouritsen et al. 2009; Mouritsen and Kreiner 2016). This may provide insight into how KPIs are used and sustained under all explanatory contextual circumstances, and thus pressured to "fit" (Busco and Quattrone 2017; Mouritsen and Kreiner 2016).
An interpretative or relational perspective on KPIs' usage in innovation is not about the ideal fit of KPIs with the underlying context (Fried 2017). Instead, this view identifies KPIs as an accounting inscription, which highlights the social-political aspects of accounting (Robson and Bottausci 2018). By regarding KPIs as an accounting inscription, one acknowledges the ability of accounting numbers to be "interpreted" and "translated" by organizational members, and to undergo a transformation through which an entity becomes materialized into a sign, an archive, a document, a piece of paper, or a trace (Busco and Quattrone 2017, 2018; Latour 1983). Viewing accounting as an inscription enables one to make sense of its limits as a representational device, and of the impossibility for accounting to work as an "answer machine" (Burchell et al. 1980; Busco and Quattrone 2018). Latour (1984, 1987, 1996) is associated with introducing the focus upon networks of (accounting) inscriptions, inscription devices, and calculative action in MAC, thereby bridging a key dimension that had often seemed lacking in social-politically inspired research in accounting (Robson and Bottausci 2018).
Referring to KPIs as an accounting inscription may assist in theoretically making sense of their limits as a representational device (Busco and Quattrone 2018). These representational properties of KPIs may enable them to mediate the paradoxes of governance or tensions in innovation (Lövstål and Jontoft 2017; Michaud 2014; Mouritsen et al. 2001; Mouritsen et al. 2009). In parallel with Michaud (2014), MAC research to date appears to show what KPIs in innovation should do, but says rather little about what they actually do and how they are actually used. This indicates that to refer to something is always a difficult, active, and, above all, generative process (Busco and Quattrone 2018). Accounting inscriptions create new spaces of representation that regulate and transform the ethos, boundaries, and operations of the organization, in addition to the human agencies of its employees (Robson and Bottausci 2018). Referring to KPIs in innovation as inscriptions makes it possible to look at the features of the inscription itself, in which the mediation of paradoxes such as control-creativity and control-collaboration in innovation can be observed (Lövstål and Jontoft 2017; Michaud 2014).
This interpretative view in MAC, where KPIs are seen as inscriptions, assists in understanding the influences of accounting on visibility, the shaping of behaviors, and, indeed, the unintended consequences that accounting numbers can create—often referred to as "biasing effects" or "what you measure is what you get" (Robson and Bottausci 2018). Jordan and Messner (2012) argue that "accounting information"—even if available in detailed form—provides only a limited understanding and handling of the complexity of organizational life (Chapman et al. 1997). They also argue that this is one of the reasons why organizational members do not rely "blindly" on accounting information (Jordan and Messner 2012). In their paper, they indicate that organizational members instead contextualize or complement "accounting information" by drawing upon other inscriptions or forms of knowledge (Jordan and Messner 2012).
Drawing upon previous accounting studies (Ahrens and Chapman 2007; Busco and Quattrone 2017; Mouritsen et al. 2009; Mouritsen and Kreiner 2016), this paper documents how KPIs are studied, analyzed, and interpreted in the contexts in which they operate (Hopwood 1972, 1974, 1983; Laine et al. 2016). Observations will thus be made of how organizational members make sense of competing interpretations, and prioritize and integrate them into their organizational practices (Laine et al. 2016).

2.4. Commoditization, Strategic Change, and KPIs in Innovation

Commoditization and strategy are intimately related and, in the processing industry, commoditization can be seen as a driver of strategic change. In the processing industry, the empirical context of this case study, commoditization has increased the pressure to innovate while securing a price-competitive and high-quality raw material supply base with a long-term perspective, in addition to the development of such raw materials (Lager 2016; Reimann et al. 2010). Commoditization, as a strategic change, affects the control systems that help managers accomplish innovation (Chenhall and Moers 2015).
Commoditization has recently been recognized as a phenomenon that occurs in many industries and that creates a particular set of challenges when controlling for innovation (Bromwich and Scapens 2016; Messner 2016). The occurrence of commoditization challenges the organization to innovate to create value, which in turn is reflected in the KPIs and their use. Various studies have examined the effects of commoditization on strategy, and suggest that organizations faced with commoditization usually choose strategies that create value and differentiation (Matthyssens and Vandenbempt 2008). Auguste et al. (2006) add that companies should not only understand the new strategic rules of commoditization, but should also integrate these rules into their existing internal operations. This implies that KPIs in innovation, as control practices, will be affected by industry pressures such as commoditization (Bromwich and Scapens 2016; Messner 2016).
Several strategies can be implemented in response to the competitive pressures of commoditization. In this context, the existing literature suggests shifting from basic product offerings to service-based value concepts (Matthyssens and Vandenbempt 2008). Both academia and practice highlight the importance and difficulty of this transition to service-based value offerings (Kowalkowski et al. 2015). In recent decades, scholars have suggested several strategies to make this transition to service-based value concepts and, recently, digitalization and Industry 4.0 have been suggested as tools in this transition (Kowalkowski et al. 2015; Kowalkowski et al. 2017; Vendrell-Herrero et al. 2017).
In essence, there are many strategic implications of commoditization, and several value-added strategies can be used to regain competitiveness (Kashani 2006). Kowalkowski et al. (2015) identify three service growth strategies, i.e., to become (1) an availability provider; (2) a performance provider; and (3) an "industrializer". Other de-commoditization strategies are suggested by Matthyssens and Vandenbempt (2008). Here, de-commoditization is the process of restoring differentiation between a company's products and services and those offered by competitors (Matthyssens and Vandenbempt 2008). De-commoditization strategies can be developed based on three value propositions, i.e., differentiation based on (1) product innovation and superior product qualities; (2) service innovation and customer bonding; and (3) operational excellence and fair value solutions (Matthyssens and Vandenbempt 2008). Newer strategies try to combine innovation and strategy to counter the effects of commoditization, as illustrated by Goffin et al. (2021). They illustrate four such strategies; in summary, their data show that innovation and innovativeness are the key means for companies to address commoditization (Goffin et al. 2021).
In the context of this case study, commoditization and associated strategic change are relevant. The interaction between commoditization, strategy, innovation, and KPIs is a unique industry phenomenon to be captured (Messner 2016). This interaction, at different organizational levels, requires KPIs to be in alignment with other controls and strategies (Barros and da Costa 2019). According to MAC research, continuously building new managerial control instruments is the best means to deal with these uncertainties and complexities (Fried 2017; Fried et al. 2017). This industry context may thus explain why there is a difference between the KPIs designed for innovation and those used in practice (expected vs. actual use). The used KPIs may not provide the accounting information necessary to understand the complexity of organizational life (Chapman et al. 1997).

3. Research Setting and Methods

3.1. Research Design

To capture the usage of KPIs in innovation, a qualitative case study was conducted at an innovation department in a single organization. Among qualitative methods, case studies play a particularly important role, because they are one of the most widely adopted qualitative methods in business performance and other organizational studies (De Massis and Kotlar 2014; Eisenhardt 1989). A qualitative approach was chosen because it can reveal new insights with the potential to take theorizing in completely new directions (Bansal et al. 2018).
The current paper is among the first to present a case study on KPIs in the processing industry, and it provides insight into how KPIs in innovation are used in this specific industry, which strategically needs to adapt to commoditization (Messner 2016). It shows KPIs as an accounting practice that is dynamic rather than static. By including several levels of observation (Moll 2015), the dynamics of industry context, strategy (commoditization), and KPIs can be captured in a less functional and more relational manner.
Several studies show that there is a need for case studies that explore how KPIs within an organization evolve in response to changes in the organization’s internal and external operating environments, especially when innovative responses are required (Bourne et al. 2014; Zarzycka et al. 2019). Because the existing literature on KPI usage in innovation in the processing industry is still scarce, exploratory research using a case study was chosen as the preferred methodology to build knowledge about the phenomenon (Bromwich and Scapens 2016; Fried 2017; Messner 2016; Stebbins 2001; Yin 2013). Exploration, according to Stebbins (2001), is a broad-ranging, purposive, systematic, prearranged undertaking designed to maximize the discovery of generalizations, leading to a description and understanding of an area of social science (such as MAC).
This research took place at the innovation department, established at least a decade ago, of a company in the process industry that operates production sites and sales offices globally. The parent company generates annual revenues of more than EUR 1.5 billion, and the company's headquarters are located in a Nordic country. It has sales and production in the Americas, Asia, and Europe, but the majority of its business is within European countries. The organization has end customers in a large number of sectors, including automotive, construction, electronics, medical, and feed and food. As at many other large organizations, the innovation department and its teams are cross-functional. In addition to R&D and engineering, people from marketing and manufacturing are also involved in the projects to obtain a nuanced view. Because the innovation department is diffused globally, project members usually work internationally, located in different subsidiaries. The innovation department is a dispersed organization with multiple departments operating in several regional markets. As a result, the innovation KPIs and organization in this study can be thought of as a single case with multiple subcases (Yin 2013). The case was chosen based on the results of a pre-study undertaken in 2014 and recommendations from industry experts.

3.2. Data Collection

The empirical data were composed of semi-structured one-on-one interviews involving 28 individuals (see Table 1). The data collection took place between September 2017 and June 2018, with visits to several locations several times each month. Although semi-structured interviews made up the primary source of data, additional sources (websites, press releases, internal presentations/documentation, and project reports) were used to triangulate the findings. To prepare the interview rounds and select the case, two preliminary meetings were initiated at the organization's headquarters. First, exploratory meetings were held with the responsible innovation controller and the senior innovation advisor. Notes were taken during these meetings, and internal documentation regarding the organization's history of innovation was provided. Second, respondents were selected to give their views on the existing KPIs in innovation. Interviewees were chosen through snowball sampling via key informants in the company. All respondents were sent a personal invitation to participate in the interviews, which took place in a conference room at their workplace or in their office. In total, the researcher conducted 28 interviews with innovation personnel, project management, business/sales personnel, and senior management, mostly lasting between 20 minutes and one hour (see Table 1).
The initial questions related to the role of the individual, their responsibilities, their tasks in the organization, and their perceptions/definitions of innovation. Next, the respondents were asked about the KPIs of innovation in use, and about the KPIs' functionality, development possibilities, and challenges. Different organizational members were asked similar questions to gain different perspectives on the same topic and/or to confirm individual accounts (see Table 1). In the last stage, referenced internal documents, such as presentation slides shown at or distributed after meetings, and public documents, such as the parent company's recent annual reports, were gathered and compared to supplement the interviews.

3.3. Data Analysis

The data analysis aimed to understand what KPIs were used to measure innovation, and how these identified KPIs were interpreted and used by the interviewees. Data analysis consisted of a content analysis method based on the guidance of the prior literature (Bloomberg and Volpe 2018; Brinkmann and Kvale 2015; Schreier 2012). In this study, the researcher moved back and forth between data, theory, and related literature to make sense of the observations (Ahrens and Chapman 2006; Jørgensen and Messner 2010). Through a series of iterations between data collection, analysis, and literature review, the data analysis progressed incrementally from raw data to theoretical interpretation (Eisenhardt and Graebner 2007; Graebner et al. 2012). Several steps were taken to ensure data trustworthiness. First, multiple sources of data were used to triangulate perspectives (Eisenhardt 1989). Second, considerable time was spent in the field, using multiple methods of observation and remaining aware of one's behavior as a field researcher (Ahrens and Chapman 2006; McKinnon 1988). Third, to create a thorough audit trail, the researcher meticulously recorded all of the acquired data (interviews, observations, papers, and emails). Fourth, the researcher showed selected results to research participants so that they could confirm the accuracy of the information and narrative description. Fifth, to improve the reliability and validity of interpretations, the researcher solicited feedback from colleagues both formally (at conferences and seminars) and informally on the emergent concepts and themes. Although the data analysis process was far from linear, three steps of analysis were followed: (1) coding; (2) condensing/categorizing the material into themes; and (3) interpreting the condensed data from the previous stage in light of the literature on innovation and KPIs.
More specifically, this analytical process was shaped by iterative steps of coding and data reduction, based on themes identified in the existing scholarly literature (a priori categories), in addition to topics that emerged from the empirical data (in vivo codes). The a priori categories are described in the preceding sections and consist of the perceptions and interpretations of organizational members of innovation and the associated KPIs. The influence of industry context and strategy on innovation and KPIs is another theme that was identified in the observations (in vivo) and later matched to the a priori themes. For example, during the interviews, it became apparent that innovation was changing and that there was a demand for service to be provided by the organizational members working with innovation. An explanation for these tensions between the a priori and in vivo codes was found by returning to the literature on commoditization and strategic change. Furthermore, systematic readings and re-readings, in addition to writing up the thematically structured data, were used for analysis and additional data reduction.
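As a rough illustration of the mechanical side of this coding and condensing step (the actual analysis was iterative and interpretative, not automated), the following Python sketch shows how interview fragments might be tagged against a priori categories and tallied into themes; the category labels, keywords, and fragments are hypothetical:

```python
from collections import Counter

# Hypothetical a priori categories (from the literature) mapped to keywords.
# This only sketches the tagging/condensing step described above.
A_PRIORI = {
    "perception_of_innovation": ["innovation", "idea"],
    "kpi_use": ["kpi", "measure", "indicator"],
    "strategic_change": ["commoditization", "customer value", "service"],
}

def code_fragment(fragment, categories=A_PRIORI):
    """Return the a priori categories whose keywords appear in a fragment."""
    text = fragment.lower()
    return [cat for cat, keywords in categories.items()
            if any(kw in text for kw in keywords)]

# Illustrative interview fragments (not verbatim quotes).
fragments = [
    "We first need to define what we mean by innovation.",
    "I get very little information from the KPIs in my everyday work.",
    "We are moving from commodities towards service and customer value.",
]

theme_counts = Counter(cat for f in fragments for cat in code_fragment(f))
print(theme_counts)
# e.g. Counter({'perception_of_innovation': 1, 'kpi_use': 1, 'strategic_change': 1})
```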

4. Findings

The empirical material presented in the following sections illustrates how KPIs of innovation are used. To begin, the interaction between the innovation context and the KPIs in use is captured. Here, this interview series explores how innovation has changed. This is followed by a review of the current KPIs of innovation in use within the company. In the last section, the paper more closely examines the current KPIs in innovation and how they are used in combination with supplementary and complementary information, referred to as reading-between-the-lines information.

4.1. Innovation Organization

To understand the usage of KPIs in innovation, the respondents were first asked to present themselves and their role with or in the innovation department. In addition, they were asked to describe innovation in their current organization.
Before we start controlling and measuring innovation, we first need to define what we mean by innovation […] I do not think we have come much further right now in defining innovation and discuss what we mean by innovation in total? What do we mean by innovation in each of our business areas? That is in some way the basis for how we want to measure and control this [innovation] […] innovation can be so many things […] it can be technical innovation, but also logistics […] innovation can be so incredibly difficult.
—Business controller
Each of the respondents appears to have their unique view and perception of what innovation is. This is reflected in their answers:
I think [innovation is] about taking care of the ideas that exist and ensure that we can develop and improve things and come up with new things. We need to handle it both in the short term and in the long-term
—senior innovation staff
Mostly they [innovation staff] work in new innovation projects, new products that they develop. I do not work with this in my project, but this with processes, development projects, and existing products […] Right now I am working with innovation and new processes and we are trying to duplicate our facilities from one site to another. Three geographical locations are involved in this process. All locations have different prerequisites to test in the lab, so I need to assign it where the capacity is
—project leader
There are different aspects to [innovation organization]. It depends on what part of innovation you are talking about. The one I am talking about is technology development, that is, technology. We manufacture some products. We must have technology to manufacture, and my role is precisely in that development process and it then overlaps, with other innovation resources and with technology and operational resources. The whole process towards the development process is going from innovation and then eventually overlaps with operational group.
—senior innovation staff
Innovation at this company is complex. It is much more complex than where I worked before. From what I have seen up to now it is important for innovation to create products and applications that are going from commodities to high-end specialty productions.
—senior innovation staff
It’s a bit of a difficult question, but in general, I still find the organization quite smooth and fast. In general, I think it can be much better. I’m relatively happy with how things work [in innovation], but I think there’s a huge potential for development.
—project leader
The definition of [innovation] is to create profitable growth and at the same time be technical support in the relationship with customers. Provide technical support for our production and being involved in investment projects covered the long-term technical part of investment projects.
—director business
I have worked a few decades for this company, in the innovation department. In the beginning, we just were just a department at one location. Nowadays, innovation is conducted in several departments, locations, and organizational levels in the company.
—senior innovation consultant
We assess technology, product, or application [in innovation]. It can be any type. Do not forget that innovation is more than just a product. But the financial innovations and business models are not handled by us
—senior innovation staff
The innovation group is quite huge and includes other project managers with quite a lot of interaction, innovation. We have quite good interaction with other departments, and we are a forced to do
—project manager/innovation staff
Before we had an inside-out approach to innovation. Now with the new board, we have prioritized an outside-in approach.
The company decided to focus strategically on creating and delivering customer value, making the delivery of customer value an important guideline for innovation. This change of focus has had an impact on innovation, because with this new strategic focus employees are encouraged to think creatively about customer needs (official documentation). The inside-out approach, however, has been discouraged: innovation should not be done as a standalone process; it should be done with an understanding of the customer promise and a commitment to deliver on it (official documentation).
The official documentation underlines that innovation is not a standalone process and that innovation requires an organizational effort, not only a departmental effort. The official documentation, in addition to the respondents, illustrates that innovation in this organization encompasses the entire chain from idea to value (Davila et al. 2012).

4.2. The KPIs in Use in Innovation: Expected vs. Actual

The following explores how the respondents use the information derived from the "official" KPIs in their daily work. A goal alignment program was implemented to ensure that innovation became an organizational effort. This program aligned the mission, vision, and subsequent KPIs at all levels of the company, and ensured a focus on the customer at each level (interviews with the CEO, a director, and staff). As shown in the previous section, the written documentation indicates that the innovation organization and its strategy have been subject to change due to commoditization. In this part of the case study, the focus is on how the "official" KPIs are actually used by each of the respondents.
“In 2014, a member of the board and I decided to change the way we follow up innovation, and this KPI for innovation is still found in our organization today.”
—CEO
These interviews were conducted in 2017, at the time when the “official” KPIs in innovation had been in place and in use for quite a while (2014–2017). Asking the respondents about the use of these “official” KPIs, and what information they derive from them, led to unexpected responses:
I get no information from the [official] KPIs. […] To make it more concrete, I do use pipeline value in my job so it is clear that those with the highest pipeline value are given a higher priority than those with a lower pipeline value. That way I use it, but it gives no extra information so to speak.
—director of business
To be honest, I have not been looking at the KPIs for innovation, because at the moment they do not give me any information that I can use. I really hope that future work will develop relevant KPIs for this area of our business.
—Business controller
Yes, we get values [from the KPIs]. […] Sure, it’s a measure of how we work, but it’s not the whole measure. Because we have a limited number of KPIs, it does not cover everything. And sometimes you get it right with a KPI and sometimes you get it wrong with the KPI.
—innovation director
At a senior level, these official KPIs are questioned. In particular, how they are calculated and derived appears to be a point of discussion in some of the interviews. At the middle-manager and staff levels, the official KPIs are largely seen as something used by top management rather than something influenced by the respondents' own actions.
I get very little information from the KPIs. They [the KPIs] control very little of my everyday life, I think. Because [the KPIs] are mostly used by top management, top management follows them [KPIs] up. My opinion is that they are not communicated so clearly downwards. I’m probably not the only one who says so.
—Project leader
Since I have worked here less than a year, I have not really understood which KPIs in innovation are used in the company and how I am expected to use them. At my previous employer, this was clearer from the beginning because we have had many similar projects and predetermined KPIs. This organization has a much broader innovation portfolio that maybe explains why it is difficult for me to understand the innovation KPIs.
—Senior innovation staff
Yes, as I said, I’m pretty new to my job. but I have not encountered them [KPIs] very much. I have heard that they [KPIs] were mentioned at the presentation and I know them. [….] I have written down those I know, so to speak. What I may be missing is that they are not really so gathered in one place. I think the KPIs are anonymous, I do not think people take to know them very well. If you gather [KPIs] at one place, make them uniform in some standardized way they would be easily manageable. This could be a suggested improvement.
—Project leader
Matching the official KPIs with the KPIs that are used in day-to-day work yields a surprising insight. A review of the responses shows that approximately 90 percent of the respondents do not actually use the official KPIs for innovation in their day-to-day job. Respondents are aware that there are KPIs in innovation, but do not appear to know how to apply them in daily work. The value of each of the KPIs is difficult to assess and comprehend because it is aggregated and cannot be influenced by individual action. The pipeline value, for example, is only adjusted quarterly and moves marginally.
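A small numerical sketch may clarify why an aggregated KPI such as pipeline value is difficult for an individual respondent to influence. The project names and figures below are purely hypothetical; they only illustrate the order-of-magnitude effect described by the respondents:

```python
# Hypothetical illustration: an aggregated "pipeline value" KPI, updated
# quarterly, barely moves when a single project's estimate changes.

pipeline = {            # hypothetical projects and their estimated values (MEUR)
    "Project A": 40.0,
    "Project B": 35.0,
    "Project C": 25.0,
    "Project D": 20.0,
}

before = sum(pipeline.values())            # 120.0 MEUR
pipeline["Project D"] += 2.0               # one team revises its estimate upward
after = sum(pipeline.values())             # 122.0 MEUR

print(f"Quarterly change in pipeline value: {after - before:+.1f} MEUR "
      f"({(after - before) / before:+.1%})")
# -> Quarterly change in pipeline value: +2.0 MEUR (+1.7%)
```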

4.3. KPIs and Reading-between-the-Lines Information

In the previous interviews, it became clear that the "official" KPIs in innovation do not measure everything, and in most cases provide little information to the organization. The Vice President questions whether and how they can measure everything of financial value that remains hidden in innovation:
Can we measure the value of what we [innovation] do on the Intellectual Property side? How can we measure the value of what we do on the Open Innovation side? At the moment it is not visible in these KPIs. How can we measure the value of what we [innovation] do on the support and service side because it is not visible in these KPIs and we spend about 30 percent of our time there?
—Vice President
When exploring the use of the official KPIs and the suggestions for future KPIs for innovation, a native language word is encountered, which translates in this context to mean reading-between-the-lines information.
I think this (official) KPI has reading-between-the-lines information. For example, this other unit has assigned their project a much higher value than we have. This has led us to believe that this KPI is not an exact figure but rather it is relative to what the procedure is.
—Technology and Business manager
However, there are other areas of innovation where there is reading-between-the-lines information, for example, the fuzzy front end of innovation, or the open-ended processes of innovation. Here, it becomes clear that the KPI provides only part of the required information and is therefore complemented with reading-between-the-lines information:
Now concerning the fuzzy part of innovation, how we value the ideation process, this is not clear in the (official) KPI. We try to contribute to the KPI here [points at the whiteboard]. This is the point where we need to contribute at least three projects, but of course, we needed to review between 75–150 ideas to make this contribution.
—Senior researcher
This confirms the first interview response in this section: an identified contribution is made by the fuzzy front end of innovation to the innovation project portfolio. The reading-between-the-lines information provides supplementary information to the official KPIs, and an explanation for the part that remains unrecorded in the official KPIs.
Intellectual property and intangibles are another contribution to innovation value; here, another source of reading-between-the-lines information can be found. This reading-between-the-lines information comes from the patent office. Although statistics about filed patents are proudly shown and reflected in official statements, their value is less obviously incorporated in the "official" KPIs, as confirmed in the first statement of this section.
Here is my presentation that I gave at one of my meetings about intangible assets. As you can see, the value of intangible assets in companies has risen steeply over the last decade. This is why patent management is so important. We provide and safeguard a lot of value for the organization. I report very detailed information on patents and their value to management, but this is mostly outside the regular KPI reporting.
—Patent manager
Another instance of information not being part of any KPI concerns delivering customer value (see the VP’s statement above). Because management wants to increase customer focus and the delivery of service solutions, the KPIs in place should reflect delivery on this promise. However, the customer value created is measured and quantified by increased sales and appears to ignore the 30 percent of innovation staff time allocated to customer service.
With the aid of a prioritization tool, we (innovation staff) are trying to make decisions about which customer gets what time, and when they have priority. This is not part of our normal systems and KPIs, but we are trying to integrate the prioritization that we now do manually into the new system, which will automatically show which customers have priority. I am not sure if this is going to help. Sometimes it feels like demands are made without prior notice and are given priority by the company or management, and then the system is useless.
—Innovation staff
This section shows that, although official KPIs are used in innovation, they provide little information on innovation and the value it adds to the rest of the organization. The official KPIs, although they risk being ill-suited to the innovation organization and strategy, are not discarded in practice. An explanation of why these KPIs are not discarded may be the reading-between-the-lines information found in the interviews. This scorekeeping and unrecorded information is used in addition to the official KPIs to support the value that innovation provides to the bottom line.

5. Discussion

This paper explores the use of official KPIs of innovation in an organization strategically challenged by commoditization. In so doing, it may extend the previous understanding of KPI usage in innovation, which focuses mostly on KPI usage in NPD, R&D, and advanced research (Major et al. 2018; Moll 2015). The findings confirm that accounting in innovation features multiple stakeholders and individual managerial actors (Barros and da Costa 2019; Major et al. 2018). In the company examined in the case study, innovation is diffused globally, involving multiple stakeholders and managerial actors located in different subsidiaries.
As one of the project managers at the innovation department comments: "we are trying to test a couple of things in the lab here on distance, we have two other [remote] locations involved to see whether the new facilities will work and the products can be produced".
Earlier MAC research primarily focused on NPD and R&D, but more recently, the importance of extending the concept of innovation to issues such as the business model, service, and open innovation, in addition to collaborative environments, has been stressed (Bedford 2015; Chenhall and Moers 2015; Davila et al. 2012; Fried et al. 2017; Moll 2015). This study shows that innovation is indeed an organization-wide effort, and that achieving innovation requires the individual and collective contributions of several stakeholders and individual managerial actors (Major et al. 2018). Innovation thus includes many processes, from idea generation to execution and value capture, located in different subsidiaries and countries (Davila et al. 2012).
In this case study, the organization and strategy of innovation were subject to change over time. Earlier, the answers of the senior innovation consultant and the CEO underlined this change in innovation. The innovation organization and its strategy appear to have shifted in response to commoditization (Kashani 2006; Reimann et al. 2010). The focus areas of the case company's innovation organization and strategies appear to correspond with all three of the value-added strategies discussed in Section 2.4. This change in innovation, at different levels, requires KPIs to be in alignment with other controls and strategies (Barros and da Costa 2019). MAC research recommends continuously developing new managerial control instruments to deal with these uncertainties and complexities in innovation (Fried 2017). In practice, the alignment between the innovation strategy and organization and the KPIs is reflected in the information derived from the "official" KPIs in innovation. The evidence gathered shows that KPIs in innovation do not always align with shifts in innovation strategy and organization. This study has presented reasons why these "official" KPIs remain in place, even though they could be removed or updated to align with the context.
Next, this study examined the situated use of performance indicators using a more interpretative and relational perspective on accounting (Ahrens and Chapman 2007; Busco and Quattrone 2017; Mouritsen et al. 2009; Mouritsen and Kreiner 2016). In so doing, it may extend the previous understanding of KPI usage in MAC research, which has taken a more functional, contingency-based approach (Bititci et al. 2006; Bourne et al. 2000; Fried et al. 2017; Kennerley and Neely 2002; Moll 2015). Drawing upon previous accounting studies (Ahrens and Chapman 2007; Busco and Quattrone 2017; Mouritsen et al. 2009; Mouritsen and Kreiner 2016), the findings show how to study, analyze, and interpret KPIs in the contexts in which they operate (Hopwood 1972, 1974, 1983; Laine et al. 2016). This shift has allowed the researcher to observe the role KPIs play in innovation and, more specifically, how KPIs may shape the way organizational actors make sense of competing interpretations, and prioritize and integrate them into their organizational practices (Laine et al. 2016). The statement of the innovation director is an example of this: "Yes, we get values [from the KPIs]. […] Sure, it's a measure of how we work, but it's not the whole measure. Because we have a limited number of KPIs, it does not cover everything. And sometimes you get it right with a KPI and sometimes you get it wrong with the KPI". At each moment, the innovation director can decide whether the KPI is wrong or right, and makes sense of competing interpretations. Ultimately, these interpretations are prioritized and integrated into the next action steps.
The interpretative view, as an alternative to a functional view, shows how official KPIs, as inscriptions, translate innovation (Mouritsen et al. 2009; Robson and Bottausci 2018). The differences between these views (functional and interpretative) can be illustrated with the findings of the study. From a functional point of view, the official innovation KPIs, designed in 2014, appear to be clear and appropriate, as the CEO's statement highlights: "In 2014, a member of the board and I decided to change the way we follow up innovation, and this KPI for innovation is still found in our organization today". On paper, the official innovation KPIs thus appear to have an ideal fit with external and internal conditions, and to be adapted to firm-specific objectives of measurement (Brattström et al. 2018). When these "official" KPIs are used in innovation, however, a different story is told, displaying something closer to a quasi-fit or even a misfit (Fried 2017). The interpretative perspective uncovers that KPIs used in innovation are translated differently by different respondents. As one of the respondents comments: "I am aware that we have innovation KPIs at this company, but I do not see how I can use them in my daily tasks. I would like to have the KPIs available in one place and more visible". The interpretative view shows that "official" KPIs in innovation are used and interpreted differently by each of the respondents in the organization. Here, the representational value of a KPI becomes apparent, where the official KPI is "translated" differently at different organizational levels, by different respondents, and at different points in time (Mouritsen et al. 2009; Robson and Bottausci 2018).
Each organizational member appeared to have their own use and translation of the official KPIs in innovation, depending on their role and level. This is illustrated by the diversity of responses in the section on KPI usage. The different translations of the KPIs in innovation throughout the organization may show parallels with the study conducted by Mouritsen et al. (2009). In their empirical account, Mouritsen et al. (2009) show that "management accounting calculations do not calculate innovation activities per se but they mediate it. They hardly make the innovation more transparent because they do not model it; rather they mediate between innovation activities and firm-wide concerns and influence the intensity and direction of innovation [activities]. Management accounting calculations add a new perspective to innovation [activities]". Applying this reasoning to this study, innovation KPIs present a representational value and do not calculate an exact value or amount; thus, they provide mediation in innovation between levels of the organization (Mouritsen et al. 2009). It could be argued that in innovation, KPIs in use can mediate between the paradoxes of organizational governance and the tension between control and creativity (Lövstål and Jontoft 2017; Major et al. 2018; Michaud 2014).
The previous empirical analysis section showed that, in innovation, although the official KPIs in use appear not to provide the necessary information, they are not discarded. Thus, apparently, poor-fit metrics do not necessarily need to be ineffective, harmful, or destructive (Bourne et al. 2014; Fried 2017; Melnyk et al. 2014). This study shows that there is indeed a value in "lagging" performance indicators (Melnyk et al. 2014) and that companies might have reason to sustain them. The reading-between-the-lines information found in the interviews might explain why KPIs with a poor fit continue to be in place and are not discarded. Empirically, poor-fit KPIs produce processes through which incompleteness, partial references, failed accounts, lacking observations, and barren signs all tightly fit together in generating not only faulty and partial inventories (i.e., representations), but also re-combinations, inventions, and alternatives (Busco and Quattrone 2015, 2017, 2018). The fact that these poor-fit and incomplete KPIs are not discarded can be seen as an extension of the findings of Busco and Quattrone (2017), who show that "incompleteness provides a space for both reducing organizational complexity (to pragmatically manage it) and expands our knowledge of it through a maieutic process of interrogation". This may be a first step in understanding how accounting does not need to "ideally" fit the context of innovation; instead, it shows how accounting in a quasi-fit or misfit can manage the paradoxes and tensions in innovation (Jordan and Messner 2012; Lövstål and Jontoft 2017; Michaud 2014; Robson and Bottausci 2018).
These empirical observations lead to another important insight concerning the incompleteness of KPI accounting representations (Jordan and Messner 2012; Wouters and Wilderom 2008). Specifically, this study finds support for the view that KPIs are not used in isolation (Rowe et al. 2012). The reading-between-the-lines information supplemented or complemented the official KPIs and hardened soft data (Rowe et al. 2012). As explained earlier, one explanation of why the official KPIs in use are rendered incomplete may be the shift towards innovation becoming an organization-wide effort. The reading-between-the-lines information accounts for what is unmeasurable or not measured in innovation, such as intangibles, customer support, and customer value (see Section 4.3). The existence of the reading-between-the-lines information may show that the KPIs in innovation do not deliver what they promise, but generate effects other than those their designers wanted them to produce (Busco and Quattrone 2015; Mouritsen and Kreiner 2016).

6. Conclusions

The findings of this study explore the use of KPIs in innovation and advance our understanding of MAC research. This study highlights that “official” and “predetermined” KPIs in innovation are used despite their poor fit to innovation (Fried 2017; Melnyk et al. 2014). Thus, this study has explored how, under the strategic pressures of commoditization, KPIs are used that are lagging or incomplete in relation to their underlying context (Jordan and Messner 2012; Melnyk et al. 2005; Melnyk et al. 2014). It highlights a certain misfit between KPIs and innovation, in that the KPIs are not contingent upon changes in innovation strategy and organization. The lack of fit, misfit, or quest for the perfect quasi-fit illustrated here need not hinder the use of KPIs or render them obsolete (Busco and Quattrone 2017; Fried 2017).
This interview series involved multiple organizational levels of analysis and uncovered that the KPIs used in innovation are associated with processes of “inscription” and “translation”, in addition to processes of “contingency” and “fit” (Busco and Quattrone 2018; Fried 2017; Mouritsen et al. 2001; Mouritsen et al. 2009). This study highlights that the perceived view and definition of innovation at different organizational levels and points in time affect the use of KPIs in innovation. The findings show that innovation is no longer a departmental duty, but concerns all organizational levels responsible for the chain from idea to value (Davila et al. 2012). The empirical evidence indicates that the evolution of innovation towards an organization-wide effort has affected the KPIs that are used and their perceived completeness and fit with innovation (Chenhall and Moers 2015; Fried 2017). When KPIs were used in innovation, they did not fit the context of innovation in which commoditization and the associated strategic rules were implemented, and they were thus used in unintended ways. This study concludes that KPIs that poorly fit innovation are seldom used as standalone metrics, which was unexpected (Busco and Quattrone 2015, 2017; Rowe et al. 2012). KPIs with a poor fit in innovation display incompleteness in use, but, surprisingly, this did not lead to ineffective, harmful, or destructive consequences (Bourne et al. 2014; Fried 2017; Melnyk et al. 2014).
The implications of the findings for management are, first, that it is important to review the use of KPIs in organizations regularly, especially KPIs operating in innovation. The intangible aspects of certain types of innovation, e.g., service innovation, are difficult to assess with KPIs, which complicates the implementation of appropriate metrics. Secondly, management should ensure that everyone working on innovation is on the same page, which implies that the definition of innovation and how it is accomplished in the organization should be a focal point of discussion before any metrics are put in place (Adams et al. 2006; Edison et al. 2013). Thirdly, not only KPI design, but also KPI usage, functionality, and regular updates and reviews are necessary to keep up with the rising complexity of innovation and metrics (Braz et al. 2011; Nudurupati et al. 2016; Okwir et al. 2018). Fourthly, KPIs in innovation, especially those used to evaluate and reward individual, team, or organizational actions, will subsequently be used as a social-political tool or, as suggested in this paper, as an inscription. Thus, KPIs are “translated”, “interpreted”, and “complemented” to build understanding or to establish power. Managers should be aware that KPIs do not just report accounting information, but can be used as tools in organizational political games (Kirsner 2015). Similarly, an implication for policymakers and shareholders/stakeholders is that KPIs in innovation tell only part of the story of innovation performance. Policymakers should be aware that “past performance is no guarantee for future success”, especially in the use of KPIs in innovation.
Furthermore, implications for policymakers can be derived from this study, because promoting and measuring innovation in organizations is also of utmost importance to them. In the process of screening and overseeing companies’ innovation, policymakers should be aware of the problems associated with measuring innovation and clarify, together with companies, which type of innovation is being measured and which part of innovation is being measured: its input, output, outcome, or process (Davila et al. 2012; Edison et al. 2013). Another implication for policymakers is not to take the measured value, the accounting number, as a given, but to allow organizational members to complement and supplement these static and quantitative inscriptions with other types of qualitative data. To assess innovation performance in organizations, policymakers should encourage and apply slack in their KPIs; this will provide organizations with the leeway to regroup and manage innovation with “degrees of failure” and an “error margin”. Here, the conversational value of KPIs in innovation is more important than an exact number (Brattström et al. 2016).
In addition to its contributions, this paper is subject to limitations and opens possibilities for future research. This study involves only a single case study at one organization and can be seen as explorative. The challenges found in using KPIs in innovation provide researchers with opportunities to explore and explain: (1) how innovation influences KPI usage; (2) how understanding KPI usage can influence future KPI review, update, and design; (3) how multi-level analysis of KPI usage can influence future KPI review, update, and design; (4) how strategic pressures and industry context can influence KPI usage and alignment with innovation; (5) how incomplete KPIs can still have explanatory power and still be useful in practice; and (6) how applying a “coordination approach” or “orchestrating approach” to existing MCS, existing KPIs, and new KPIs assists MAC researchers in departing from the traditional greenfield approach (Carnes et al. 2017; Lohman et al. 2004). The challenge in MAC research, where KPIs should “fit” or be “contingent” upon their underlying context, shows opportunities for further research into actual rather than “desirable” accounting practice (Chenhall et al. 2013; Chenhall and Moers 2015). Future research into these issues may increase our understanding at several levels of analysis, rather than only one (Busco and Quattrone 2017; Mouritsen et al. 2009; Mouritsen and Kreiner 2016).

Funding

This research was funded by a group of Swedish companies in celebration of LTH’s 50th anniversary, and additionally by the Swedish Research School for Management and IT.

Acknowledgments

This work would not have been accomplished without the valuable comments provided by Lars Bengtsson and Ola Alexandersson. Special thanks go to the editors and the reviewers for their efforts and their critical and constructive feedback, which helped to improve the quality and message of this article.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Abernethy, Margaret A., and Peter Brownell. 1997. Management control systems in research and development organizations: The role of accounting, behavior and personnel controls. Accounting, Organizations and Society 22: 233–48. [Google Scholar] [CrossRef]
  2. Adams, Richard, John Bessant, and Robert Phelps. 2006. Innovation management measurement: A review. International Journal of Management Reviews 8: 21–47. [Google Scholar] [CrossRef]
  3. Ahrens, Thomas, and Christopher S. Chapman. 2006. Doing qualitative field research in management accounting: Positioning data to contribute to theory. Accounting, Organizations and Society 31: 819–41. [Google Scholar] [CrossRef]
  4. Ahrens, Thomas, and Christopher S. Chapman. 2007. Management accounting as practice. Accounting, Organizations and Society 32: 1–27. [Google Scholar] [CrossRef]
  5. Auguste, Byron G., Eric P. Harmon, and Vivek Pandit. 2006. The right service strategies for product companies. McKinsey Quarterly 1: 40. [Google Scholar]
  6. Bansal, Pratima, Wendy K. Smith, and Eero Vaara. 2018. New ways of seeing through qualitative research. Academy of Management Journal 61: 1189–95. [Google Scholar] [CrossRef] [Green Version]
  7. Barros, Rúben Silva, and Ana Maria Dias Simões da Costa. 2019. Bridging management control systems and innovation. Qualitative Research in Accounting & Management 16: 342–72. [Google Scholar]
  8. Bedford, David S. 2015. Management control systems across different modes of innovation: Implications for firm performance. Management Accounting Research 28: 12–30. [Google Scholar] [CrossRef]
  9. Bititci, Umit S., Kepa Mendibil, Sai Nudurupati, Patrizia Garengo, and Trevor Turner. 2006. Dynamics of performance measurement and organisational culture. International Journal of Operations & Production Management 26: 1325–50. [Google Scholar]
  10. Bititci, Umit, Patrizia Garengo, Viktor Dörfler, and Sai Nudurupati. 2012. Performance Measurement: Challenges for Tomorrow*. International Journal of Management Reviews 14: 305–27. [Google Scholar] [CrossRef] [Green Version]
  11. Bloomberg, Linda Dale, and Marie Volpe. 2018. Completing Your Qualitative Dissertation: A Road Map from Beginning to End. 4 vols. Temecula: Sage. [Google Scholar]
  12. Bourne, Mike, John Mills, Mark Wilcox, Andy Neely, and Ken Platts. 2000. Designing, implementing and updating performance measurement systems. International Journal of Operations & Production Management 20: 754–71. [Google Scholar]
  13. Bourne, Mike, Steven A. Melnyk, Umit Bititci, Ken Platts, and Bjørn Andersen. 2014. Emerging issues in performance measurement. Management Accounting Research 2: 117–18. [Google Scholar] [CrossRef]
  14. Bourne, Mike, Steven Melnyk, and Umit S. Bititci. 2018. Performance measurement and management: Theory and practice. International Journal of Operations & Production Management 28: 2010–21. [Google Scholar] [CrossRef] [Green Version]
  15. Brattström, Anna, Johan Frishammar, Anders Daniel Richtnér, Jennie Bjork, and Mats Magnusson. 2016. Boxing-In and Box-Breaking of Attention: A Process Model of Innovation Measurement. Academy of Management. [Google Scholar] [CrossRef]
  16. Brattström, Anna, Johan Frishammar, Anders Richtnér, and Dane Pflueger. 2018. Can innovation be measured? A framework of how measurement of innovation engages attention in firms. Journal of Engineering and Technology Management 48: 64–75. [Google Scholar] [CrossRef]
  17. Braz, Renata Gomes Frutuoso, Luiz Felipe Scavarda, and Roberto Antonio Martins. 2011. Reviewing and improving performance measurement systems: An action research. International Journal of Production Economics 133: 751–60. [Google Scholar] [CrossRef]
  18. Brinkmann, Svend, and Steinar Kvale. 2015. Conducting an interview. Interviews. Learning the Craft of Qualitative Research INTERVIEWING 3: 149–66. [Google Scholar]
  19. Bromwich, Michael, and Robert W. Scapens. 2016. Management accounting research: 25 years on. Management Accounting Research 31: 1–9. [Google Scholar] [CrossRef]
  20. Burchell, Stuart, Colin Clubb, Anthony Hopwood, John Hughes, and Janine Nahapiet. 1980. The roles of accounting in organizations and society. Accounting, Organizations and Society 5: 5–27. [Google Scholar] [CrossRef]
  21. Burkert, Michael, Antonio Davila, Kandarp Mehta, and Daniel Oyon. 2014. Relating alternative forms of contingency fit to the appropriate methods to test them. Management Accounting Research 25: 6–29. [Google Scholar] [CrossRef]
  22. Busco, Cristiano, and Paolo Quattrone. 2015. Exploring how the balanced scorecard engages and unfolds: Articulating the visual power of accounting inscriptions. Contemporary Accounting Research 32: 1236–62. [Google Scholar] [CrossRef] [Green Version]
  23. Busco, Cristiano, and Paolo Quattrone. 2017. In Search of the “Perfect One”: How accounting as a maieutic machine sustains inventions through generative ‘in-tensions’. Management Accounting Research 39: 1–16. [Google Scholar] [CrossRef]
  24. Busco, Cristiano, and Paolo Quattrone. 2018. Performing business and social innovation through accounting inscriptions: An introduction. Accounting, Organizations and Society 67: 15–19. [Google Scholar] [CrossRef] [Green Version]
  25. Carlsson-Wall, Martin, Kalle Kraus, and Martin Messner. 2016. Performance measurement systems and the enactment of different institutional logics: Insights from a football organization. Management Accounting Research 32: 45–61. [Google Scholar] [CrossRef]
  26. Carnes, Christina Matz, Francesco Chirico, Michael A. Hitt, Dong Wook Huh, and Vincenzo Pisano. 2017. Resource orchestration for innovation: Structuring and bundling resources in growth-and maturity-stage firms. Long Range Planning 50: 472–86. [Google Scholar] [CrossRef] [Green Version]
  27. Chapman, Christopher S. 1997. Reflections on a contingent view of accounting. Accounting, Organizations and Society 22: 189–205. [Google Scholar] [CrossRef]
  28. Chapman, Ross L., Peter Charles Murray, and Robert Mellor. 1997. Strategic quality management and financial performance indicators. International Journal of Quality & Reliability Management 14: 432–48. [Google Scholar]
  29. Chenhall, Robert H. 2003. Management control systems design within its organizational context: Findings from contingency-based research and directions for the future. Accounting, Organizations and Society 28: 127–68. [Google Scholar] [CrossRef]
  30. Chenhall, Robert H., and Frank Moers. 2015. The role of innovation in the evolution of management accounting and its integration into management control. Accounting, Organizations and Society 47: 1–13. [Google Scholar] [CrossRef]
  31. Chenhall, Robert H., Matthew Hall, and David Smith. 2013. Performance measurement, modes of evaluation and the development of compromising accounts. Accounting, Organizations and Society 38: 268–87. [Google Scholar] [CrossRef] [Green Version]
  32. Davila, Antonio, and Daniel Oyon. 2009. Introduction to the special section on accounting, innovation and entrepreneurship. European Accounting Review 18: 277–80. [Google Scholar] [CrossRef]
  33. Davila, Antonio, George Foster, and Daniel Oyon. 2009. Accounting and control, entrepreneurship and innovation: Venturing into new research opportunities. European Accounting Review 18: 281–311. [Google Scholar] [CrossRef]
  34. Davila, Tony. 2000. An empirical study on the drivers of management control systems‘ design in new product development. Accounting, Organizations and Society 25: 383–409. [Google Scholar] [CrossRef]
  35. Davila, Tony, Marc Epstein, and Robert Shelton. 2012. Making Innovation Work: How to Manage It, Measure It, and Profit from It. Upper Saddle River: FT Press. [Google Scholar]
  36. De Massis, Alfredo, and Josip Kotlar. 2014. The case study method in family business research: Guidelines for qualitative scholarship. Journal of Family Business Strategy 5: 15–29. [Google Scholar] [CrossRef]
  37. Edison, Henry, Nauman Bin Ali, and Richard Torkar. 2013. Towards innovation measurement in the software industry. Journal of Systems and Software 86: 1390–407. [Google Scholar] [CrossRef] [Green Version]
  38. Eisenhardt, Kathleen M. 1989. Building theories from case study research. Academy of Management Review 14: 532–50. [Google Scholar] [CrossRef]
  39. Eisenhardt, Kathleen M., and Melissa E. Graebner. 2007. Theory building from cases: Opportunities and challenges. Academy of Management Journal 50: 25–32. [Google Scholar] [CrossRef]
  40. Fagerlin, Wen Pan, and Eva Lövstål. 2020. Top managers’ formal and informal control practices in product innovation processes. Qualitative Research in Accounting & Management 17: 497–524. [Google Scholar]
  41. Feeney, Orla, and Bernard Pierce. 2018. Accounting and new product development. Qualitative Research in Accounting & Management 15: 251–79. [Google Scholar] [CrossRef]
  42. Franco-Santos, Monica, Lorenzo Lucianetti, and Mike Bourne. 2012. Contemporary performance measurement systems: A review of their consequences and a framework for research. Management Accounting Research 23: 79–119. [Google Scholar] [CrossRef] [Green Version]
  43. Franco-Santos, Monica, Mike Kennerley, Pietro Micheli, Veronica Martinez, Steve Mason, Bernard Marr, Dina Gray, and Andrew Neely. 2007. Towards a definition of a business performance measurement system. International Journal of Operations & Production Management 27: 784–801. [Google Scholar]
  44. Fried, Andrea. 2017. Terminological distinctions of ‘control’: A review of the implications for management control research in the context of innovation. Journal of Management Control 28: 5–40. [Google Scholar] [CrossRef] [Green Version]
  45. Fried, Andrea, Uwe Götze, Klaus Möller, and Paulo Pecas. 2017. Innovation and management control. Journal of Management Control 28: 1–4. [Google Scholar] [CrossRef]
  46. Gerdin, Jonas, and Jan Greve. 2004. Forms of contingency fit in management accounting research—A critical review. Accounting, Organizations and Society 29: 303–26. [Google Scholar] [CrossRef]
  47. Goffin, Keith, Aleksei Beznosov, and Matthias Seiler. 2021. Countering Commoditization Through Innovation Challenges for European B2B Companies: B2B companies can use a Commoditization-Innovativeness Matrix to identify actions to counteract the pervasive threat of commoditization that exists in many B2B markets. Research-Technology Management 64: 20–28. [Google Scholar] [CrossRef]
  48. Goshu, Yitagesu Yilma, and Daniel Kitaw. 2017. Performance measurement and its recent challenge: A literature review. International Journal of Business Performance Management 18: 381–402. [Google Scholar] [CrossRef]
  49. Grabner, Isabella, and Frank Moers. 2013. Management control as a system or a package? Conceptual and empirical issues. Accounting, Organizations and Society 38: 407–19. [Google Scholar] [CrossRef]
  50. Graebner, Melissa E., Jeffrey A. Martin, and Philip T. Roundy. 2012. Qualitative data: Cooking without a recipe. Strategic Organization 10: 276–84. [Google Scholar] [CrossRef]
  51. Granlund, Markus, and Kari Lukka. 2017. Investigating highly established research paradigms: Reviving contextuality in contingency theory based management accounting research. Critical Perspectives on Accounting 45: 63–80. [Google Scholar] [CrossRef]
  52. Haefliger, Stefan, and Marion Poetz. 2016. Leadership in Open and Distributed Innovation. Paper presented at the The Academy of Management Annual Meeting, Briarcliff Manor, NY, USA, August 8. [Google Scholar]
  53. Haldma, Toomas, Salme Nasi, and Giuseppe Grossi. 2012. Background and scope of the special issue on innovations in accounting, performance measurement and management: International trends. Baltic Journal of Management 7: 352–54. [Google Scholar] [CrossRef]
  54. Hopwood, Anthony G. 1972. An empirical study of the role of accounting data in performance evaluation. Journal of Accounting Research 10: 156–82. [Google Scholar] [CrossRef]
  55. Hopwood, Anthony G. 1974. Leadership climate and the use of accounting data in performance evaluation. The Accounting Review 49: 485–95. [Google Scholar]
  56. Hopwood, Anthony G. 1983. On trying to study accounting in the contexts in which it operates. Accounting, Organizations and Society 8: 287–305. [Google Scholar] [CrossRef]
  57. Ittner, Christopher D., and David F. Larcker. 2003. Coming up short on nonfinancial performance measurement. Harvard Business Review 81: 88–95. [Google Scholar]
  58. Jordan, Silvia, and Martin Messner. 2012. Enabling control and the problem of incomplete performance indicators. Accounting, Organizations and Society 37: 544–64. [Google Scholar] [CrossRef]
  59. Jørgensen, Brian, and Martin Messner. 2009. Management control in new product development: The dynamics of managing flexibility and efficiency. Journal of Management Accounting Research 21: 99–124. [Google Scholar] [CrossRef]
  60. Jørgensen, Brian, and Martin Messner. 2010. Accounting and strategising: A case study from new product development. Accounting, Organizations and Society 35: 184–204. [Google Scholar] [CrossRef]
  61. Kashani, Kamran. 2006. Fighting commoditization strategies for creating novel customer values. Perspectives for Managers 137: 1. [Google Scholar]
  62. Kennerley, Mike, and Andy Neely. 2002. A framework of the factors affecting the evolution of performance measurement systems. International Journal of Operations & Production Management 22: 1222–45. [Google Scholar]
  63. Keupp, Marcus Matthias, Maximilian Palmié, and Oliver Gassmann. 2012. The strategic management of innovation: A systematic review and paths for future research. International Journal of Management Reviews 14: 367–90. [Google Scholar] [CrossRef]
  64. Kirsner, Scott. 2015. What Big companies get wrong about innovation metrics. Harvard Business Review 6. Available online: https://hbr.org/2015/05/what-big-companies-get-wrong-about-innovation-metrics (accessed on 17 August 2021).
  65. Kowalkowski, Christian, Charlotta Windahl, Daniel Kindström, and Heiko Gebauer. 2015. What service transition? Rethinking established assumptions about manufacturers’ service-led growth strategies. Industrial Marketing Management 45: 59–69. [Google Scholar] [CrossRef] [Green Version]
  66. Kowalkowski, Christian, Heiko Gebauer, and Rogelio Oliva. 2017. Service growth in product firms: Past, present, and future. Industrial Marketing Management 60: 82–88. [Google Scholar] [CrossRef]
  67. Lager, Thomas. 2016. Managing innovation & technology in the process industries: Current practices and future perspectives. Procedia Engineering 138: 459–71. [Google Scholar]
  68. Laine, Teemu, Tuomas Korhonen, and Miia Martinsuo. 2016. Managing program impacts in new product development: An exploratory case study on overcoming uncertainties. International Journal of Project Management 34: 717–33. [Google Scholar] [CrossRef]
  69. Latour, Bruno. 1983. Give me a laboratory and I will raise the world. In Science Observed: Perspectives on the Social Study of Science. New Delhi: SAGE Publications, pp. 141–70. [Google Scholar]
  70. Latour, Bruno. 1984. The powers of association. The Sociological Review 32: 264–80. [Google Scholar] [CrossRef]
  71. Latour, Bruno. 1987. Science in Action: How to Follow Scientists and Engineers through Society. Cambridge: Harvard University Press. [Google Scholar]
  72. Latour, Bruno. 1996. On actor-network theory: A few clarifications. Soziale Welt 47: 369–81. [Google Scholar]
  73. Lohman, Clemens, Leonard Fortuin, and Marc Wouters. 2004. Designing a performance measurement system: A case study. European Journal of Operational Research 156: 267–86. [Google Scholar] [CrossRef]
  74. Lövstål, Eva, and Anne-Marie Jontoft. 2017. Tensions at the intersection of management control and innovation: A literature review. Journal of Management Control 28: 41–79. [Google Scholar] [CrossRef] [Green Version]
  75. Major, Maria, Petri Suomala, and Teemu Laine. 2018. Introduction to the special issue on accounting and innovation. Qualitative Research in Accounting and Management 15: 154–60. [Google Scholar] [CrossRef] [Green Version]
  76. Matthyssens, Paul, and Koen Vandenbempt. 2008. Moving from basic offerings to value-added solutions: Strategies, barriers and alignment. Industrial Marketing Management 37: 316–28. [Google Scholar] [CrossRef]
  77. McKinnon, Jill. 1988. Reliability and validity in field research: Some strategies and tactics. Accounting, Auditing & Accountability Journal 1: 34–54. [Google Scholar]
  78. Melnyk, Steven A., Roger J. Calantone, Joan Luft, Douglas M. Stewart, George A. Zsidisin, John Hanson, and Laird Burns. 2005. An empirical investigation of the metrics alignment process. International Journal of Productivity and Performance Management 54: 312–24. [Google Scholar] [CrossRef]
  79. Melnyk, Steven A., Umit Bititci, Ken Platts, Jutta Tobias, and Bjørn Andersen. 2014. Is performance measurement and management fit for the future? Management Accounting Research 25: 173–86. [Google Scholar] [CrossRef]
  80. Messner, Martin. 2016. Does industry matter? How industry context shapes management accounting practice. Management Accounting Research 31: 103–11. [Google Scholar] [CrossRef]
  81. Michaud, Valérie. 2014. Mediating the paradoxes of organizational governance through numbers. Organization Studies 35: 75–101. [Google Scholar] [CrossRef]
  82. Micheli, Pietro, and Luca Mari. 2014. The theory and practice of performance measurement. Management Accounting Research 25: 147–56. [Google Scholar] [CrossRef]
  83. Moll, Jodie. 2015. Special issue on innovation and product development. Management Accounting Research 28: 2–11. [Google Scholar] [CrossRef]
  84. Mouritsen, Jan, Allan Hansen, and Carsten Ørts Hansen. 2009. Short and long translations: Management accounting calculations and innovation management. Accounting, Organizations and Society 34: 738–54. [Google Scholar] [CrossRef]
  85. Mouritsen, Jan, and Kristian Kreiner. 2016. Accounting, decisions and promises. Accounting, Organizations and Society 49: 21–31. [Google Scholar] [CrossRef] [Green Version]
  86. Mouritsen, Jan, Heine T. Larsen, and Per N. D. Bukh. 2001. Intellectual capital and the ‘capable firm’: Narrating, visualising and numbering for managing knowledge. Accounting, Organizations and Society 26: 735–62. [Google Scholar] [CrossRef]
  87. Neely, Andy. 2005. The evolution of performance measurement research: Developments in the last decade and a research agenda for the next. International Journal of Operations & Production Management 25: 1264–77. [Google Scholar]
  88. Nudurupati, Sai S., Sofiane Tebboune, and Julie Hardman. 2016. Contemporary performance measurement and management (PMM) in digital economies. Production Planning & Control 27: 226–35. [Google Scholar]
  89. Okwir, Simon, Sai S. Nudurupati, Matías Ginieis, and Jannis Angelis. 2018. Performance measurement and management systems: A perspective from complexity theory. International Journal of Management Reviews 20: 731–54. [Google Scholar] [CrossRef] [Green Version]
  90. Otley, David. 2016. The contingency theory of management accounting and control: 1980–2014. Management Accounting Research 31: 45–62. [Google Scholar] [CrossRef] [Green Version]
  91. Reimann, Martin, Oliver Schilke, and Jacquelyn S. Thomas. 2010. Toward an understanding of industry commoditization: Its nature and role in evolving marketing competition. International Journal of Research in Marketing 27: 188–97. [Google Scholar] [CrossRef]
  92. Richtnér, Anders, Anna Brattström, Johan Frishammar, Jennie Björk, and Mats Magnusson. 2017. Creating Better Innovation Measurement Practices. MIT Sloan Management Review 59: 45. [Google Scholar]
  93. Robson, Keith, and Chiara Bottausci. 2018. The sociology of translation and accounting inscriptions: Reflections on Latour and Accounting Research. Critical Perspectives on Accounting 54: 60–75. [Google Scholar] [CrossRef]
  94. Rockness, Howard O., and Michael D. Shields. 1984. Organizational control systems in research and development. Accounting, Organizations and Society 9: 165–77. [Google Scholar] [CrossRef]
  95. Rowe, Casey, Michael D. Shields, and Jacob G. Birnberg. 2012. Hardening soft accounting information: Games for planning organizational change. Accounting, Organizations and Society 37: 260–79. [Google Scholar] [CrossRef]
  96. Schreier, Margrit. 2012. Qualitative Content Analysis in Practice. Edited by Uwe Flick. New York: Sage Publications. [Google Scholar]
  97. Stebbins, Robert A. 2001. Exploratory Research in the Social Sciences. 48 vols. New York: Sage Publications. [Google Scholar]
  98. Vendrell-Herrero, Ferran, Oscar F. Bustinza, Glenn Parry, and Nikos Georgantzis. 2017. Servitization, digitization and supply chain interdependency. Industrial Marketing Management 60: 69–81. [Google Scholar] [CrossRef] [Green Version]
  99. Wouters, Marc, and Celeste Wilderom. 2008. Developing performance-measurement systems as enabling formalization: A longitudinal field study of a logistics department. Accounting, Organizations and Society 33: 488–516. [Google Scholar] [CrossRef]
  100. Yin, Robert K. 2013. Case Study Research: Design and Methods. 5 vols. New York: Sage Publications. [Google Scholar]
  101. Zarzycka, Ewelina, Justyna Dobroszek, Lauri Lepistö, and Sinikka Moilanen. 2019. Coexistence of innovation and standardization: Evidence from the lean environment of business process outsourcing. Journal of Management Control 30: 251–86. [Google Scholar] [CrossRef] [Green Version]
Table 1. Details of the case study.
| Interviewee Position | Level in Hierarchy | No. of Persons Interviewed | Length of Interviews in Total (h) |
| Top management | | | |
| CEO | Executive | 1 | 1 |
| Vice-president | Executive | 1 | 1 |
| Director of Innovation | Executive | 2 | 3 |
| Director of Business | Executive | 3 | 5 |
| Director of Controlling | Executive | 1 | 1 |
| Global Director of Engineering | Executive | 1 | 2 |
| Middle management | | | |
| Senior innovation staff | Middle | 4 | 6 |
| Project leader | Middle | 5 | 6 |
| Staff | | | |
| Innovation staff | Low | 10 | 13 |
| Total | | 28 | 38 |
