1. Introduction
The 4th Industrial Revolution [1] deserves to be called a “paradigm shift” as far as the pattern of innovation is concerned. The same was true of the preceding industrial revolutions [2,3]. We can formulate these changes as changes in the mode of technological learning. As is well known, the existence of “learning-by-doing” was first emphasized by Arrow [4]. It concerns learning at the manufacturing phase, after a product design has been specified. In industries characterized by a high degree of systemic complexity, such as the computer industry, however, learning-by-doing becomes a very subtle process. In this context, Rosenberg [5] proposed a clear distinction between “learning-by-doing” and “learning-by-using”.
At the end of the 15 years of United States–Japan dialogue (1984–2000) organized by the U.S. National Academies of Sciences and Engineering and the Japan Society for the Promotion of Science [6], it was concluded that Japan’s pursuit of automation had been less important in manufacturing than the use of information technology (IT) by U.S. firms to get a better handle on their processes. IBM was worried in the 1980s about Japanese firms as major competitors, but Microsoft and Intel have since surpassed IBM. Japan could never make a Microsoft. (Interview by Martha Harris with Harold Brown, former Secretary of Defense in the Carter Administration, on 10 February 1999.) As United States computing power grew 100-fold in the 1970s and 1980s, the Solow computer paradox emerged in 1987 [7]: “You can see the computer age everywhere but in the productivity statistics”, and it was widely discussed. In 2012, however, the total market capitalization of the four major oil companies (Exxon, Shell, BP, and Chevron) was surpassed by that of the four major data giants (Alphabet, Apple, Facebook, and Amazon) (Nikkei, 3 April 2018). This was because of the transformation of computer technology: the computer, originally conceived as an isolated calculating device, was reborn as a means of communication [8]. IT has obviously brought about new modes of technological learning for both IT giants and module suppliers. Moreover, their strategies also differed from the conventional wisdom.
It is widely said that the 4th Industrial Revolution is characterized by the fusion of technologies that blurs the lines between the physical, digital, and biological spheres [9]. This fusion concept had already been discussed in connection with the 2nd and 3rd Industrial Revolutions [10]. Therefore, the new elements of the fusion concept with regard to the 4th Industrial Revolution can be discussed in terms of a new learning mode and a new strategic concept [11]. The Japanese government proposed the phrase “society 5.0” in order to combine societal problem-solving with the economic progress promoted by industry 4.0 [12]. This scheme is composed of sensors, actuators, and multilayered feedback loops, including edge computing.
Therefore, in order to discuss the new mode of technological learning for society 5.0, case studies on sensors, actuators, and edge computing were conducted. We find a preemptive mode of learning in both sensors and actuators, together with the nonconsumption strategy of [13]. As a prototypical case of an edge solution, a case study on the Capabus (capacitor-driven trolley bus) in Shanghai was conducted. In this case, we found that a strategic fit [14] between the business model and the technology is a critical factor for success. Since the capacitor technology was ported [15] into a new business model of bus operation, we formulate learning-by-porting as the dominant mode for edge solutions. Since an overall society 5.0 system is a collection of multilayered edge computing feedback loops, we conclude that the development of society 5.0 will follow learning-by-(multilayered, multiple)-porting [16].
2. Technological Learning in IT Innovations
The United States–Japan dialogues (The author of this paper was one of the core Japanese members of this dialogue.) from 1984 to 2000, sponsored by the US National Academies of Sciences and Engineering (NAS/E) and the Japan Society for the Promotion of Science (JSPS), were part of a larger context of science and technology relationships that took on urgency and complexity when Japan emerged as a technological power in the 1970s. Viewed from a US perspective, the central factor stimulating interest in new initiatives in the late 1970s and early 1980s was concern about US technology leadership—both in terms of the threat and the opportunity that Japan presented (Martha Harris (2000): “Mutual learning in US-Japan science and technology exchanges”, in the review of the activities of the 149 committee (1984–2000), edited by the Japan Society for the Promotion of Science, March 2000, pp. 140–218.).
Unlike government-to-government meetings, bilateral meetings of the type convened by the NAS/E and the JSPS were not negotiations designed to lead to formal agreements. Nevertheless, these meetings and related activities provided unique opportunities for leaders from the private sectors of the two countries to exchange views on pressing issues. The NAS/E/JSPS dialogues were, however, one stream of activities among many related to United States–Japan relations. What distinguished them was the participation of leaders from the S/T community (including business people) in informal discussions focusing on the science and technology dimensions of the relationship, with particular attention to the economic implications. At the very end of this dialogue, its US chairman, Harold Brown (former Secretary of Defense in the Carter Administration), summarized it as follows (Interview by Martha Harris with Harold Brown on 10 February 1999.):
In manufacturing, Japan’s pursuit of automation has been less important than the use of information technology by US firms to get a better handle on the processes. US firms took Japanese practices like just-in-time systems and added information technology. This greatly strengthened US industry, which has been comparatively strong in services but also in other areas as well. We learned, but the Japanese failed to adjust.
Although the Japanese did well in computer hardware, US firms did much better in software where the freewheeling use of younger people made a big difference. IBM was worried in the 1980s about the Japanese firms as major competitors, but Microsoft and Intel brought IBM low. Japan could never make a Microsoft.
This assertion, I would argue, contains substantial implications when we put it in a broader context: (1) why the Japanese did well in computer hardware; (2) how Microsoft and Intel brought IBM low; and (3) why the freewheeling use of younger people could make a big difference.
In order to answer these questions, we have to review the arguments made about the unique nature of the computer revolution. While the computing power of the United States increased 100-fold in the 1970s and 1980s, labor productivity growth slowed from over 3% in the 1960s to roughly 1% in the 1980s. This phenomenon was conceptualized as the Solow computer paradox, in reference to his 1987 witticism: “You can see the computer age everywhere but in the productivity statistics” [7]. The paradox has been defined as a perceived discrepancy between measures of investment in information technology and measures of output at the national level. In the same article, Solow argued:
[That] would depend not just on the possibilities the technologies represent, but rather on how effectively they are used. …. What everyone feels to have been a technological revolution, a drastic change in our productive lives, has been accompanied everywhere, including Japan, by a slowing-down of productivity growth, not by a step up.
In order to clarify this confusion, we have to bring these arguments into a more general perspective on technological learning. The innovation process can best be formulated as a learning process by society. As is well known, the existence of “learning-by-doing” was first emphasized by Arrow [4] in his article “The Economic Implications of Learning by Doing”. It concerns learning at the production phase, after a product design has been specified. Learning at this phase comes from developing increasing skill in manufacturing, which reduces the real labor cost per unit of production. In industries characterized by a high degree of systemic complexity, such as the computer industry, however, learning-by-doing becomes a very subtle process. In this context, Rosenberg [5] proposed a clear distinction between “learning-by-doing” and “learning-by-using”.
He argues that one of the basic purposes of the learning-by-using process is to determine the optimal performance characteristics of a durable capital good, as they affect the length of its useful life. He also suggested that the creative use of learning-by-using as a business strategy might be an important factor in the computer industry, which relies on complex software products to make its systems useful to a broader range of users. The effectiveness of software depends on the experience of its users. Indeed, computer companies usually provide software support services, through which they can make software modifications when bugs are found by users. However, it is obvious that IT businesses have gone beyond the software development business and into the networking business. Abbate [8] described the network’s best-known legacies as follows:
The introduction of packet switching and other new techniques and the establishment of a unique tradition of decentralized, user-directed development. Electronic mail and the World Wide Web are prominent examples of informally created applications that became popular, not as the result of some central agency’s marketing plan, but through the spontaneous decisions of thousands of independent users.
Only after the process of learning-by-using had been sufficiently completed, therefore, did we arrive at the age of information technology (IT). Thereafter, Microsoft and Intel brought IBM low, and Japan could never make a Microsoft. If we try to formulate IT innovations as a learning process, therefore, we find that neither learning-by-doing nor learning-by-using is appropriate: the former applies only to the manufacturing industry and the latter only to the software development business. And it is also true that the freewheeling use of younger people in the United States made a big difference.
In this context, therefore, we need a much more systemic explanation of this phenomenon. Apple, for example, selectively procured functional parts from all over the world and integrated them into its unique systems. Without comprehending the characteristics of each individual component and the interactive relationships among them, it would not have been possible for Apple to succeed in its system development. For the Macintosh, Apple utilized Sony’s Trinitron TV screen and Canon’s laser printer. For the iPad, it used multitouch technology, with Jobs saying that the best pointer is your finger [17]. Therefore, I will introduce “learning-by-integration” as the learning mode for IT giants. The evidence for this argument is vividly demonstrated by Figure 1, which depicts the cumulative number of companies acquired since 2000 by the big four IT giants (Apple, Alphabet (Google), Amazon, and Facebook). As can be seen, the number has increased drastically since 2010. The integration of functional parts procured from all over the world was indeed implemented in the form of acquisitions of various companies throughout the world.
It is to be noted that the IT giants emerged only after platform leadership had been accomplished in critical IT components such as Intel’s MPU. Therefore, we are interested in formulating the learning process that occurred around Intel’s MPUs, especially up to 1981, when the 8086 family was adopted in the IBM PC. We will discover that Intel successfully accommodated different customers’ specific demands one after another. Therefore, we can call this learning “learning-by-accommodation”. In order to validate these arguments, we have to answer the question of who first discovered uses for the MPU, i.e., by whom and how the demand for Intel’s MPUs was articulated [18]. Therefore, the chronology of MPU development in the first ten years after Intel was established was compiled, as shown in Table 1.
As shown in the table, several Japanese companies, including Fanuc, were involved in this kind of demand articulation for MPUs during Intel’s first ten embryonic years. A Japanese calculator company, Busicom (named after “business computer”), had the idea of designing a calculator on the basis of general-purpose large-scale integration (LSI) and placed an order with Intel in 1970. However, Busicom could not hold on in the severe market competition after this order was made. The manner in which Intel developed the MPU 4004 by accommodating this order from Busicom was well documented by Shima [20], who developed the idea at Busicom and was later recruited by Intel Corporation. Becoming aware of its future potential, meanwhile, Intel purchased the rights for outside sales from Busicom in 1971.
The company that brought the MPU 4004 into its first mass-produced application was Toshiba Tec Corporation, which succeeded in applying it to its cash register. Tec introduced the MPU 4004 into its cash register earlier than NCR Corporation of the United States did [21]. In 1973, Tec sold nine thousand units of its register to German gasoline stations. In Japan, unlike the office environment, Tec products were used in demanding environments such as fish markets. Since the conditions in which Tec’s registers were used placed a heavy load on the machines, Tec had to solve many malfunctions in the early stage of development. At that time, Intel was confronted with a difficult financial situation; orders on the scale of several thousand units per week therefore also helped Intel solve this financial problem. In developing the MPU 8008, Seiko Co. of Japan collaborated with Intel in order to introduce its “programmable calculator”, the S-500 model. Indeed, it was the first LSI desktop computer in the world [19].
Shibata [22] described in detail how Fanuc collaborated with Intel. In 1978, Intel developed the 8086. In 1979, the Fanuc System 6 used the Intel 8086. It was a two-way and reciprocal collaboration between Intel and Fanuc. The success of adopting the 8086 into the System 6 series made Fanuc competitive in the NC machine tool industry. Beginning in 1979, one year after the introduction of the 8086 MPU, Fanuc became its first high-volume user in the world, for its System 6 series NC. It is surprising that a machine-tool supplier utilized MPU technology for product development earlier than the PC (personal computer) industry did.
In 1979, Intel developed the 8088, a version of the 8086 with an 8-bit external bus, which was installed in the IBM PC in 1981. Since then, Intel shifted its resources toward the MPU business and away from DRAM. In that sense, the 8086 turned out to be the beginning of Intel’s platform leadership in the PC industry.
What characterizes the way in which Intel became a dominant supplier in the PC industry? How did Intel learn to accommodate, one after another, the different demands of its early customers? We arrive at the concept of “learning-by-accommodation”: neither learning-by-doing nor learning-by-using. Indeed, we arrived at the Internet society by having IT giants like Google and Amazon. However, we notice that their dominance came only after the emergence of dominant module suppliers such as Microsoft and Intel, who gained their positions through learning-by-accommodation. This is an important lesson when giving thought to the coming fourth industrial revolution.
3. A Journey toward a New Fusion
The German National Academy of Science and Engineering [1] published a report about what is now called “Industry 4.0”. It says that Germany is uniquely positioned to tap into the potential of a new type of industrialization—Industrie 4.0—since Germany is a global leader in the manufacturing equipment sector. The first three industrial revolutions came about as a result of mechanization, electricity, and IT. Now, the introduction of the Internet of Things and Services into the manufacturing environment is leading to a fourth industrial revolution. In the future, businesses will establish global networks that incorporate their machinery, warehousing systems, and production facilities in the shape of Cyber-Physical Systems (CPS).
In the manufacturing environment, these Cyber-Physical Systems comprise smart machines, storage systems and production facilities capable of autonomously exchanging information, triggering actions and controlling each other independently. This facilitates fundamental improvements to the industrial processes involved in manufacturing, engineering, material usage and supply chain and life cycle management. … If this is done successfully, Industrie 4.0 will allow Germany to increase its global competitiveness and preserve its domestic manufacturing industry.
In this context, Schwab [9] asserted that the 4th Industrial Revolution is characterized by a fusion of technologies that is blurring the lines between the physical, digital, and biological spheres. The relentless shift from simple digitization (the Third Industrial Revolution) to innovation based on combinations of technologies (the Fourth Industrial Revolution) is forcing companies to reexamine the way they do business. (The First Industrial Revolution used water and steam power to mechanize production. The Second used electric power to create mass production. The Third used electronics and information technology to automate production. Now a Fourth Industrial Revolution is building on the Third, the digital revolution that has been occurring since the middle of the last century.) In fact, the author of this paper is cautious about the use of the fusion concept, since he made several misjudgments in predicting the future on the basis of it. In his previous works [2,3,23,24]:
There are two possible definitions. A company can invest in R&D that replaces an older generation of technology—the “breakthrough” approach, or it can focus on combining existing technologies into hybrid technologies—the “technology fusion” approach.
It was assumed that “reciprocity” of R&D investment between two industrial sectors is the essence of technology fusion, since all participants must enter as equals (mutual respect), each must assume responsibility (mutual responsibility), and all must share in the success (mutual benefit). The formation process of mechatronics can be traced accordingly: reciprocity of substantial R&D investment between ordinary machinery and electrical machinery began in 1971; around 1973, communications and electronics joined this grouping. In order to perfect mechatronics, however, precision instruments had to be included. Thus, it was not until 1975 that this quadruple connection began to emerge, as depicted graphically in Figure 2.
Concerning the emergence of biotechnology, it was predicted that fusion could be extended beyond the physical sciences and chemistry. In 1974, a triangular reciprocity emerged between the food, drug and medicine, and industrial chemical industries. This connection can be interpreted as the emergence of biotechnology. And it was predicted that Japan’s familiarity with fermentation in the food industry might make Japan a frontrunner in a much shorter time than one would expect [25]. In actuality, however, the field was later led by the “rational drug design” phenomenon, which promises to streamline and enhance drug development and reshape the biotechnology and pharmaceutical industries. Business Week [26] reported:
The traditional way of developing pharmaceuticals was to screen thousands of chemicals in an inefficient, time-wasting hit-or-miss search. Drug designers today use biotechnology to help them work backward from what biologists know about a disease and how the body fights it. Dozens of companies zeroing in on drugs to fight nervous system disorders have used biotechnology to unravel brain function, and they will probably use chemical synthesis to create their drugs.
Indeed, we came to know that the biotechnology revolution is science-driven rather than engineering-driven. This implies that Japanese traditional fermentation technologies were not as effective in biotechnology development as had been expected.
As to the emergence of IT, it was stated that technology fusion had been limited to manufacturing industries but would go beyond manufacturing in the future. A product vision in the Japanese electronics industry involved a fusion of audio and video hardware and software with the entertainment industry’s creativity and artistry [10]. Sony acquired Columbia Pictures Entertainment in 1989, and Matsushita purchased MCA Inc. in 1990. However, Steve Jobs [17] recalled:
What’s really interesting is if you look at the reason that the iPod exists and that Apple’s in that marketplace, it’s because these really great Japanese consumer electronics companies who kind of own the portable music market, invented it and owned it, couldn’t do the appropriate software. It is because an iPod is really just software.
He was also quoted as saying: We do not think that televisions and personal computers are going to merge. You watch television to turn your brain off, and you work on your computer when you want to turn your brain on. The fusion among the three spheres—the physical, digital, and biological—will be more difficult than expected. Therefore, we have to be prudent about going directly to the 4th Revolution, and a one-by-one, bottom-up approach is necessary. Indeed, the fusion between manufacturing and services was realized not by a simple extension of manufacturing, but by the freewheeling use of young people. In fact, the German Academy, which introduced the idea of “Industry 4.0”, also reminded us:
The journey towards Industrie 4.0 will be an evolutionary process. Current basic technologies and experience will have to be adapted to the specific requirements of manufacturing engineering and innovative solutions for new locations and new markets will have to be explored.
In this context, we can learn from the experience of the transformation from the computer paradox to the realization of the IT revolution. Abbate [8] summarized the transformation: computing technology underwent a dramatic change, and the computer, originally conceived as an isolated calculating device, was reborn as a means of communication. When computers were scarce, expensive, and cumbersome, using a computer for communication was almost unthinkable. When it came to the realization of IT businesses, it took many more years to invent business models that used the computer as a means of communication. Indeed, it was only after 2012 that the total market capitalization of the four oil majors (Exxon, Shell, BP, and Chevron) was surpassed by that of the four data majors (Alphabet, Apple, Facebook, and Amazon), as shown in Figure 3. The experience of the IT revolution indicates that we have a long way to go until the 4th industrial revolution becomes a reality.
4. Pre-Emptive Learning by Module Suppliers of the Society 5.0 Program
The Council for Science, Technology and Innovation (CSTI) of the Japanese Government proposed the idea of “Society 5.0”. It aims at societal problem-solving together with the economic progress originally proposed by Germany’s “Industry 4.0”, as shown in Figure 4. The figure shows that the structure is composed of sensors, actuators, the Internet, and feedback loops, together with artificial intelligence (AI) and big data [12]. It also demonstrates that when the system is augmented with sensors and actuators, the technology becomes a cyber-physical system, which also encompasses technologies such as smart grids, intelligent transportation, and smart cities.
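The multilayered feedback structure of such a scheme can be illustrated with a minimal sketch, in which local “edge” loops react to raw sensor readings immediately and forward only compact summaries to an overall cloud-level loop. All function and field names below are hypothetical, invented for this illustration; this is a sketch of the architecture under stated assumptions, not an implementation of any Society 5.0 component:

```python
# Sketch of a multilayered feedback structure: local edge loops act on
# raw sensor readings, while the overall (cloud) loop sees only
# aggregated summaries. All names are illustrative.

from statistics import mean

def edge_loop(readings, threshold):
    """Local loop: react to each raw reading at the edge (low latency),
    returning per-reading actuator commands plus a compact summary."""
    commands = ["cool" if r > threshold else "idle" for r in readings]
    summary = {"mean": mean(readings), "max": max(readings)}
    return commands, summary

def cloud_loop(summaries, target):
    """Overall loop: retune every edge unit's threshold from the
    aggregated summaries only, never from the raw data."""
    overall_mean = mean(s["mean"] for s in summaries)
    # Nudge the threshold when the system-wide mean drifts from target.
    return target + (target - overall_mean) * 0.1

# Two edge units process their own sensor streams locally.
cmds1, s1 = edge_loop([21.0, 25.0, 23.0], threshold=22.0)
cmds2, s2 = edge_loop([19.0, 20.0, 24.0], threshold=22.0)
# The cloud layer closes the outer loop using summaries alone.
new_threshold = cloud_loop([s1, s2], target=22.0)
print(cmds1, round(new_threshold, 2))
```

The point of the sketch is the division of labor between the local and overall feedback loops: the outer loop never touches raw readings, which is what keeps the edge layer responsive.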
As described before, the experience of the IT revolution made it clear that the learning consists of learning-by-accommodation by key module suppliers and learning-by-integration by IT giants. In order for society 5.0 to be realized, we should analyze who will be the dominant module suppliers in the areas of sensors and actuators. We are also interested in how the fusion of physical and cyber spaces will be conducted.
In 2015, Volkswagen AG admitted to years of cheating on emissions tests. The VW story of using the defeat device is well documented by Ewing [27]. It is less known, however, that Horiba Co., a Japanese manufacturer of emission gas testing systems, played a key role in this scandal (Bloomberg, 2 October 2015). Horiba was founded in 1945 by Masao Horiba, with the goal of continuing the nuclear physics research that had been disrupted by World War II. During Japan’s postwar economic recovery in the 1960s, Horiba diversified and developed its first emission analyzer. Its entry into the market was a response to growing unease about air pollution. In 1998, Horiba started developing its first onboard emission analyzer, i.e., an analyzer of automotive emission gas while the vehicle is being driven.
Then, how did Horiba come to detect Volkswagen’s fraud? In fact, U.S. researchers used Horiba’s onboard measuring systems in a multiyear testing project that ended up catching Volkswagen’s cheating about its engines. Horiba’s equipment helped inform the researchers of a scheme in which Volkswagen group cars around the world polluted more on the road than in stationary tests, above the limit that U.S. law allows. How can we best describe Horiba’s accomplishment? Horiba accommodated the unfulfilled needs for environmental protection; onboard emission analysis had been neglected by the U.S. and European providers. And Horiba targeted the new markets of “nonconsumption (Christensen argues: A new-market disruption is an innovation that enables a larger population of people who previously lacked the money or skill now to begin buying and using a product and doing the job for themselves. From this point onward, we will use the terms nonconsumers and nonconsumption to refer to this type of situation, where the job needs to get done but a good solution historically has been beyond reach. We sometimes say that innovators who target these new markets are competing against nonconsumption.)”, an expression proposed by Christensen and Raynor [13]. In order to accommodate the unfulfilled needs of the idea of society 5.0, indeed, module suppliers have to adopt a strategy of nonconsumption. By developing the portable and onboard emission analyzer, Horiba pre-empted the innovation needed for the realization of a clean society. Therefore, the learning mode for the module suppliers of society 5.0 will be “learning-by-preemption”.
The validity of the learning mode and the strategy concept implemented by Horiba in the sensor area—learning-by-preemption and a strategy for nonconsumption—can be tested further by examining another important component of the society 5.0 system, i.e., the actuator. Nidec Corporation is a Japanese manufacturer of electric spindle motors. Its products are used in hard-disk drives (HDDs). As of 2015, the company had 230 subsidiary companies around the world. The company obtained the 42nd position on the 2005 edition of the Business Week Infotech 100 list. Nidec was also featured on the 2014 Forbes World’s Most Innovative Companies list (Nikkei Business, 24 October 2016).
Nidec grew very rapidly, starting from spindle motors for audio devices, moving to hard-disk drives for PCs, and now expanding into automobiles (Nikkei, 5 December 2017). Nidec has managed this expansion by carefully acquiring domestic and overseas companies: a total of 49 companies in 33 years (Nikkei Business, 24 October 2016). All the acquisitions have generated synergies by combining Nidec’s competences with those owned by the acquired companies. These synergies made it possible for Nidec to enter new businesses such as car-related products, resulting in a drastic change in its portfolio from 2005 to 2015, as seen in Table 2. As the table shows, precision motors accounted for 51% of the company’s total sales in 2005, but this share fell to 38% in 2015. On the other hand, motors for commercial and industrial customers, including car and home-appliance manufacturers, accounted for only 7% in 2005 but became the largest segment of Nidec’s total sales (47.1%) in 2015.
Quite recently, Nidec established a joint venture with the French PSA (Peugeot Citroën) Group to develop motors for electric vehicles and to ship its products to PSA. Nidec holds a dominant position in providing electric motors for power steering (better fuel economy than hydraulic power steering), for dual-clutch transmissions (better fuel economy than automatic transmissions), and for electric oil pumps (which keep transmission oil pressure high while the engine stops running under the idling-stop function). However, providing automobiles with traction motors is new territory for Nidec. Therefore, Nidec will take the CEO position in this joint effort, and its automotive business is expected to grow to 40% of total sales (Nikkei, 4 December 2017).
Acquisition is an efficient way of obtaining technologies, but the method is sometimes risky because it relies on outsourcing: it gives no incentive for exploiting new business opportunities. What kind of indigenous R&D activities made such drastic growth possible in such a short time horizon? It was R&D targeted at maintaining technological independence in the midst of rapid acquisitions. One such development, concerning bearing technology for HDDs, is illustrated here (drawn from a term paper (2012) by Y. Okayama at the Business School of Kwansei University). The dominant design of bearings in HDDs had been ball bearings. Ordinary ball bearings, however, cause high noise and vibration. A fluid dynamic bearing (FDB) inserts a fluid substance (oil) to maintain the separation between the bearing races; the dynamic fluid pressure which arises during rotation is used to sustain the spindle’s rotation. Compared to ball bearings, this mechanism provides not only higher shock resistance but also a smaller amplitude of vibration. It enhances the precision of rotation and thus upgrades the memory capacity of the HDD. Since there is no physical contact, it is quieter, and further miniaturization becomes easy.
The most important feature of an HDD is, of course, its memory capacity. As the capacity becomes larger, the specifications for the motors become more demanding. The engineers at Nidec became aware that the requirements would soon exceed any technical extrapolation of ball bearings. At Nidec, therefore, while they continued to improve products based on ball-bearing technology, they established an R&D center for FDBs, which was placed far away from the R&D center for ball bearings, i.e., an ambidextrous organization for managing evolutionary and revolutionary change [28].
Thus, the successful development of FDBs made it possible for Nidec to replace the dominant design, which had held for almost 20 years since the invention of the HDD. This accomplishment gave Nidec a 78 percent share of the market for HDD motors. This pattern follows exactly what we have called “learning-by-preemption”, with a strategy aimed at “nonconsumption”, which also proved effective in sensor development.
The digitalization of car control rests technologically on three elements: a sensor, an actuator, and an electronic control unit (ECU) installed between them. The sensor detects signals about changes happening outside the system and passes them to the ECU; the actuator acts on the system according to the instructions obtained from the ECU. The critical element of car digitalization is therefore the ECU, which corresponds to the computer. As of 1995, approximately 10 ECUs were already installed in a small car, covering the automatic transmission, steering, windows, mirrors, and air conditioner. Bigger cars carried about 20 ECUs, and high-class cars more than 30 [29]. As Nidec expands its products into the area of self-driving automobile systems, its development of actuators will follow the pattern of learning-by-preemption and the strategy of nonconsumption even more than before. Otherwise, it will not be able to obtain large market segments for its actuator business.
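The sensor–ECU–actuator loop described above can be sketched in a few lines of code. This is a minimal illustration, not a real automotive ECU: the class names, the temperature example, and the bang-bang control rule are all assumptions made for clarity.

```python
# Minimal sketch of the sensor -> ECU -> actuator control loop.
# All names and the threshold logic are illustrative assumptions.

class Sensor:
    """Detects a signal from outside the system (here: cabin temperature)."""
    def __init__(self, reading):
        self.reading = reading

    def read(self):
        return self.reading


class ECU:
    """Electronic control unit: turns sensor signals into actuator commands."""
    def __init__(self, setpoint):
        self.setpoint = setpoint

    def decide(self, signal):
        # Simple bang-bang rule: switch cooling on when above the setpoint.
        return "COOL_ON" if signal > self.setpoint else "COOL_OFF"


class Actuator:
    """Acts on the system according to the ECU's instruction."""
    def __init__(self):
        self.state = "COOL_OFF"

    def apply(self, command):
        self.state = command
        return self.state


def control_step(sensor, ecu, actuator):
    """One pass of the loop: sense, decide, act."""
    return actuator.apply(ecu.decide(sensor.read()))


# Cabin at 28 degrees, setpoint 25 degrees -> cooling switches on.
print(control_step(Sensor(28.0), ECU(25.0), Actuator()))  # COOL_ON
```

Each of the 10 to 30 ECUs in a car closes such a loop for its own subsystem; the point of the sketch is only the division of roles among the three elements.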
5. Learning for the Fusion between Physical and Cyber Spaces
When it comes to the fusion between physical and cyber spaces in Society 5.0, the system will be built around a multilayered structure consisting of multiple feedback loops. The Japanese CSTI scheme includes two kinds of feedback loops: local and overall. The local loop is called "edge computing", as shown in Figure 5. We will start with the local loop, i.e., edge computing.
As seen in the figure, the bottom layer of the structure is composed of "edge computing" units. Edge computing takes data and computing power away from cloud computing to the logical extremes of a network. For generality, it is better to call this the "edge solution" rather than edge computing. Low-cost cluster hardware and freely available cluster-management software have made the edge solution affordable to developing nations; in the catching-up process, therefore, they might achieve technological leapfrogging in edge solutions. We find a prototypical case of the edge solution in the Capabus (capacitor-driven trolleybus) in Shanghai, as depicted in Figure 6. As seen in the figure, the Capabus is a traction vehicle that uses supercapacitors (also called ultracapacitors) to store electricity and runs without continuous overhead lines, using power stored in large onboard electric double-layer capacitors (EDLCs), which are quickly recharged whenever the vehicle stops at a bus stop (under so-called electric umbrellas) and fully charged at the terminals.
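The division of labor behind the edge solution can be made concrete with a small sketch: data is processed at the edge of the network, and only compact summaries travel upstream. The function names, the threshold filter, and the summary format below are illustrative assumptions, not taken from any real edge framework.

```python
# Sketch of the edge-computing idea: raw readings stay at the edge;
# the cloud receives only small aggregates. Names are illustrative.

def edge_node(raw_readings, threshold):
    """Runs locally at the network edge: filter noise, then aggregate."""
    valid = [r for r in raw_readings if r >= threshold]
    return {
        "count": len(valid),
        "mean": sum(valid) / len(valid) if valid else None,
    }


def cloud(summaries):
    """The overall loop sees only summaries, never the raw data stream."""
    return sum(s["count"] for s in summaries)


# Five raw readings stay at the edge; one small dict goes upstream.
summary = edge_node([0.2, 0.9, 0.7, 0.1, 0.8], threshold=0.5)
print(summary["count"])   # 3 readings passed the filter
print(cloud([summary]))   # 3
```

The local loop (edge) and the overall loop (cloud) of the CSTI scheme correspond to the two functions: the heavy lifting happens in `edge_node`, and `cloud` integrates across many such units.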
Our fundamental question, however, is: why do they use capacitors instead of batteries? The buses have very predictable routes and must stop regularly every three miles or less, allowing quick recharging at charging stations at the bus stops. A collector on the top of the bus rises a few feet and touches an overhead charging line at the stop; within a couple of minutes, the ultracapacitor banks stored under the bus seats are fully charged. The dynamic characteristics of supercapacitors thus fit the bus's business model very well. Supercapacitors can be compared with batteries in terms of the relationship among three variables: voltage (V), electrical charge (Q), and energy (E), as shown in Figure 7, where the relationship for supercapacitors is displayed in the upper panel (1) and that for batteries in the lower panel (2).
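The contrast between the two panels can be stated compactly with the standard idealizations (a textbook simplification, not data taken from the figure): a capacitor's voltage rises linearly with stored charge, whereas an ideal battery holds its voltage roughly constant during discharge.

```latex
% (1) Supercapacitor: Q = CV, so voltage grows linearly with charge and
E_{\mathrm{cap}} \;=\; \int_{0}^{Q} \frac{q}{C}\, dq \;=\; \frac{Q^{2}}{2C} \;=\; \tfrac{1}{2} C V^{2} \;=\; \tfrac{1}{2} Q V .

% (2) Ideal battery: V is approximately constant during discharge, so
E_{\mathrm{bat}} \;=\; \int_{0}^{Q} V \, dq \;=\; Q V .
```

At the same charge and peak voltage, a capacitor therefore stores only half the energy of a battery, and its terminal voltage falls linearly as charge drains, which is why stable operation in the low-voltage range requires an inverter (rationale (5) in the list below).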
Chesbrough [30] once came to the following conclusion: the value of an idea or a technology depends on its business model. There is no inherent value in a technology per se; the value is determined instead by the business model used to bring it to market. An inferior technology with a better business model will often defeat a better technology commercialized through an inferior business model. A mediocre technology pursued within a great business model may be more valuable than a great technology in a mediocre business model.
In his renowned article "What Is Strategy?", Porter [14] concluded with the statement that strategy is creating "fit" among a company's activities: if there is no fit among activities, there is no distinctive strategy and little sustainability. The case of the capacitor trolleybus in Shanghai tells us the importance of the "fit" between business model and technological choice when it comes to the strategic formulation of the edge solution. In this context, a mediocre technology might well be pursued within a better business model.
We thus find a new mode of learning in the system integration of the edge solution. It can be done by porting an existing technology, used for a different purpose, into a new business model that is better than the existing ones. This mode can be called "learning-by-porting": a mediocre module built into one architecture is ported into another architecture and is able to function under this different architecture. (Harvard Business School scholars Baldwin and Clark [15], in this context, used the computer as a powerful lens through which to observe and study the evolution of designs and the development of an industry. They found, strikingly, that the changes that can be imagined in a modular structure are spanned by only six relatively simple modular operators, which can generate all the possible evolutionary paths for the structure: splitting, substituting, augmenting, excluding, inverting, and porting. The "porting" operator, as the name suggests, ports modules to other systems; the other five operators work only within their respective system. Porting occurs when a hidden module "breaks loose" and is able to function (via translation) in more than one system, under different sets of design rules, i.e., a different architecture.) According to Professor Youichi Hori at the University of Tokyo, the rationales for porting capacitors instead of batteries into the drive system of the trolleybus can be summarized as follows [31]:
- (1) a cruising distance of 3–4 km is possible without a battery;
- (2) the capacitors need only a short charging time;
- (3) the capacitors give smooth acceleration, since electricity is obtained more easily;
- (4) they are inexpensive, since no storage function is needed;
- (5) inserting an inverter makes stable operation possible in a low voltage range;
- (6) there is no deterioration from charging and discharging, since no chemical reaction is involved;
- (7) the possibility of wireless charging would enhance efficiency by 90%; and
- (8) they allow stable outdoor charging and are safe against electric shock in the rain.
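Baldwin and Clark's "porting" operator, and rationale (1) above, can be illustrated with a short sketch: a module built under one system's design rules is made to function in another system through a translation layer, without modifying the module itself. The class names and all numerical values (energy content, consumption per kilometer) are hypothetical figures chosen for the illustration, not Capabus data.

```python
# Sketch of the "porting" modular operator: an energy-storage module
# built for system A is ported, via a translation layer, into system B
# (a bus drivetrain). All names and figures are illustrative.

class EnergyCell:
    """Module built under system A's design rules: it reports joules."""
    def __init__(self, joules):
        self.joules = joules

    def stored_energy_j(self):
        return self.joules


class DrivetrainAdapter:
    """Translation layer: system B expects watt-hours and a range
    estimate, so the adapter converts without touching the module."""
    def __init__(self, cell, wh_per_km):
        self.cell = cell
        self.wh_per_km = wh_per_km  # assumed drive-energy use of the bus

    def stored_energy_wh(self):
        return self.cell.stored_energy_j() / 3600.0  # 1 Wh = 3600 J

    def range_km(self):
        return self.stored_energy_wh() / self.wh_per_km


# Port the unchanged module into the drivetrain architecture.
adapter = DrivetrainAdapter(EnergyCell(21_600_000), wh_per_km=1500)
print(adapter.stored_energy_wh())  # 6000.0
print(adapter.range_km())          # 4.0
```

Under these assumed figures, the ported capacitor bank covers the 3–4 km between charging stops; the module "breaks loose" from its original system and functions, via translation, under a different set of design rules.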
Although our illustration of the edge solution is drawn from the Chinese experience of the Capabus, Japanese industry's interest in the ideas of Industry 4.0 and Society 5.0 is also largely focused on edge computing. To consolidate factory automation (FA) with information technology (IT), some Japanese companies established EDGECROSS (https://www.edgecross.org/), a consortium for edge computing, in May 2018. This consortium aims at building the IoT (Internet of Things) in manufacturing. More than 180 companies, including non-Japanese companies, are registered as its supporting members. (The founding members are: Advantech (a manufacturer of industrial computers in Taiwan), Omron, NEC, IBM Japan, ORACLE Japan, Hitachi, and Mitsubishi Electric.) Based on this development in Japan, we can confirm that the Japanese approach to Society 5.0 is a "bottom-up" one, resting on the assumption that the whole IoT will be constructed as a hierarchical structure of edge computing. This approach can be contrasted with the "top-down" approach of Siemens and Rockwell Automation (United States).
For the fusion between physical and cyber spaces, therefore, the overall feedback loop of Society 5.0 can also be constructed on the basis of learning-by-porting. To complete the system for Society 5.0, i.e., the fusion between physical and cyber spaces, one porting is not enough; we need porting after porting. The evolution into the 4th Industrial Revolution will therefore proceed through multiple portings, the learning mode is learning-by-multiple-porting, and the overall strategy should be structured as multilayered porting.
Finally, we arrive at a summary table in which learning and strategy are compared between the 3rd and the 4th Industrial Revolutions, as shown in Table 3.
6. Conclusions
In this paper, I first described how the IT revolution is more than just computer innovation and investigated why the Solow computer paradox disappeared, analyzing the paradox in terms of the modes of learning of both the IT giants and a module supplier. Secondly, I moved to the 4th Industrial Revolution and described why it is more than a 2nd IT revolution, analyzing the difference again in terms of learning modes and strategic concepts; for this analysis, I used the Society 5.0 scheme proposed by the Japanese government. By focusing on the edge solution in the Society 5.0 system, I found a new mode of learning, learning-by-porting. It was also ascertained that the overall system development of Society 5.0 will be governed by multilayered, multiple porting.
We also find that the innovation process of the 4th Industrial Revolution will be gradual, incremental, and, most importantly, essentially additive, i.e., value is added continuously. The innovations of the coming industrial revolution are therefore based not on conventional creative destruction but on creative accumulation. When the effects of accumulation pass a certain threshold level of fusion, however, we can expect a "Physical-Cyber Renaissance" to arrive, after our wandering through the "dark ages" of the 1st, 2nd, and 3rd Industrial Revolutions.