Article

Application of Neural Networks to Explore Manufacturing Sales Prediction

1 Department of Industrial Engineering and Management, National Kaohsiung University of Science and Technology, Kaohsiung 80778, Taiwan
2 Department of Aeronautical Engineering, Chaoyang University of Technology, Taichung 41349, Taiwan
* Author to whom correspondence should be addressed.
Appl. Sci. 2019, 9(23), 5107; https://doi.org/10.3390/app9235107
Submission received: 5 November 2019 / Revised: 20 November 2019 / Accepted: 21 November 2019 / Published: 26 November 2019

Abstract

Manufacturing sales prediction is an important measure of national economic development trends. The plastic injection molding machine industry has its own independent R&D capability and mass production technology, with products sold globally under international brands. However, most previous injection molding machine studies have focused on R&D, production processes, and maintenance, with little consideration of sales activity. With the development and transformation of Industry 4.0 and the impact of the global economy, the growth rate of Taiwan’s injection molding machine industry has gradually flattened or even declined, with company sales and profits falling below expectations. Therefore, this study collected key indicators for Taiwan’s export economy from 2008 to 2017 to help understand the impact of economic indicators on injection molding sales. We collected 35 indicators, including the net entry rate of employees into manufacturing industries, trend indices, manufacturing industry sales volume indices, and customs export values. Correlation analysis was used to select variables affecting plastic injection machine sales, and artificial neural networks (ANNs) were applied to predict injection molding machine sales at each level. Prediction results were verified against the correlation indicators, and seven key external economic factors were identified that accurately predict changes in company annual sales, which will be helpful for effective resource and risk management.

1. Introduction

Production and sales volumes for Taiwan’s plastic and rubber machinery in 2017 were $NTD 43.1 B, representing 6.4% growth over 2016. Total export value was $USD 1.163 B, a 12.8% increase over 2016, with corresponding imports of $USD 0.387 B, an 8.7% increase over 2016. However, 2018 sales volumes were only $NTD 7.6 B, an 8.5% reduction compared with the previous year’s $NTD 8.3 B, for 4,528 units. The largest export market was China, followed by Vietnam ($NTD 1.34 B, 21.4%) and India ($NTD 0.721 B, 11.4%) [1]. Market changes in recent years have been somewhat volatile, since plastic and rubber machinery makers are traditional resource-dense industries and manufacturers sought to reduce manufacturing costs by gradually transferring production bases from China to Southeast Asia [2]. Industry 4.0 has rapidly increased the demand for high-technology precision machinery and automated equipment, which forced transformations onto the plastic and rubber manufacturing industries. It was hoped that the large number of plastic and rubber machinery manufacturers in Taiwan (more than 300 in 2018) could gradually become more integrated to help markets become more competitive. Product classification, reconciled functional differences, quality improvements, and market segmentation helped reduce friction and prices, but a consensus was, and still is, required for manufacturers to avoid using price bargaining as a competitive market strategy [1].
The plastic injection molding machine industry was one of the few Taiwanese industries with recognized international brands. The FCS Group, for example, manufactured a wide range of highly customized products to meet global market demand due to geo-correlation. Final manufacturer production and sales strongly impact many supporting industries and groups, including raw material suppliers, mold fixtures, consumer demand research, advertising promotion, etc. The primary motivation for the current study was how to reduce company costs, where we considered the expected major impact aspects of production profit, sales and service costs, and business management, along with various potentially important factors such as accounts receivable, inventory turnover, foreign currency exchange rates, and sales prediction and planning. These all affect revenue, inventory accumulation, capital backlog, financial interest, etc.
Previous studies regarding machinery industries focused mainly on the tool machine industry, with few considering the plastic injection machine industry, and those few focused almost exclusively on institutional design or injection molding conditions. Although some of these latter studies considered plastic injection molding machine markets, they did not include sales activities. Sales prediction is an important research area, with sales revenues deeply affected by economic changes and overall economics strongly affecting goods sales. Commodity market supply and demand should be closely considered to improve sales predictions. The recent and rapid development of technology and information has exacerbated unpredictability for many factors, resulting in sometimes severe sales revenue fluctuations. Therefore, to ensure correct predictions, we must carefully consider future resource developments; government and financial sector viewpoints on economic policies; important indicator changes for primary and processing industries; and national and sector growth rates, with particular focus on the economic impact of emergencies.
Enterprise sales predictions traditionally employ regression analysis or direct expert prediction, based on many analyses and market environment variables [3]. Although many large and complex models have been developed based on sales and sales structures, others remain relatively simple. Recent markets have been largely dominated by price competition. If a given company’s prediction model is insufficiently accurate and predicts higher sales volumes than are actually achieved, inventory can increase greatly, with consequential cost increases. In contrast, when sales are above expectations, production is eventually unable to meet customer demands, causing loss of business opportunities. Thus, how can plastic injection machine manufacturers understand the factors affecting the market economy and verify sales volumes through different prediction models? Answering this question would help improve sales prediction accuracy, reducing inventory, capital, procurement, and logistics costs while improving production efficiency and order delivery rates, with consequential benefits for profitability.
Therefore, this study objectives were to:
  • Establish an accurate model;
  • Understand external economic variables’ impacts on prediction models; and
  • Establish a sales prediction framework for practical and feasible plastic injection machine markets to provide assistance for business management decisions.
The remainder of this article is organized as follows. Section 2 reviews the relevant literature, and Section 3 introduces the assessment methodology employed. Section 4 verifies the proposed methodology using an example with practical variables affecting plastic injection machine sales data. Section 5 summarizes and concludes the paper, and discusses some useful directions for future research.

2. Literature Review

Prediction is an important aspect of data exploration based on various statistical methods to identify useful trends or models from historical data, which can then be used to obtain predictions for the next period or cycle [4]. Thus, predictions anticipate future conditions, focusing on practical applications and integration, and generally incorporate mathematical models based on historical data to provide subjective or intuitive future expectations, or a synthesis of these, i.e., a mathematical model incorporating expert judgment and adjustment.
Every artificial neural network (ANN, NN) model can be classified in terms of its architecture and learning algorithm. The architecture (or topology) describes the neural connections, and the learning (or training) algorithm provides information on how the ANN adapts its weight for every training vector [5]. Artificial neural networks (ANNs) have become widely used for many commercial applications [6], but only a few previous studies have focused on the overview of published research results in this important and popular realm. Most previous studies have focused on financial distress and bankruptcy issues, stock price prediction, and decision support with particular emphasis on classification tasks [7]. ANNs have been widely used for commercial prediction, with traditional multilayer feed-forward networks with gradient descent back-propagation and various hybrid networks developed to improve standard model performances [8].

2.1. Sales Prediction

Sales predictions have been used in research and practical application for many years, with the various prediction models broadly divided into qualitative and quantitative approaches. Quantitative approaches collect historical data and analyze developing trends to provide useful input for management decision making. Prediction methods include case based reasoning, ANNs incorporating artificial intelligence and regression analysis, moving average methods, and exponential smoothing methods based on statistical logic. Analyzing key product characteristics and combining previous sales history data helps to provide better sales predictions from limited data volumes.
Sales prediction is also important for e-commerce, critically impacting wise decision making, and helping to manage workforce, cash flow and resources; and optimize manufacturer supply chains. Thus, sales prediction is generally somewhat challenging because sales are affected by many factors, including promotions, price changes, user preferences, etc. [9], and yet impacts significantly on important company policies and plans, with accurate predictions helping to reduce costs, optimize inventory management, and improve product competitiveness [10].

2.2. Relevant Factors

Relevant factor analysis examines the correlations between variables in the data, in particular whether any linear correlation exists between pairs of variables. Most statistical methods consider correlations between independent variables and the response variables, whereas factor analysis investigates the relevance of multiple variables simultaneously. Potential variables underlying the observed variables can be identified even if they are not directly measured; these inferred latent variables are called factors [11,12]. Previous factor studies have covered commercial, e.g., [13,14]; sales, e.g., [15,16,17]; industry, e.g., [18,19,20]; and manufacturing, e.g., [21,22,23] applications, amongst many others, and have been shown to effectively extract important factors that can then be employed for prediction using different methods.
Companies use identified sales factors for overall economic, industry, and company sales predictions. Overall economic prediction includes inflation, unemployment rates, interest rates, consumer spending and savings, investment, government spending, net export, and gross national output; whereas industry prediction uses the sales factors and other industrial environmental indicators to predict industry sales, and company sales are generally predicted by including company market share, new product listings, and marketing activities [24].

2.3. Artificial Neural Network Prediction

Artificial neural networks were developed in the 1950s, and the original neuron-like perceptron model was proposed in 1969 [25]. However, the original theory was somewhat simple and was not taken seriously until Hopfield defined the modern ANN form in 1982. Many diverse ANNs have subsequently been proposed, incorporating new architectures and theories, and the enormous increase in computing power has allowed modern and powerful ANNs to be widely used, with many practical ANN applications across various industries.
Various market sales prediction models have been proposed based on back-propagation ANNs, incorporating improved methods to compensate for known back-propagation inadequacies, particularly for problems with agnostic rules. Although the PCB industry has a significant impact on the Taiwanese economy, serious inventory dumping and material shortage problems remained. Therefore, Chang et al. [26] established an ANN-based prediction model to solve these inventory and material shortage problems. Au et al. [27] developed an optimized ANN prediction system for clothing sales prediction using two years of clothing sales data. They compared its performance with a basic fully connected ANN and traditional predictive models, and showed that the proposed algorithm was superior to traditional prediction methods for products with low demand uncertainty and weak seasonal trends. Thus, the method was suitable for fashion retailers producing short-term retail predictions.
Kong and Martin [28] used back-propagation networks (BNPs) to predict future food sales for large wholesalers in Victoria. They showed that the proposed BPN method provided superior prediction results compared with traditional methods for trending and market analysis, i.e., simple linear regression models. Hence, the BPN model could provide a useful tool for sales prediction. Thiesing et al. [29] used an ANN trained by BPN to predict future time series values, including weekly demand for supermarket items, considering prices, advertising activity, and holiday impacts.
Convenience stores are an integral part of the retail industry, distributing goods from suppliers to consumers. Some convenience stores were out of stock of products customers desired, while others had an excess of the same products. However, customer satisfaction was always significantly impacted when desired goods were out of stock, regardless of how good the service was. Thus, controlling ordering and inventory has become a very important issue for convenience store management. Therefore, Chen et al. [30] developed an ANN-based system to control orders and manage inventory in convenience stores based on business circle and sales prediction operational characteristics. The proposed system improved order and discard rates to better ensure ordering the right items in the correct quantities.
Vhatkar and Dias [31] considered several sales prediction algorithms and developed an inverse transfer-like ANN model to predict oral care product sales and error rates for several different products. Mo et al. [32] proposed an optimized BPN method to improve supermarket daily average rice sales prediction accuracy. Weron [5] reviewed electricity price forecasting, explaining the complexity of available solutions, their strengths and weaknesses, and the opportunities and threats that forecasting tools offer or may encounter. Cincotti et al. [33] used an ANN to analyze electricity spot prices on the Italian Power Exchange.

3. Methodology

3.1. Pearson’s Correlation Coefficient

Pearson’s correlation, Γ, is a commonly employed measure for the linear correlation between two variables [34]:
\Gamma = \frac{\sum_{i}\sum_{j} i\, j\, n_{ij} - \frac{1}{n_{++}}\left(\sum_{i} i\, n_{i+}\right)\left(\sum_{j} j\, n_{+j}\right)}{\sqrt{\left[\sum_{i} i^{2} n_{i+} - \frac{1}{n_{++}}\left(\sum_{i} i\, n_{i+}\right)^{2}\right]\left[\sum_{j} j^{2} n_{+j} - \frac{1}{n_{++}}\left(\sum_{j} j\, n_{+j}\right)^{2}\right]}},
where n_{i+} is the number of observations in the i-th row, n_{+j} is the number of observations in the j-th column, n_{ij} is the number of observations in the cell at the i-th row and j-th column, and n_{++} is the total number of observations. A perfect linear correlation between the variables would have |Γ| = 1, with a positive result meaning the two variables increase together, and a negative result meaning one decreases as the other increases.
Hypothesis testing is also commonly employed to test if the data supports the null hypothesis, i.e., the two variates are not statistically different. The metric for this test is the probability of generating a type I error:
\rho = \frac{\sum_{i=1}^{n}\left(X_i - \bar{X}\right)\left(y_i - \bar{y}\right)}{(n-1)\, S_X\, S_y},
where \bar{X} and \bar{y} are the sample means for the first and second variables, respectively; S_X and S_y are the standard deviations for the first and second variables, respectively; and n is the number of observations.
Thus, the null hypothesis for Pearson’s correlation is H0: Γ = 0, and the commonly applied test is to reject the null hypothesis if ρ ≤ 0.05, i.e., at the 95% confidence level.
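As an illustration of this test, the short Python sketch below (using SciPy; the eight data points are taken from the first rows of Table 1, and the significance threshold follows the rule above) computes the correlation coefficient and its p value:

from scipy import stats

# First eight monthly entries of indicator X6 and the sales amount (Table 1).
x6 = [0.12, -0.34, 0.26, 0.08, 0.12, 0.25, 0.71, 0.07]
sales = [67953351, 24804652, 82065179, 84636977,
         59689586, 57378259, 111436433, 52641164]

# Pearson correlation coefficient and two-sided p value.
gamma, p_value = stats.pearsonr(x6, sales)

# Reject H0 (no linear correlation) when p <= 0.05, i.e., at the 95% confidence level.
print(f"Gamma = {gamma:.3f}, p = {p_value:.3f}, significant: {p_value <= 0.05}")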

3.2. Artificial Neural Network Principles

The BPN is one of the most widely used and representative of current ANN learning models. Werbos [35] proposed the basic concept, and Rumelhart et al. [36] defined the underlying theory and algorithm from the propagation or generalized delta learning rule. ANNs simulate biological nerves, using continuous learning and error correction to achieve accurate output. Modern ANNs commonly incorporate large learning databases, parallel processing, nonlinear outputs, and multiple layers.
The basic ANN component is a processing element (PE), where the output from each processing unit is connected to processing units in the next layer. The input to a processing unit is the sum of the outputs from the corresponding processing units in the layer above. Processing unit output is generally expressed as a function of the weighted sum of its inputs:
Y_j = f(net_j) = f\left( \sum_{i} W_{ij} X_i - \theta_j \right),
where Y_j is the output of the j-th processing unit, X_i is the i-th input value to the unit, net_j is the integration function (the weighted sum of the inputs), \theta_j is the threshold applied to the j-th unit, W_{ij} is the connection weight representing the strength of influence of the i-th processing unit in the previous layer on the j-th processing unit in the current layer (the weights lie on the data transfer paths between processing units), and f is the transfer function through which the processing unit converts the weighted sum of its inputs into its output value.

3.3. Back-Propagation Network Algorithm

Back-propagation networks are widely used ANN architectures [37] that provide supervised learning [36]. The BPN architecture uses an input layer to represent the network input variables, where the number of processing units depends on the problem and a linear transfer function is used; one or more hidden layers; and a single output layer. Propagation in a neural network can be forward or backward, and through these two processes output errors can be reduced and the network can learn how to solve a given problem. Forward propagation starts from the input layer and performs computations at each layer until it reaches the final layer. Backward propagation starts from the output layer and computes weights and errors for each previous layer until it reaches the input layer. Back-propagation compares the gap between the target output and the actual computed output values, and the synapse (weight) values are re-adjusted to minimize errors [38]. The propagation rule is the basis of the architecture shown in Figure 1.
Hidden layers represent interactions between input units. There is no specified method to decide the appropriate number of processing units or hidden layers, hence most studies use accumulated experience to determine these. Generally, the number of hidden layer processing units increases with increasing problem complexity. However, too many processing units cause the network to memorize the training data and give poor inductive power, while too few mean the network will be unable to capture the correct mapping relationship between input and output.
The number of output layer processing units also depends on problem complexity. If the output is a single real value, the number of output layer processing units is 1, whereas the number of processing units for classification problems depends on the number of classes. To avoid the need to process a large sample set of data, an ANN tool including a dedicated training-validation-test procedure for small datasets was developed some years ago and has recently been refined in order to obtain realistic and reliable regression laws [39,40,41].
The basic BPN minimizes the error function using steepest gradient descent. For a given training paradigm, the network slightly adjusts each link weight in proportion to the sensitivity of the error function to that weight, i.e., weight corrections are proportional to the magnitude of the partial derivatives of the error function with respect to the weights [12]:
\Delta W = -\eta \frac{\partial E}{\partial W},
where \eta is the learning rate, which controls the magnitude of every change of weighting value, and E is the error (or energy) function representing learning quality, hence larger E implies poorer network quality:
E = \frac{1}{2} \sum_{j} \left( T_j - Y_j \right)^2,
where T_j is the target output value for the j-th output unit in the output layer, and Y_j is the inferred output value of the j-th output unit in the output layer.
Substituting Equation (5) into Equation (4) and applying the chain rule:
\frac{\partial E}{\partial W_{ij}} = \frac{\partial E}{\partial Y_j} \frac{\partial Y_j}{\partial net_j} \frac{\partial net_j}{\partial W_{ij}} = -\left( T_j - Y_j \right) f'(net_j)\, X_i,
If we define δj as the error of the j output processing unit in an output layer, then:
\delta_j = \left( T_j - Y_j \right) f'(net_j),
Thus, the correction for W i j is:
\Delta W_{ij} = -\eta \frac{\partial E}{\partial W_{ij}} = \eta \left( T_j - Y_j \right) f'(net_j)\, X_i = \eta\, \delta_j X_i,
and the correction to the threshold for the output unit is:
\Delta \theta_j = -\eta \frac{\partial E}{\partial \theta_j} = -\eta\, \delta_j.
Backward correction, i.e., Equations (7)–(9), is repeated layer by layer to update the weighting values between layers and the threshold of each neuron, gradually reducing the gap between network output values and target values until a usable network model is established.
Generally, the momentum coefficient or inertia factor, α, is added to Equations (8) and (9) to correct the previous weighting value, reducing oscillation during convergence and making training smoother, i.e.:
\Delta W_{ij}(n) = \eta\, \delta_j X_i + \alpha\, \Delta W_{ij}(n-1),
and:
\Delta \theta_j(n) = -\eta\, \delta_j + \alpha\, \Delta \theta_j(n-1).
Many data splitting methods have been reported and used in the literature. These methods can be roughly categorized into three types [42,43,44]: (1) cross-validation; (2) randomly selecting a proportion of samples to retain as a validation set and using the remaining samples for training; and (3) the Kennard–Stone algorithm. In this study, we chose random selection for its ease of application, using 7/8 of the data for training and the remaining 1/8 as test data.
The ANN is trained by selecting a training set of samples and repeating the learning steps multiple times until the errors converge. However, training should not run too long, to avoid over-training the ANN and to ensure new input samples produce correct outputs.
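The following sketch (NumPy; the layer sizes, learning rate, momentum, and synthetic data are illustrative assumptions rather than the study’s actual settings) shows how Equations (3)–(11) fit together: a single-hidden-layer network with a sigmoid transfer function, gradient-descent weight corrections with a momentum term, and a random 7/8 training / 1/8 test split.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative data only: 120 samples, 7 input indicators, one scaled output.
X = rng.normal(size=(120, 7))
y = rng.random(size=(120, 1))

# Random 7/8 training and 1/8 test split, as described above.
idx = rng.permutation(len(X))
cut = int(len(X) * 7 / 8)
X_tr, y_tr = X[idx[:cut]], y[idx[:cut]]
X_te, y_te = X[idx[cut:]], y[idx[cut:]]

n_in, n_hid, n_out = 7, 14, 1
W1 = rng.normal(scale=0.1, size=(n_in, n_hid))    # input -> hidden weights
W2 = rng.normal(scale=0.1, size=(n_hid, n_out))   # hidden -> output weights
b1, b2 = np.zeros(n_hid), np.zeros(n_out)         # biases (negative thresholds)
dW1, dW2 = np.zeros_like(W1), np.zeros_like(W2)   # previous updates, for momentum

eta, alpha = 0.5, 0.9  # learning rate and momentum coefficient (illustrative)
for epoch in range(2000):
    # Forward propagation, Equation (3).
    H = sigmoid(X_tr @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    # Backward propagation: delta_j = (T_j - Y_j) f'(net_j), Equation (7).
    delta_out = (y_tr - Y) * Y * (1 - Y)
    delta_hid = (delta_out @ W2.T) * H * (1 - H)
    # Weight corrections with momentum, Equations (10) and (11).
    dW2 = eta * (H.T @ delta_out) + alpha * dW2
    dW1 = eta * (X_tr.T @ delta_hid) + alpha * dW1
    W2 += dW2
    W1 += dW1
    b2 += eta * delta_out.sum(axis=0)
    b1 += eta * delta_hid.sum(axis=0)
    mse = float(np.mean((y_tr - Y) ** 2))
    if mse < 1e-3:  # stop early to avoid over-training
        break

Y_te = sigmoid(sigmoid(X_te @ W1 + b1) @ W2 + b2)
print(f"training MSE = {mse:.4f}, test MSE = {float(np.mean((y_te - Y_te) ** 2)):.4f}")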

3.4. Performance Indicators

Predictive metrics compare predicted and actual values to confirm model feasibility. This study used the root mean squared error (RMSE) [16]:
RMSE = \sqrt{\frac{1}{n} \sum_{i=1}^{n} \left( T_i - A_i \right)^2},
and mean absolute error (MAE) [17]:
MAE = \frac{1}{N} \sum_{j=1}^{N} \left| T_j - A_j \right|,
to measure differences between actual and predicted values. Smaller RMSE and MAE show that predicted values are closer to the actual values [45], and hence that the model provides better predictive power.
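A minimal computation of both indicators, assuming T holds the actual (target) values and A the model predictions, might look as follows:

import numpy as np

def rmse(T, A):
    # Root mean squared error between target and predicted values.
    T, A = np.asarray(T, dtype=float), np.asarray(A, dtype=float)
    return float(np.sqrt(np.mean((T - A) ** 2)))

def mae(T, A):
    # Mean absolute error between target and predicted values.
    T, A = np.asarray(T, dtype=float), np.asarray(A, dtype=float)
    return float(np.mean(np.abs(T - A)))

# Hypothetical actual and predicted sales amounts.
actual = [67953351, 24804652, 82065179]
predicted = [60000000, 30000000, 75000000]
print(rmse(actual, predicted), mae(actual, predicted))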

4. Results and Discussion

The collected dataset comprised 120 entries from 2008 to 2017 for key external economic indicators affecting predicted sales, including net employee entry rate into manufacturing and service industries, trend indices, manufacturing industry sales volume indices, and customs export indices, as shown in Table 1.

4.1. Pearson’s Correlation

To improve BPN prediction accuracy, the data were pre-processed using Minitab V16 to identify significant correlations, as shown in Table 2.
Pearson’s correlation was considered significant (i.e., p < 0.05) for 12 items: X6, X8, X9, X11, X12, X13, X24, X25, X26, X27, X29, and X30. To improve prediction accuracy, it was recommended to retain indicator items with moderate or low positive correlation coefficients and p values below 0.001; items with negative correlations were not included. The scenario studied contains 120 entries of actual data released by the government [46].
Thus, we included X6, X8, X11, X12, X27, X29, and X31 (7 items), as shown in Table 3.
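One possible implementation of this screening step is sketched below (pandas and SciPy; the column codes mirror Table 2, but the small data frame shown is only an excerpt of Table 1 used for illustration, with a looser threshold so the toy sample returns results):

import pandas as pd
from scipy import stats

def select_indicators(df, target="sales", p_threshold=0.001):
    """Keep indicators with a positive Pearson correlation to the target
    and a p value below the chosen significance threshold."""
    selected = {}
    for col in df.columns.drop(target):
        r, p = stats.pearsonr(df[col], df[target])
        if r > 0 and p < p_threshold:
            selected[col] = (round(r, 3), round(p, 3))
    return selected

# Excerpt of Table 1 (first eight entries of X6, X31, and the sales amount).
df = pd.DataFrame({
    "X6":  [0.12, -0.34, 0.26, 0.08, 0.12, 0.25, 0.71, 0.07],
    "X31": [679550, 522532, 763095, 665578, 661892, 704088, 719137, 785341],
    "sales": [67953351, 24804652, 82065179, 84636977,
              59689586, 57378259, 111436433, 52641164],
})
print(select_indicators(df, p_threshold=0.05))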

4.2. Artificial Neural Network Prediction

Study parameters were established from previous ANN studies. Zhang et al. [47] showed that a single hidden layer is sufficient for most nonlinear problems, and Lee and Chen [48] showed there is no standard for setting the number of hidden layer nodes, but recommended using n, i.e., the number of input variates, as the number of nodes in the input layer when testing the number of hidden layer nodes. Previous studies generally set the learning rate for predictive and estimation problems between 1 and 10, with sub-settings between 0.1 and 1.0. Kuo et al. [49] indicated there is no absolute rule to determine network parameters, and parameter settings providing improved results can only be found through trial and error.
Therefore, we configured the BPN within Neuro Solutions 7. The network has one hidden layer, with twice as many neurons as the input layer, and one neuron in the output layer. The transfer function is the sigmoid, with the learning rate, training iterations, and error target set to 0.01–1.0, 2000 epochs, and MSE ≤ 10.5, respectively.
The BPN operation steps are as follows:
  • Set the number of input, hidden, and output layers; inertia coefficient, learning rate, learning cycles and other relevant parameters.
  • Randomly determine initial BPN weights.
  • Transfer to the BPN to start calculation.
  • Correct join and bias weights according to errors.
  • If not yet converged, repeat steps 3–5.
After selecting the relevant factors, they were input into the BPN for learning, using 7/8 of the available data as the training set and the remaining 1/8 as the test set. BPN training generally uses a steepest gradient algorithm, but this converges slowly and can easily fail. Therefore, this study used the Levenberg–Marquardt (LM) algorithm for training, with maximum iterations = 2000. Figure 2 shows the average MSE and standard deviation from three runs.
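Neuro Solutions 7 and its Levenberg–Marquardt trainer are proprietary, so as a rough open-source stand-in (an assumption, not the authors’ exact tool), scikit-learn’s MLPRegressor can reproduce the stated topology: one hidden layer with twice as many neurons as inputs, a sigmoid (logistic) transfer function, and up to 2000 training iterations. MLPRegressor does not offer LM, so the quasi-Newton “lbfgs” solver is used here in its place.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

# Placeholder arrays; in the study these would be the 7 selected indicators
# (X6, X8, X11, X12, X27, X29, X31) and the monthly sales amounts.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 7))
y = rng.normal(size=120)

# Random 7/8 training and 1/8 test split.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=1 / 8, random_state=0)

model = MLPRegressor(
    hidden_layer_sizes=(14,),  # one hidden layer, 2 x 7 input neurons
    activation="logistic",     # sigmoid transfer function
    solver="lbfgs",            # stand-in for Levenberg-Marquardt training
    max_iter=2000,
    random_state=0,
)
model.fit(X_tr, y_tr)
print("test MSE:", mean_squared_error(y_te, model.predict(X_te)))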
The training data achieved convergence after three runs. All MSEs were below 0.1, reaching the targeted effect, as shown in Figure 3.
The training parameters were entered into the BPN and the test dataset was predicted, as shown in Figure 4. The training set had MSE = 0.025882721 ± 0.000802464, as shown in Table 4.

4.3. Performance Assessment

Table 5 summarizes the final BPN performance (for the test dataset). After analysis of the relevant factors, variables with higher homogeneity were clustered together, while data with significant differences were separated. Therefore, compared with a traditional statistical regression model, the prediction accuracy was relatively high.

5. Conclusions

The Taiwanese plastic injection machine industry has not previously analyzed correlations between predicted industrial sales accuracy and key economic indicators. Therefore, this study used a back-propagation neural network prediction method to improve prediction accuracy for plastic injection machine sales in Taiwan for 2018. We first identified influential factors using correlation analysis and then integrated them into a BPN. Overall prediction accuracy achieved RMSE = 24,858,562.25 for the test set.
Overall outcomes confirmed that it is feasible to predict plastic injection machine sales using factor analysis and back-propagation neural networks; the proposed prediction system made the importance of prediction accuracy (MAE and RMSE) clearly visible, and the results were consistent with the predictions.
However, the proposed system has several limitations that should be addressed:
  • The system should be extended to consider changes occurring for the different factors.
  • Implementation requires more time than other prediction methods.
  • Current accuracy is acceptable, but could be improved.
Industrial sales are driven by many complicated factors. Although the proposed system provided acceptable prediction accuracy, future research should evaluate alternative prediction methods for sales, including classification and fuzzy approaches. Improved accuracy will help further reduce production inventories, improve sales target predictions, and achieve operational goals.

Author Contributions

Data curation: P.-H.W.; methodology: G.-H.L. and Y.-C.W.; software: Y.-C.W.; writing—original draft: P.-H.W. and G.-H.L.; writing—review and editing: Y.-C.W.

Funding

This research received no external funding.

Acknowledgments

We are thankful to the referees for their careful reading of the manuscript and for valuable comments and suggestions that greatly improved the presentation of this work.

Conflicts of Interest

The authors declare that there is no conflict of interest regarding the publication of this article.

References

  1. Taiwan Machinery Industry Association. 2018 Taiwan Plastic Rubber Machinery Industry Status; TAMI: Taipei, Taiwan, 2018; Available online: http://www.tami.org.tw/market/week3_20180727 (accessed on 5 November 2019).
  2. US Commercial Service. Plastics Materials and Machinery Export Guide. A Reference for U.S. Exporters in the Plastics Industry; US Commercial Service: Washington, DC, USA, 2018. Available online: https://www.trade.gov/industry/materials/Plastics%20Export%20Guide%202018_final.pdf (accessed on 5 November 2019).
  3. Bahrammirzaee, A. A comparative survey of artificial intelligence applications in finance: Artificial neural networks, expert system and hybrid intelligent systems. Neural Comput. Appl. 2010, 19, 1165–1195. [Google Scholar] [CrossRef]
  4. Clemen, R.T. Combining forecasts: A review and annotated bibliography. Int. J. Forecast. 1989, 5, 559–583. [Google Scholar] [CrossRef]
  5. Weron, R. Electricity price forecasting: A review of the state-of-the-art with a look into the future. Int. J. Forecast. 2014, 30, 1030–1081. [Google Scholar] [CrossRef]
  6. Boussabaine, A.H. The use of artificial neural networks in construction management: A review. Constr. Manag. Econ. 1996, 14, 427–436. [Google Scholar] [CrossRef]
  7. Shin, K.S.; Lee, Y.J. A genetic algorithm application in bankruptcy prediction modeling. Expert Syst. Appl. 2002, 23, 321–328. [Google Scholar] [CrossRef]
  8. Tkáč, M.; Verner, R. Artificial neural networks in business: Two decades of research. Appl. Soft Comput. 2016, 38, 788–804. [Google Scholar] [CrossRef]
  9. Zhao, K.; Wang, C. Sales forecast in e-commerce using convolutional neural network. arXiv 2017, arXiv:1708.07946. [Google Scholar]
  10. Croda, R.M.C.; Romero, D.E.G.; Morales, S.-O.C. Sales prediction through neural networks for a small dataset. Int. J. Interact. Multimed. Artif. Intell. 2019, 5, 35–41. [Google Scholar]
  11. Lawrence, I.; Lin, K. A concordance correlation coefficient to evaluate reproducibility. Biometrics 1989, 45, 255–268. [Google Scholar]
  12. Pituch, K.A.; Stevens, J.P. Applied Multivariate Statistics for the Social Sciences: Analyses with SAS and IBM’s SPSS; Routledge: Abingdon-on-Thames, UK, 2015. [Google Scholar]
  13. Gupta, K.R. Business Statistics; Atlantic Publishers & Distributors: Delhi, India, 2017. [Google Scholar]
  14. Krivic, S.J.; Loh, A. Factors relating to brand loyalty of a fitness health club franchise business in Vienna, Austria. Int. Res. E-J. Bus. Econ. 2018, 2, 56. [Google Scholar]
  15. Schober, P.; Boer, C.; Schwarte, L.A. Correlation coefficients: appropriate use and interpretation. Anesth. Analg. 2018, 126, 1763–1768. [Google Scholar] [CrossRef]
  16. Akerejola, W.O.; Okpara, E.U.; Ohikhena, P.; Emenike, P.O. Availability of infrastructure and adoption of point of sales of selected small and medium enterprises (SMEs) in Lagos State, Nigeria. Int. J. Acad. Res. Bus. Soc. Sci. 2019, 9, 137–150. [Google Scholar] [CrossRef]
  17. Hai, C.; Yan-Yan, C.; Wei, L.; Jun, W.; Shuai, Y.; Ju, G.; Yi, H.; Xiao-jing, D. Early warning analysis of electricity sales based on multi-factor correlation analysis. E3S Web Conf. EDP Sci. 2018, 53, 02007. [Google Scholar] [CrossRef]
  18. Gao, Y.; Chang, D.; Fang, T.; Luo, T. The correlation between logistics industry and other industries: An evaluation of the empirical evidence from China. Asian J. Shipp. Logist. 2018, 34, 27–32. [Google Scholar] [CrossRef]
  19. Kong, Y.; Xie, C.; Zheng, S.; Jiang, P.; Guan, M.; Wang, F. Dynamic early warning method for major hazard installation systems in chemical industrial park. Complexity 2019, 2019, 6250483. [Google Scholar] [CrossRef]
  20. Christou, E. Branding social media in the travel industry. Procedia Soc. Behav. Sci. 2015, 175, 607–614. [Google Scholar] [CrossRef]
  21. Kung’u, J. Effect of liquidity management practices on profitability of manufacturing industry in Kenya. IOSR J. Econ. Financ. 2017, 8, 84–89. [Google Scholar] [CrossRef]
  22. Van Wassenhoven, M.; Goyens, M.; Henry, M.; Capieaux, E.; Devos, P. Nuclear magnetic resonance characterization of traditional homeopathically manufactured copper (Cuprum metallicum) and plant (Gelsemium sempervirens) medicines and controls. Homeopathy 2017, 106, 223–239. [Google Scholar] [CrossRef]
  23. Anh, T.; Thi, L.; Quang, H.; Thi, T. Factors influencing the effectiveness of internal control in cement manufacturing companies. Manag. Sci. Lett. 2020, 10, 133–142. [Google Scholar] [CrossRef]
  24. Kotler, P.; Scheff, J. Standing Room Only: Strategies for Marketing the Performing Arts; Harvard Business School Press: Brighton, MA, USA, 1997. [Google Scholar]
  25. Minsky, M.; Papert, S. Perceptron: An Introduction to Computational Geometry; The MIT Press: Cambridge, MA, USA, 1969; p. 2. [Google Scholar]
  26. Chang, P.-C.; Wang, Y.-W.; Tsai, C.-Y. Evolving neural network for printed circuit board sales forecasting. Expert Syst. Appl. 2005, 29, 83–92. [Google Scholar] [CrossRef]
  27. Au, K.-F.; Choi, T.-M.; Yu, Y. Fashion retail forecasting by evolutionary neural networks. Int. J. Prod. Econ. 2008, 114, 615–630. [Google Scholar] [CrossRef]
  28. Kong, J.; Martin, G. A backpropagation neural network for sales forecasting. In Proceedings of the ICNN’ 95—International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; Volume 2, pp. 1007–1011. [Google Scholar]
  29. Thiesing, F.M.; Vornberger, O. Sales forecasting using neural networks. In Proceedings of the International Conference on Neural Networks (ICNN’97), Houston, TX, USA, 12 June 1997; pp. 2125–2128. [Google Scholar]
  30. Chen, C.-Y.; Lee, W.-I.; Kuo, H.-M.; Chen, C.-W.; Chen, K.-H. The study of a forecasting sales model for fresh food. Expert Syst. Appl. 2010, 37, 7696–7702. [Google Scholar] [CrossRef]
  31. Vhatkar, S.; Dias, J. Oral-care goods sales forecasting using artificial neural network model. Procedia Comput. Sci. 2016, 79, 238–243. [Google Scholar] [CrossRef]
  32. Mo, M.; Zhao, L.; Gong, Y.; Wu, Y. Research and application of BP neural network based on genetic algorithm optimization. Mod. Electron. Tech. 2018, 41, 41–44. [Google Scholar]
  33. Cincotti, S.; Gallo, G.; Ponta, L.; Raberto, M. Modeling and forecasting of electricity spot-prices: Computational intelligence vs classical econometrics. AI Commun. 2014, 27, 301–314. [Google Scholar]
  34. Ahlgren, P.; Jarneving, B.; Rousseau, R. Requirements for a cocitation similarity measure, with special reference to Pearson’s correlation coefficient. J. Am. Soc. Inf. Sci. Technol. 2003, 54, 550–560. [Google Scholar] [CrossRef]
  35. Werbos, P.J. The Roots of Backpropagation: From Ordered Derivatives to Neural Networks and Political Forecasting; John Wiley & Sons: Hoboken, NJ, USA, 1994. [Google Scholar]
  36. Rumelhart, D.E.; Hinton, G.E.; Williams, R.J. Learning Internal Representations by Error Propagation; Technical Report No. Mar–Sep 1985; University of California San Diego: La Jolla, CA, USA, 1985. [Google Scholar]
  37. Horn, J.F.; Calise, A.J.; Prasad, J. Flight envelope cueing on a tilt-rotor aircraft using neural network limit prediction. J. Am. Helicopter Soc. 2001, 46, 23–31. [Google Scholar] [CrossRef]
  38. Wang, Y.C. Prediction of engine failure time using principal component analysis, categorical regression tree, and back propagation network. J. Ambient Intell. Humaniz. Comput. 2018, 1–9. [Google Scholar] [CrossRef]
  39. Pasini, A. Artificial neural networks for small dataset analysis. J. Thorac. Dis. 2015, 7, 953. [Google Scholar]
  40. Pasini, A.; Potestà, S. Short-range visibility forecast by means of neural-network modelling: A case-study. Il Nuovo Cimento C 1995, 18, 505–516. [Google Scholar] [CrossRef]
  41. Pasini, A.; Pelino, V.; Potestà, S. A neural network model for visibility nowcasting from surface observations: Results and sensitivity to physical input variables. J. Geophys. Res. Atmos. 2001, 106, 14951–14959. [Google Scholar] [CrossRef]
  42. Xu, Y.; Goodacre, R. On splitting training and validation set: A comparative study of cross-validation, bootstrap and systematic sampling for estimating the generalization performance of supervised learning. J. Anal. Test. 2018, 2, 249–262. [Google Scholar] [CrossRef] [PubMed]
  43. Kohavi, R. A study of cross-validation and bootstrap for accuracy estimation and model selection. Int. Jt. Conf. Artificial Intell. 1995, 14, 1137–1145. [Google Scholar]
  44. Kennard, R.W.; Stone, L.A. Computer aided design of experiments. Technometrics 1969, 11, 137–148. [Google Scholar] [CrossRef]
  45. Lewis, E. Control of body segment differentiation in Drosophila by the bithorax gene complex. In Genes, Development and Cancer: The Life and Work of Edward B. Lewis; Lipshitz, H.D., Ed.; Springer Science+Business: New York, NY, USA, 1982; pp. 239–253. [Google Scholar]
  46. TAITRA Home Page. Available online: https://www.taitraesource.com/default.asp (accessed on 5 November 2019).
  47. Zhang, G.; Patuwo, B.E.; Hu, M.Y. Forecasting with artificial neural networks: The state of the art. Int. J. Forecast. 1998, 14, 35–62. [Google Scholar] [CrossRef]
  48. Lee, T.-S.; Chen, N.-J. Investigating the information content of non-cash-trading index futures using neural networks. Expert Syst. Appl. 2002, 22, 225–234. [Google Scholar] [CrossRef]
  49. Kuo, J.-T.; Hsieh, M.-H.; Lung, W.-S.; She, N. Using artificial neural network for reservoir eutrophication prediction. Ecol. Model. 2007, 200, 171–177. [Google Scholar] [CrossRef]
Figure 1. Propagation network architecture.
Figure 2. Prediction accuracy (average over three runs).
Figure 3. Average MSE for 3 runs.
Figure 4. Test results from the back-propagation network.
Table 1. The collected dataset.
Columns: No.; X6 net entry rate of employees in manufacturing industries and service industries (%); X8 coincident indicators excluding trend index; X11 manufacturing industry sales volume index; X12 customs export values ($NTD billion); X27 business turnover; X29 general price index; X31 total value of trade imports ($NTD); sales amount ($NTD).
No. | X6 | X8 | X11 | X12 | X27 | X29 | X31 | Sales Amount
1 | 0.12 | 110.72 | 97.3 | 723.28 | 1,160,273 | 98.99 | 679,550 | 67,953,351
2 | −0.34 | 111.24 | 81.84 | 569.69 | 984,024 | 98.43 | 522,532 | 24,804,652
3 | 0.26 | 111.39 | 97.86 | 757.57 | 1,101,315 | 98.23 | 763,095 | 82,065,179
4 | 0.08 | 111.14 | 96.78 | 690.24 | 1,091,997 | 99.62 | 665,578 | 84,636,977
5 | 0.12 | 110.27 | 96.81 | 727.09 | 1,110,420 | 102.36 | 661,892 | 59,689,586
6 | 0.25 | 108.59 | 94.47 | 746.27 | 1,090,907 | 104.24 | 704,088 | 57,378,259
7 | 0.71 | 105.93 | 94.55 | 702.83 | 1,107,411 | 105.87 | 719,137 | 111,436,433
8 | 0.07 | 102.36 | 92.05 | 784.49 | 1,104,864 | 104.68 | 785,341 | 52,641,164
9 | −0.27 | 97.68 | 88.47 | 701.1 | 1,082,218 | 102.73 | 672,977 | 50,619,993
10 | −0.37 | 92.07 | 86.96 | 676.33 | 1,073,243 | 98.56 | 580,759 | 25,432,050
11 | −0.7 | 86.33 | 70.03 | 560.84 | 958,906 | 92.59 | 510,760 | 82,767,645
12 | −1.36 | 81.66 | 68.5 | 459.51 | 944,137 | 88.82 | 399,343 | 36,808,567
13 | −1.41 | 78.87 | 62.32 | 413.65 | 938,376 | 88.25 | 301,650 | 18,262,761
14 | −0.85 | 78.13 | 68.72 | 431.96 | 896,766 | 89.24 | 373,591 | 20,653,199
15 | −0.32 | 78.95 | 78.09 | 545.31 | 969,395 | 89.08 | 432,850 | 40,703,997
16 | −0.21 | 80.97 | 81.52 | 504.36 | 986,153 | 88.52 | 436,381 | 39,451,103
17 | −0.13 | 83.71 | 80.9 | 543.17 | 976,558 | 88.54 | 439,880 | 29,829,289
18 | 0.25 | 86.86 | 88 | 560.73 | 1,040,586 | 89.96 | 505,506 | 21,569,185
19 | 0.42 | 89.89 | 91.24 | 574.13 | 1,078,566 | 91.06 | 515,403 | 68,449,065
20 | 0.38 | 92.58 | 86.99 | 629.34 | 1,082,739 | 93.15 | 568,094 | 45,444,464
… (entries 21–99 omitted) …
100 | 0.01 | 96.48 | 97.19 | 720.25 | 1,129,508 | 84.26 | 565,035 | 50,847,793
101 | 0.07 | 97.29 | 101.78 | 763.61 | 1,150,372 | 85.07 | 649,604 | 83,251,179
102 | 0.08 | 98.2 | 101.96 | 743.41 | 1,178,038 | 85.33 | 627,467 | 59,205,118
103 | 0.48 | 99.13 | 101.2 | 776 | 1,186,582 | 84.81 | 657,570 | 94,401,114
104 | 0.1 | 100 | 105.38 | 780.01 | 1,198,745 | 83.89 | 653,520 | 56,923,071
105 | −0.06 | 100.78 | 98.41 | 712.91 | 1,192,628 | 84.03 | 574,549 | 72,765,826
106 | 0.1 | 101.44 | 101.75 | 840.86 | 1,221,370 | 84.72 | 701,613 | 54,305,574
107 | 0.18 | 101.81 | 104.94 | 801.18 | 1,236,081 | 85.2 | 664,895 | 78,899,175
108 | 0.12 | 101.81 | 105.54 | 820.74 | 1,261,626 | 86.51 | 665,533 | 92,900,118
109 | 0.06 | 101.39 | 97.23 | 757.13 | 1,224,804 | 87.14 | 645,628 | 58,716,581
110 | −0.09 | 100.72 | 89.06 | 702.8 | 1,037,538 | 86.63 | 599,181 | 93,628,196
111 | 0.21 | 100.07 | 107.95 | 792.48 | 1,195,296 | 85.88 | 670,891 | 48,821,852
112 | 0.11 | 99.63 | 95.37 | 738.98 | 1,149,976 | 85.09 | 654,942 | 55,235,186
113 | 0.12 | 99.54 | 101.97 | 769.31 | 1,187,775 | 83.98 | 665,412 | 43,872,675
114 | 0.19 | 99.87 | 105.53 | 777.74 | 1,217,628 | 83.84 | 602,507 | 54,494,222
115 | 0.62 | 100.43 | 102.42 | 823.94 | 1,217,436 | 84.25 | 660,763 | 67,840,954
116 | 0.14 | 101.1 | 109.34 | 840.25 | 1,264,341 | 84.87 | 667,526 | 77,709,509
117 | −0.03 | 101.66 | 105.08 | 869.47 | 1,263,093 | 85.64 | 668,974 | 71,531,514
118 | 0.12 | 101.97 | 103.39 | 832.34 | 1,266,194 | 86.12 | 674,863 | 57,053,528
119 | 0.21 | 102.13 | 105.42 | 869.41 | 1,278,629 | 86.54 | 691,787 | 55,927,346
120 | 0.12 | 102.1 | 104.98 | 884.84 | 1,275,362 | 86.72 | 701,060 | 94,772,146
Table 2. Pearson correlations between collected indicators.
Item | Indicator | Pearson Correlation | Hypothesis Test (p value)
X1 | Leading indicator composite index | 0.161 | 0.08
X2 | Leading indicator excluding trend index | 0.273 | 0.003
X3 | Export order trend index | −0.174 | 0.058
X4 | Total currency count M1B ($NTD billion) | 0.033 | 0.718
X5 | Stock index | 0.123 | 0.184
X6 | Net entry rate of employees in manufacturing industries and service industries (%) | 0.399 | 0.000
X7 | Coincident indicator composite index | 0.217 | 0.018
X8 | Coincident indicators excluding trend index | 0.339 | 0.000
X9 | Industrial production index | 0.264 | 0.004
X10 | Enterprise total electricity consumption (10^9 kWh) | 0.207 | 0.024
X11 | Manufacturing industry sales volume index | 0.381 | 0.000
X12 | Customs export values ($NTD billion) | 0.339 | 0.000
X13 | Import value of machinery and electrical equipment ($NTD billion) | 0.273 | 0.003
X14 | Lagging indicator composite index | 0.116 | 0.21
X15 | Leading indicator excluding trend index | 0.076 | 0.411
X16 | Unemployment rate (%) | −0.115 | 0.214
X17 | Manufacturing unit output labor cost index | −0.236 | 0.010
X18 | Financial industry overnight interest rate | −0.031 | 0.741
X19 | Loan and investment for all financial institutions ($NTD billion) | 0.009 | 0.925
X20 | Manufacturing inventory (10^3) | 0.243 | 0.008
X21 | Consumer confidence index | 0.225 | 0.014
X22 | National income | 0.011 | 0.902
X23 | Consumer price index | 0.042 | 0.652
X24 | Industrial production index | 0.306 | 0.001
X25 | Manufacturing industry production index | 0.297 | 0.001
X26 | Plastic products manufactured per worker per month | 0.288 | 0.002
X27 | Business turnover | 0.368 | 0.000
X28 | Labor productivity index for plastic products manufacturing | 0.151 | 0.102
X29 | General price index | 0.323 | 0.000
X30 | Export orders (machinery) | 0.278 | 0.002
X31 | Total value of trade imports ($NTD) | 0.441 | 0.000
X32 | Commercial container unloading capacity (equivalent to 20 containers) | −0.075 | 0.419
X33 | Freight tonnage (metric ton) | 0.254 | 0.005
X34 | $USD spot exchange rate, bank-to-customer, buy ($NTD) | −0.430 | 0.000
X35 | $USD spot exchange rate, bank-to-customer, sell ($NTD) | −0.430 | 0.000
Table 3. The Pearson correlation results.
Item | X6 | X8 | X11 | X12 | X27 | X29 | X31
Pearson correlation | 0.399 | 0.339 | 0.381 | 0.339 | 0.368 | 0.323 | 0.441
p-Value | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000
Table 4. Training error.
All Runs | Training Minimum | Training Standard Deviation
Average of Minimum MSEs | 0.025882721 | 0.000802464
Average of Final MSEs | 0.025882721 | 0.000802464
Table 5. Back-propagation artificial neural network prediction performance.
Performance | Y
RMSE | 24858562.25
NRMSE | 0.169958497
MAE | 20464935.83
NMAE | 0.139919183
Min Abs Error | 1192904.849
Max Abs Error | 55346814.96
r | −0.041973132
Score | 44.05494133
