Article

Productivity Prediction of Fractured Horizontal Well in Shale Gas Reservoirs with Machine Learning Algorithms

1 State Key Laboratory of Petroleum Resources and Prospecting, China University of Petroleum, Beijing 102249, China
2 State Key Laboratory of Oil and Gas Reservoir Geology and Exploitation, Southwest Petroleum University, Chengdu 610500, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(24), 12064; https://doi.org/10.3390/app112412064
Submission received: 3 October 2021 / Revised: 7 December 2021 / Accepted: 16 December 2021 / Published: 17 December 2021
(This article belongs to the Special Issue Geomechanics and Reservoirs: Modeling and Simulation)

Abstract

Predicting shale gas production under different geological and fracturing conditions in fractured shale gas reservoirs is the foundation of optimizing the fracturing parameters, which is crucial to effectively exploiting shale gas. We present a multi-layer perceptron (MLP) network and a long short-term memory (LSTM) network to predict shale gas production, both of which can quickly and accurately forecast gas production. The prediction performances of the two networks are comprehensively evaluated and compared. The results show that the MLP network can predict shale gas production from geological and fracturing reservoir parameters. The average relative error of the MLP network is 2.85% and the maximum relative error is 12.9%, which meets the demand of engineering shale gas productivity prediction. The LSTM network can predict shale gas production from historical production under the constraints of geological and fracturing reservoir parameters. The average relative error of the LSTM network is 0.68% and the maximum relative error is 3.08%, so it can reliably predict shale gas production. There is a slight deviation between the MLP predictions and the true values in the first 10 days, because the daily production decreases rapidly during the early production stage and the production data change greatly. The largest relative errors of the LSTM predictions on the 10th, 100th, and 1000th day are 0.95%, 0.73%, and 1.85%, respectively, which are far lower than the relative errors of the MLP predictions. The research results provide a fast and effective means for shale gas productivity prediction.

1. Introduction

Shale gas is an unconventional and promising alternative energy resource [1,2,3]. Hydraulic fracturing is the main technology for shale gas development and largely determines shale gas production [4,5]. Predicting shale gas production under different geological and fracturing conditions in fractured shale gas reservoirs is the foundation of optimizing the fracturing parameters, which is crucial to effectively exploiting shale gas.
Due to the various complicated and co-dependent factors involved, such as geological and fracturing reservoir parameters, predicting gas production in shale gas reservoirs is a long-standing challenge [6,7]. Numerical simulation is the conventional method of predicting shale gas production [8]. However, a numerical model needs detailed information about the specific reservoir, which relies on numerous geological data [9]. The geological structure is complex, production involves large nonlinear dynamic problems, and the computational cost is high. Therefore, numerical simulation of shale gas production is time-consuming. Recently, machine learning (ML), especially deep learning, has developed rapidly and provides an effective means of shale gas production forecasting [10,11]. ML is good at dealing with nonlinear problems, and its calculation speed is usually much faster than that of numerical analysis. These advantages motivate a deeper investigation of productivity prediction in shale gas reservoirs based on machine learning algorithms.
Artificial neural networks have been used to predict shale gas production because they can learn the relationship between geological and fracturing reservoir parameters and gas production. Moreover, several particular artificial neural networks have been trained to describe relationships in production data. Chakra et al. [12] predicted oil production with a higher-order neural network for a reservoir in India; the network forecast well even when only limited field data were available. Sheremetov et al. [13] used a nonlinear autoregressive neural network to predict oil production, which showed competitive accuracy in a naturally fractured oilfield; they also found that preliminary clustering is an effective means of enhancing prediction accuracy. Aizenberg et al. [14] predicted oil production with a multilayer neural network, evaluated the algorithm on a real field dataset, and found that the model can predict univariate and multivariate reservoir dynamics. Cao et al. [15] applied an artificial neural network (ANN) to predict production using geological maps, production data, and pressure and operational constraints. Sun et al. [16] compared an RNN production prediction model with decline curve analysis (DCA) for single and multiple wells; they found that LSTM models can describe the overall trend, whereas DCA can only generate smooth curves. Previous research has thus shown that machine learning offers efficient prediction performance for well production. However, the constraints of geological and fracturing parameters have been ignored. Thus, a shale gas productivity forecasting model that takes historical production data as input under the constraints of geological and fracturing reservoir parameters urgently needs to be established.
In this study, we investigate the feasibility of production forecasting based on machine learning algorithms and use MLP and LSTM neural networks to predict shale gas production. We first calculate massive production data with different geological and fracturing parameters based on our previous numerical production model and use them as the machine learning dataset. Raw data must be preprocessed before they are used in the machine learning process; to reduce manual processing and to fill empty values, an automated preprocessing method should be adopted. Then, we train the MLP and LSTM neural networks on the dataset. The MLP neural network forecasts shale gas production from geological and fracturing reservoir parameters, and the LSTM neural network predicts shale gas production from historical production under the constraints of geological and fracturing reservoir parameters. Lastly, we analyze the prediction results and compare the prediction performances of the MLP and LSTM neural networks. This study provides an accurate and efficient method for predicting shale gas production.

2. Model Description

2.1. Neural Network Description

The MLP network, which is composed of several neurons, is a basic artificial neural network [17]. MLP is a fully connected neural network with three or more layers (an input layer, an output layer, and one or more hidden layers) of nonlinearly activating nodes. The nonlinear relationship between the input and output is obtained by using a nonlinear activation function. In this paper, MLP is used to capture the nonlinear relationship between shale gas production and the reservoir properties and fracturing parameters. However, MLP cannot deal with sequential data because it lacks a memory function.
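For illustration, the forward pass of such a fully connected network can be sketched in a few lines of NumPy; the layer sizes and random weights below are placeholders for demonstration only, not the configuration adopted later in this paper.

```python
import numpy as np

def relu(x):
    # Element-wise nonlinear activation
    return np.maximum(0.0, x)

def mlp_forward(x, weights, biases):
    """Forward pass of a fully connected MLP; the last layer is linear
    so the network can output an unbounded production value."""
    h = x
    for w, b in zip(weights[:-1], biases[:-1]):
        h = relu(h @ w + b)                   # hidden layers: affine map + nonlinearity
    return h @ weights[-1] + biases[-1]       # output layer: affine map only

# Example: 4 inputs (e.g., geological/fracturing parameters) -> 1 output
rng = np.random.default_rng(0)
weights = [rng.normal(size=(4, 8)), rng.normal(size=(8, 1))]
biases = [np.zeros(8), np.zeros(1)]
print(mlp_forward(rng.normal(size=(3, 4)), weights, biases).shape)  # (3, 1)
```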
A recurrent neural network (RNN) allows information to be transferred from one step to the next [18]. In an RNN, a neural network unit can be regarded as the superposition of multiple units, and each unit transmits a message to the next. This chain-like structure enables the RNN to connect previously stored information with the current information. Thus, an RNN can infer subsequent events from previous events, which makes it suitable for time series problems. However, because of vanishing and exploding gradients, a plain RNN struggles with long-range time series forecasting [19].
The LSTM network is a special kind of RNN, which was proposed by Hochreiter and Schmidhuber [20]. LSTM's design was inspired by the logic gates of a computer. LSTM introduces a memory cell (or cell for short) that has the same shape as the hidden state (some papers consider the memory cell a special type of hidden state), engineered to record additional information. To control the memory cell, a number of gates are needed. One gate reads out the entries from the cell; we refer to this as the output gate. A second gate decides when to read data into the cell; we refer to this as the input gate. Lastly, a mechanism to reset the content of the cell is governed by a forget gate. This design makes LSTM more practical than the ordinary recurrent neural network for processing sequential data [10].
The forget gate is expressed as:
$$ f_t = \sigma \left( w_{fh} h_{t-1} + w_{fx} x_t + b_f \right) $$
where $f_t$ is the output of the forget (sigmoid) gate, $\sigma$ is the sigmoid activation function, $h_{t-1}$ is the hidden state from the previous time step $t-1$, $x_t$ is the input at time step $t$, $w$ represents the weights, and $b$ is the bias.
The input gate is calculated by:
$$ i_t = \sigma \left( w_{ih} h_{t-1} + w_{ix} x_t + b_i \right) $$
$$ C_t^{*} = \tanh \left( w_{Ch} h_{t-1} + w_{Cx} x_t + b_C \right) $$
where $i_t$ is the output of the sigmoid layer, $C_t^{*}$ is the output of the tanh layer, and $\tanh$ is the hyperbolic tangent activation function.
The memory cell state is then updated as:
$$ C_t = f_t \times C_{t-1} + i_t \times C_t^{*} $$
The output gate is written as:
$$ o_t = \sigma \left( w_{oh} h_{t-1} + w_{ox} x_t + b_o \right) $$
$$ h_t = o_t \times \tanh \left( C_t \right) $$
where $o_t$ is the output of the sigmoid layer and $h_t$ is the hidden state at time step $t$.
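For illustration, one LSTM time step following the gate equations above can be sketched as a minimal NumPy example; the weight shapes and random initialization are placeholders, not trained values.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, p):
    """One LSTM step implementing the forget/input/output gate equations.
    p is a dict of weights (w_*h, w_*x) and biases (b_*) for each gate."""
    f_t = sigmoid(p["w_fh"] @ h_prev + p["w_fx"] @ x_t + p["b_f"])     # forget gate
    i_t = sigmoid(p["w_ih"] @ h_prev + p["w_ix"] @ x_t + p["b_i"])     # input gate
    c_cand = np.tanh(p["w_ch"] @ h_prev + p["w_cx"] @ x_t + p["b_c"])  # candidate cell state C_t*
    c_t = f_t * c_prev + i_t * c_cand                                  # updated memory cell C_t
    o_t = sigmoid(p["w_oh"] @ h_prev + p["w_ox"] @ x_t + p["b_o"])     # output gate
    h_t = o_t * np.tanh(c_t)                                           # new hidden state h_t
    return h_t, c_t

# Example with hidden size 4 and input size 3 (shapes are illustrative only)
rng = np.random.default_rng(0)
p = {k: rng.normal(scale=0.1, size=(4, 4)) for k in ("w_fh", "w_ih", "w_ch", "w_oh")}
p.update({k: rng.normal(scale=0.1, size=(4, 3)) for k in ("w_fx", "w_ix", "w_cx", "w_ox")})
p.update({k: np.zeros(4) for k in ("b_f", "b_i", "b_c", "b_o")})
h, c = lstm_step(rng.normal(size=3), np.zeros(4), np.zeros(4), p)
```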

2.2. Data Preparation and Preprocessing

Establishing the networks requires determining the weights and thresholds based on a dataset [21]. We calculate massive production data with different geological and fracturing parameters based on our previous numerical production model [22,23] and use them as the dataset. Detailed information about the numerical model can be found in our previous study [8]. Shale gas production is affected by many geological and fracturing parameters. To facilitate preparation of the learning dataset, four parameters that have major effects on shale gas production are chosen: the number of fracture clusters, the half-length of fractures, the fracture conductivity, and the reservoir permeability, as shown in Table 1. Each parameter has four values, so there are 4⁴ = 256 numerical simulation cases in total. Every case includes 1000 days of production data, so the dataset contains 256,000 data points; the first and last five rows are listed in Table 2. The shale gas production under different geological and fracturing parameters is thus obtained as the machine learning dataset. Considering that the numerical ranges of the geological and fracturing reservoir parameters differ significantly, we normalize the input variables to prevent the contribution of features with smaller values from being eliminated. A total of 80% of the data is chosen to train the model and the remaining 20% is used to test the model.
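A sketch of this preprocessing is given below; the file name, column layout, and the choice of min-max scaling are illustrative assumptions, since the exact normalization scheme is not specified here.

```python
import numpy as np

def minmax_scale(x, x_min, x_max):
    # Min-max normalization of each input feature to [0, 1] (assumed scheme)
    return (x - x_min) / (x_max - x_min)

def train_test_split(X, y, train_frac=0.8, seed=0):
    # Random 80/20 split of the samples into training and test sets
    idx = np.random.default_rng(seed).permutation(len(X))
    n_train = int(train_frac * len(X))
    return X[idx[:n_train]], X[idx[n_train:]], y[idx[:n_train]], y[idx[n_train:]]

# Hypothetical file holding the simulated dataset of Table 2
data = np.loadtxt("simulation_dataset.csv", delimiter=",", skiprows=1)
y = data[:, -1]                    # daily production (m³/d)
X = data[:, :-1]                   # cluster, half-length, conductivity, permeability, time
X = minmax_scale(X, X.min(axis=0), X.max(axis=0))
X_train, X_test, y_train, y_test = train_test_split(X, y)
```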

2.3. Prediction Accuracy Evaluation

In the process of training, we used the mean absolute error (MAE), a commonly used index for evaluating the accuracy of regression prediction models [24], to evaluate the shale gas prediction model:
$$ \mathrm{MAE} = \frac{1}{n} \sum_{i=1}^{n} \left| y_i^{\mathrm{pred}} - y_i^{\mathrm{act}} \right| $$
where $n$ is the sample quantity, $y_i^{\mathrm{pred}}$ is the predicted value, and $y_i^{\mathrm{act}}$ is the true value.
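For reference, a minimal NumPy implementation of this metric is:

```python
import numpy as np

def mae(y_pred, y_act):
    # Mean absolute error between predicted and true daily production
    return np.mean(np.abs(np.asarray(y_pred) - np.asarray(y_act)))
```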

3. Results and Discussion

3.1. MLP Network

The MLP network is trained on the training dataset to determine the structural parameters. The optimal number of neural network layers is 3 and the number of neurons in each layer is 128. The batch size and number of epochs are 5000 and 100, respectively. The ReLU activation function and the Adam optimizer are used in the training process. The network is implemented with the TensorFlow framework [25]. The MAE between the predicted and true values is 50.12 m³/d.
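A minimal TensorFlow/Keras sketch consistent with these settings is shown below; interpreting "3 layers" as three hidden layers of 128 neurons, using MAE as the training loss, and feeding a five-feature input (the four reservoir/fracturing parameters plus the production day) are our assumptions rather than details reported above.

```python
import tensorflow as tf

# Sketch of the MLP with the reported hyperparameters: three hidden layers of
# 128 ReLU units, Adam optimizer, batch size 5000, 100 epochs (assumed layout).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(5,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(1),                     # daily production (m³/d)
])
model.compile(optimizer="adam", loss="mae")       # MAE used as training criterion (assumption)
# model.fit(X_train, y_train, batch_size=5000, epochs=100,
#           validation_data=(X_test, y_test))
```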
To validate the shale gas production prediction model based on MLP, we select 100 groups of different geological and fracturing reservoir parameters and compare the prediction results of the model with the true values. The comparison of the true values and the MLP predictions of daily shale gas production, along with the relative error between them, is shown in Figure 1. The black broken line is the true daily production, and the red circular scatter points are the model predictions. The red scatter points all lie on the black broken line, which shows that the model can accurately and stably forecast gas production. Figure 1 also shows the relative errors between the predicted and true values. The average relative error is 2.85% and the maximum relative error is 12.9%. Only three data points have relative errors greater than 8%, and these correspond to small daily productions, for which the absolute prediction errors are also small.
Furthermore, we analyze the prediction performance of the MLP network for shale gas production. Figure 2 depicts the prediction performance for daily shale gas production under three groups of geological and fracturing reservoir parameters. The parameters of the three cases are as follows: in case 1, the number of perforation clusters is 9, the half-length of the hydraulic fracture is 120 m, the hydraulic fracture conductivity is 400 mD·m, and the matrix permeability is 400 nD; in case 2, the number of perforation clusters is 5, the half-length of the hydraulic fracture is 100 m, the hydraulic fracture conductivity is 300 mD·m, and the matrix permeability is 300 nD; in case 3, the number of perforation clusters is 3, the half-length of the hydraulic fracture is 60 m, the hydraulic fracture conductivity is 100 mD·m, and the matrix permeability is 100 nD. The parameters in case 1 are the maximum values in the dataset, and the parameters in case 3 are the minimum values. The predictions of the three cases are in good agreement with the actual daily production, which further validates the prediction model. There is a slight deviation between the predicted results of the model and the true values in the first 10 days, because the daily production decreases rapidly during the early production stage and the production data change greatly. The neural network model focuses on the changes of production with time; therefore, in the early stage, the MLP predictions differ slightly from the true values.
The relative errors between the predictions and the true values of shale gas production on the 10th, 100th, and 1000th day are shown in Figure 3. On the 100th day, the relative errors of the three cases are the smallest, at 1.49%, 1.55%, and 1.99%, respectively. On the 10th day, the relative errors of the three cases are 5.61%, 6.55%, and 7.81%, respectively. On the 1000th day, the relative errors of case 1 and case 3 are significantly larger than that of case 2. In conclusion, the shale gas productivity prediction model based on MLP can predict shale gas production from geological and fracturing reservoir parameters.

3.2. LSTM Network

The LSTM network is trained on the training dataset to determine the structural parameters. The optimal number of neural network layers is 2 and the number of neurons in each layer is 32. The batch size and number of epochs are 256 and 150, respectively. The tanh activation function and the Adam optimizer are used in the training process. To validate the shale gas production prediction model based on LSTM, we also select 100 groups of different geological and fracturing reservoir parameters and compare the prediction results of the model with the true values. The comparison of the true values and the LSTM predictions of daily shale gas production, along with the relative error between them, is shown in Figure 4. The predictions of the model are in accordance with the true values, which indicates that the model has a good prediction effect. Figure 4 also shows the relative error between the predicted and true values under the constraint of geological and fracturing reservoir parameters. The average relative error is 0.68% and the maximum relative error is 3.08%, far lower than the average relative error of the MLP network.
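A minimal TensorFlow/Keras sketch consistent with these settings is shown below; stacking two LSTM layers of 32 units and feeding a 6-day window of daily production together with the four constraint parameters at each time step are our assumptions about the input layout, not details confirmed above.

```python
import tensorflow as tf

# Sketch of the stacked LSTM with the reported hyperparameters: two LSTM layers
# of 32 tanh units, Adam optimizer, batch size 256, 150 epochs (assumed layout).
window, n_features = 6, 5          # 6-day history; production + 4 constraint parameters
model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, activation="tanh", return_sequences=True,
                         input_shape=(window, n_features)),
    tf.keras.layers.LSTM(32, activation="tanh"),
    tf.keras.layers.Dense(1),      # next-day production (m³/d)
])
model.compile(optimizer="adam", loss="mae")   # MAE used as training criterion (assumption)
# model.fit(X_seq_train, y_seq_train, batch_size=256, epochs=150)
```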
In this section, we analyze the LSTM prediction results for shale gas production. The prediction performance for daily shale gas production under three groups of geological and fracturing reservoir parameters is shown in Figure 5; the parameters in case 1 are the maximum values in the dataset, and the parameters in case 3 are the minimum values. In this comparison between the LSTM predictions and the true values under the different constraint parameters, the data of the first 6 days are the input data, and the predictions start from the 7th day. It can be seen that the LSTM predictions agree satisfactorily with the true values. The relative errors between the LSTM predictions and the true values of shale gas production on the 10th, 100th, and 1000th day are shown in Figure 6. The relative errors of case 2 are the smallest, at 0.04%, 0.13%, and 1.18% on the 10th, 100th, and 1000th day, respectively. The relative errors of case 3 on the 10th, 100th, and 1000th day are the largest, at 0.95%, 0.73%, and 1.85%, respectively. However, they are far lower than the relative errors between the MLP predictions and the true values.

3.3. Comparisons of the Different Networks

We then compare the predictions of the MLP and LSTM networks under two new constraint conditions, as shown in Figure 7. The geological and fracturing reservoir parameters of the two cases are as follows: in case 1, the number of perforation clusters is 7, the half-length of the hydraulic fracture is 120 m, the hydraulic fracture conductivity is 400 mD·m, and the matrix permeability is 400 nD; in case 2, the number of perforation clusters is 5, the half-length of the hydraulic fracture is 60 m, the hydraulic fracture conductivity is 100 mD·m, and the matrix permeability is 100 nD. Figure 8 shows the relative errors between the predicted and true values for the MLP and LSTM networks. The relative errors of the LSTM predictions are far lower than those of the MLP predictions.

3.4. Relative Errors at an Early Production Time

The relative errors between the predictions and the true values of shale gas production during the first 100 days are shown in Figure 9. For the MLP network, the average relative errors of the three cases are 3.98%, 6.18%, and 6.46%, respectively, giving an overall average relative error of 5.54%. The maximum relative errors of the three cases occur on the 1st, 3rd, and 3rd day, respectively; after that, the relative errors decrease with production time. The relative errors of case 1 and case 3 increase slightly between 90 and 100 days. For the LSTM network, the average relative errors of the three cases are 0.16%, 0.08%, and 0.55%, respectively, giving an overall average relative error of 0.26%. The maximum relative errors of the three cases occur on the 100th, 8th, and 9th day, respectively.

4. Conclusions

We present two machine learning algorithms to predict shale gas production, both of which can reliably forecast shale gas production. The MLP neural network forecasts shale gas production from geological and fracturing reservoir parameters, and the LSTM neural network predicts shale gas production from historical production under the constraints of geological and fracturing reservoir parameters. The average relative error of the MLP neural network is 2.85% and the maximum relative error is 12.9%, which meets the demand of engineering shale gas productivity prediction. The average relative error of the LSTM neural network is 0.68% and the maximum relative error is 3.08%, so it can reliably predict shale gas production. There is a slight deviation between the MLP predictions and the true values in the first 10 days, because the daily production decreases rapidly during the early production stage and the production data change greatly. The largest relative errors of the LSTM predictions on the 10th, 100th, and 1000th day are 0.95%, 0.73%, and 1.85%, respectively, which are far lower than the relative errors of the MLP predictions. Although these neural network models are trained on the simulation data in this study, the research results can provide a basic theory for shale gas productivity prediction and the fine design of fracturing and completion parameters. In future work, we will train a new model based on field data.

Author Contributions

Conceptualization, T.W. and S.T.; methodology, W.Z.; software, T.W.; validation, H.W., Q.W. and J.S.; writing—original draft preparation, T.W.; writing—review and editing, S.T.; visualization, Q.W.; supervision, W.R.; funding acquisition, S.T. All authors have read and agreed to the published version of the manuscript.

Funding

Financial support from the National Key Scientific Research Instrument Research Project of NSFC (No. 51827804), the Science Foundation of China University of Petroleum, Beijing (No. 2462021XKBH013), and the China Postdoctoral Science Foundation Funded Project (No. 2021M703578) is acknowledged.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Curtis, J.B. Fractured shale-gas systems. AAPG Bull. 2002, 86, 1921–1938.
2. Freeman, C.M.; Moridis, G.; Ilk, D.; Blasingame, T.A. A numerical study of performance for tight gas and shale gas reservoir systems. J. Petrol. Sci. Eng. 2013, 108, 22–39.
3. Wang, T.; Tian, S.; Li, G.; Sheng, M.; Ren, W.; Liu, Q.; Zhang, S. Molecular Simulation of CO2/CH4 Competitive Adsorption on Shale Kerogen for CO2 Sequestration and Enhanced Gas Recovery. J. Phys. Chem. C 2018, 122, 17009–17018.
4. Guo, J.; Zhao, X.; Zhu, H.; Zhang, X.; Pan, R. Numerical simulation of interaction of hydraulic fracture and natural fracture based on the cohesive zone finite element method. J. Nat. Gas. Sci. Eng. 2015, 25, 180–188.
5. Tang, J.; Wu, K.; Li, Y.; Hu, X.; Liu, Q.; Ehlig-Economides, C. Numerical investigation of the interactions between hydraulic fracture and bedding planes with non-orthogonal approach angle. Eng. Fract. Mech. 2018, 200, 1–16.
6. Ren, W.; Lau, H.C. New Rate-Transient Analysis for Fractured Shale Gas Wells Using a Tri-linear Flow Model. J. Nat. Gas Sci. Eng. 2020, 80, 103368.
7. Zeng, J.; Liu, J.; Li, W.; Leong, Y.K.; Elsworth, D.; Guo, J. Shale gas reservoir modeling and production evaluation considering complex gas transport mechanisms and dispersed distribution of kerogen. Pet. Sci. 2020, 18, 195–218.
8. Wang, T.; Tian, S.; Zhang, W.; Ren, W.; Li, G. Production Model of a Fractured Horizontal Well in Shale Gas Reservoirs. Energy Fuels 2021, 35, 493–500.
9. Shi, Y.; Song, X.; Song, G. Productivity prediction of a multilateral-well geothermal system based on a long short-term memory and multi-layer perceptron combinational neural network. Appl. Energy 2021, 282, 116046.
10. Sagheer, A.; Kotb, M. Time series forecasting of petroleum production using deep LSTM recurrent networks. Neurocomputing 2019, 323, 203–213.
11. Shoeibi Omrani, P.; Dobrovolschi, I.; Belfroid, S.; Kronberger, P.; Munoz, E. Improving the accuracy of virtual flow metering and back-allocation through machine learning. In Proceedings of the Abu Dhabi International Petroleum Exhibition & Conference, Abu Dhabi, United Arab Emirates, 12–15 November 2018; Society of Petroleum Engineers: Delft, The Netherlands, 2018.
12. Chakra, N.C.; Song, K.-Y.; Gupta, M.M.; Saraf, D.N. An innovative neural forecast of cumulative oil production from a petroleum reservoir employing higher-order neural networks (HONNs). J. Petrol. Sci. Eng. 2013, 106, 18–33.
13. Sheremetov, L.; Cosultchi, A.; Martínez-Muñoz, J.; Gonzalez-Sánchez, A.; Jiménez-Aquino, M. Data-driven forecasting of naturally fractured reservoirs based on nonlinear autoregressive neural networks with exogenous input. J. Petrol. Sci. Eng. 2014, 123, 106–119.
14. Aizenberg, I.; Sheremetov, L.; Villa-Vargas, L.; Martinez-Muñoz, J. Multilayer neural network with multi-valued neurons in time series forecasting of oil production. Neurocomputing 2016, 175, 980–989.
15. Cao, Q.; Banerjee, R.; Gupta, S.; Li, J.; Zhou, W.; Jeyachandra, B. Data driven production forecasting using machine learning. In Proceedings of the SPE Argentina Exploration and Production of Unconventional Resources Symposium, Buenos Aires, Argentina, 1–3 June 2016; Society of Petroleum Engineers: Beijing, China, 2016.
16. Sun, J.; Ma, X.; Kazi, M. Comparison of decline curve analysis DCA with recursive neural networks RNN for production forecast of multiple wells. In Proceedings of the SPE Western Regional Meeting, Garden Grove, CA, USA, 22–26 April 2018; Society of Petroleum Engineers: Richardson, TX, USA, 2018.
17. Hampshire, J.B., II; Pearlmutter, B. Equivalence Proofs for Multi-Layer Perceptron Classifiers and the Bayesian Discriminant Function. In Connectionist Models; Elsevier: Amsterdam, The Netherlands, 1991; pp. 159–172.
18. Schuster, M.; Paliwal, K.K. Bidirectional recurrent neural networks. IEEE Trans. Signal Process. 1997, 45, 2673–2681.
19. Pascanu, R.; Mikolov, T.; Bengio, Y. On the difficulty of training recurrent neural networks. In Proceedings of the International Conference on Machine Learning, Atlanta, GA, USA, 17–19 June 2013; pp. 1310–1318.
20. Hochreiter, S.; Schmidhuber, J. LSTM can solve hard long time lag problems. Adv. Neural Inf. Process. Syst. 1997, 9, 473–479.
21. Zhu, Z.; Song, X.; Li, G.; Xu, Z.; Zhu, S.; Yao, X.; Jing, S. Prediction of the settling velocity of the rod-shaped proppant in vertical fracture using artificial neural network. J. Petrol. Sci. Eng. 2021, 200, 108158.
22. Wang, T.Y.; Tian, S.C.; Liu, Q.L.; Li, G.S.; Sheng, M.; Ren, W.X.; Zhang, P.P. Pore structure characterization and its effect on methane adsorption in shale kerogen. Pet. Sci. 2021, 18, 565–578.
23. Wang, T.; Tian, S.; Li, G.; Zhang, L.; Sheng, M.; Ren, W. Molecular simulation of gas adsorption in shale nanopores: A critical review. Renew. Sustain. Energy Rev. 2021, 149, 111391.
24. Tseng, F.-M.; Yu, H.-C.; Tzeng, G.-H. Combining neural network model with seasonal time series ARIMA model. Technol. Forecast. Soc. Chang. 2002, 69, 71–87.
25. Abadi, M.; Barham, P.; Chen, J.; Chen, Z.; Zhang, X. TensorFlow: A system for large-scale machine learning. In Proceedings of the 12th USENIX Symposium on Operating Systems Design and Implementation, Savannah, GA, USA, 2–4 November 2016; pp. 265–283.
Figure 1. Comparison of the true values and values predicted by MLP networks for validation data of daily shale gas production and the relative error between the true values and predictions.
Figure 2. Comparison of the predictions by MLP networks and the true values of the daily production under three constraints.
Figure 3. Relative error between the values predicted by MLP networks and true values under different production times.
Figure 4. Comparison of the true values and values predicted by LSTM networks for validation data of daily shale gas production and the relative error between the true values and the predictions.
Figure 5. Comparison of the values predicted by the LSTM networks and the true values of the daily shale gas production under three constraint conditions.
Figure 6. Relative error between the values predicted by the LSTM networks and the true values under different production times.
Figure 7. LSTM and MLP predictions and actual values of the daily shale gas production under two constraint conditions.
Figure 8. Relative error between the predictive value and true value of the MLP and LSTM networks.
Figure 9. Relative error between the values predicted by the MLP networks (left) and LSTM networks (right) and true values under early production time.
Table 1. The parameters for training data.

Parameter                      Values
Fracture cluster               3, 5, 7, 9
Half-length of fracture (m)    60, 80, 100, 120
Fracture conductivity (mD·m)   100, 200, 300, 400
Permeability (nD)              100, 200, 300, 400
Table 2. Dataset of shale gas production.

No.       Cluster   Half-Length of Fracture (m)   Fracture Conductivity (mD·m)   Permeability (nD)   Production Time (d)   Daily Production (m³/d)
1         9         120                           400                            400                 1                     106,485.00
2         9         120                           400                            400                 2                     89,001.90
3         9         120                           400                            400                 3                     79,231.30
4         9         120                           400                            400                 4                     72,545.10
5         9         120                           400                            400                 5                     67,481.10
…         …         …                             …                              …                   …                     …
255,996   3         60                            100                            100                 996                   289.90
255,997   3         60                            100                            100                 997                   289.75
255,998   3         60                            100                            100                 998                   289.60
255,999   3         60                            100                            100                 999                   289.43
256,000   3         60                            100                            100                 1000                  289.28
Share and Cite

MDPI and ACS Style

Wang, T.; Wang, Q.; Shi, J.; Zhang, W.; Ren, W.; Wang, H.; Tian, S. Productivity Prediction of Fractured Horizontal Well in Shale Gas Reservoirs with Machine Learning Algorithms. Appl. Sci. 2021, 11, 12064. https://doi.org/10.3390/app112412064
