Article

Enhancing Model Selection by Obtaining Optimal Tuning Parameters in Elastic-Net Quantile Regression, Application to Crude Oil Prices

by Abdullah S. Al-Jawarneh 1, Ahmed R. M. Alsayed 2,3,*, Heba N. Ayyoub 4, Mohd Tahir Ismail 5, Siok Kun Sek 5, Kivanç Halil Ariç 6 and Giancarlo Manzi 2

1 Department of Mathematics, Faculty of Science, Jerash University, Jerash 26150, Jordan
2 Department of Economics, Quantitative Methods, and Data Mining Centre, University of Milan, 20122 Milan, Italy
3 Department of Economics, University of Bergamo, 24127 Bergamo, Italy
4 Department of Mathematics, Faculty of Science, Philadelphia University, Amman 19392, Jordan
5 School of Mathematical Sciences, Universiti Sains Malaysia, Gelugor 11700, Malaysia
6 Faculty of Economics and Administrative Sciences, Sivas Cumhuriyet University, 58070 Sivas, Turkey
* Author to whom correspondence should be addressed.
J. Risk Financial Manag. 2024, 17(8), 323; https://doi.org/10.3390/jrfm17080323
Submission received: 30 May 2024 / Revised: 12 July 2024 / Accepted: 22 July 2024 / Published: 26 July 2024
(This article belongs to the Special Issue Featured Papers in Mathematics and Finance)

Abstract: Recently, there has been an increased focus on enhancing the accuracy of machine learning techniques. However, accuracy can still be improved by selecting optimal tuning parameters, especially when the data exhibit heterogeneity and multicollinearity. This study therefore proposes a statistical model to assess the importance of changes in crude oil prices in the European Union, where prices must respond to state-of-the-art economic, political, environmental, and social challenges. The proposed model, Elastic-net quantile regression, provides more accurate estimates in the presence of multicollinearity, heavy-tailed distributions, and heterogeneity, while selecting the most significant variables. Its performance was verified by several statistical criteria. The main findings of the numerical simulation and the real-data application confirm the superiority of the proposed Elastic-net quantile regression at the optimal tuning parameters, as it provides significant information for detecting changes in oil prices. Among the selected significant variables, the exchange rate has the strongest influence on oil price changes at high frequencies, followed by retail trade, interest rates, and the consumer price index. This research matters to policymakers, who can draw on these findings when developing energy policies and planning decisions.

1. Introduction

In most research fields with large time series datasets, such as environmental science, medicine, and marketing, the data carry important information, and appropriate tools are needed to turn that information into sound decisions. To extract more information from these data, such as patterns and trends, advanced machine learning (ML) has been used. ML methods fall into two classes, supervised and unsupervised. Supervised ML algorithms build mathematical models to predict future outcomes; one of the main applications of supervised ML is regression analysis (Kassambara 2018; Ray 2019).
Regression analysis faces several challenges that affect prediction accuracy. For example, heterogeneity and multicollinearity may exist among the predictor variables, making the model difficult to interpret (Qin et al. 2016; Alsayed et al. 2018; Al-Jawarneh et al. 2022). Many researchers have developed hybrid regression models to address these issues by improving on the ordinary least squares (OLS) method with penalized regularization methods, namely Ridge regression (RR) and the least absolute shrinkage and selection operator (LASSO). However, RR cannot reduce the number of predictors, so unnecessary predictor variables remain in the final model (Tibshirani 1996; Zou and Hastie 2005), while LASSO is inconsistent for variable selection and for dealing with multicollinearity (Fan and Li 2001; Zou and Hastie 2005). The Elastic-net (ELNET) method (Zou and Hastie 2005) was proposed as a penalized regularization method that improves model interpretability and identifies relevant variables, noting that the procedures with the initial coefficient estimator used to compute the adaptive weights need not be consistent. In addition, quantile regression (QR) seeks a model that minimizes the sum of absolute residuals rather than the sum of squared residuals, and it measures the effects of unobserved heterogeneity in the included variables. If the distribution of the dependent variable changes with the independent variables, OLS regression gives misleading results, whereas QR shows how such changes in the independent variables affect the shape of the distribution of the dependent variable. QR therefore provides meaningful estimates when the distribution of the dependent variable is heterogeneous (Alsayed et al. 2020).
Moreover, one study proposed penalized LASSO quantile regression, which uses the sum of the absolute values of the coefficients as the penalty (Li and Zhu 2008). More recently, an Elastic-net penalized quantile regression model was proposed that combines the strengths of the quantile loss and the Elastic net (Su and Wang 2021).
In penalized regression, the tuning parameters play a critical role in calibrating the penalty to achieve optimal estimation and consistent selection (Xiao and Sun 2019), since they control the rate of coefficient shrinkage. For instance, if λ is too large, the coefficients are shrunk to small values or exactly to zero, and the model under-fits (high bias and low variance). As the tuning parameter value increases, the bias increases and the variance decreases, and vice versa. In addition, the tuning parameter alpha (α) in the Elastic-net method takes a value between zero and one; several studies simply fix this value at α = 0.5 or 0.75. Choosing tuning parameter values is therefore a delicate and highly sensitive task (Fan and Tang 2013; Desboulets 2018).
To choose the tuning parameter, the literature recommends several frequently used methods. These include minimizing an information criterion (IC), namely the Akaike information criterion (AIC) (Akaike et al. 1973), the Bayes information criterion (BIC) (Schwarz 1978), or Mallows' Cp (Efron et al. 2004), and cross-validation (CV) (Stone 1974). CV is the simplest and most commonly used method in the literature for estimating and choosing the tuning parameter that minimizes the CV sum of squared residuals (Chand 2012; Desboulets 2018): the method evaluates a grid of λ values, computes the CV error for each λ, and chooses the optimal λ as the one with the smallest CV error (Gareth et al. 2013).
On the other hand, predicting and forecasting the energy market is essential for reaching an optimal balance between energy, economics, and environmental quality. Crude oil is an ingredient of sustainable economic growth; its supply and demand are inelastic, and crude oil prices often experience sharp and sustained fluctuations (Alsayed and Manzi 2019). Therefore, this study examines the reaction of crude oil prices during the recent period, which saw several economic shocks, particularly in the European Union: the global economic crisis in 2008, COVID-19 in 2020, and the recent wars in 2022–2024, all of which have challenged the global economy. Several studies have examined crude oil prices together with various global and local factors (Aastveit et al. 2023; He et al. 2021; Kartal 2020; Baumeister and Kilian 2015; Doğrul and Soytas 2010; Amano and Van Norden 1998).
The contribution of this research is twofold. First, on the statistical side, we deal with heterogeneity and improve the accuracy of model selection by selecting the predictors that most affect the response variable: ELNET.QR regression at τ = 0.25, 0.5, 0.75 is used, with the D-fold CV method selecting the optimal tuning parameters (α_opt and λ_opt), and is then evaluated against recently developed methods in both simulations and a real application. Second, regarding the econometric novelty, we model and predict the crude oil price using local and global variables, namely the exchange rate, retail trade, interest rates, and the consumer price index, to tackle time series data that suffer from heterogeneity.
The advantage of this approach is its superior ability, compared to older models, to handle time series issues and to keep pace with developments in time series analysis. Its disadvantage is that it does not possess the oracle property, although this can be addressed in future work using the adaptive Elastic-net method.
This research is organized as follows: Section 1 provides the introduction and literature review. Section 2 presents the methods: quantile regression, Elastic-net regression, D-fold cross-validation, and the proposed method. Section 3 explains the data and variables. Section 4 presents the empirical findings and discussion, and Section 5 concludes.

2. Methodology

This section briefly describes the applied methods. The first is the QR method, which deals with heterogeneity problems. The second is the penalized regularization approach, comprising the Elastic-net (ELNET) method and D-fold cross-validation. Finally, this section presents the proposed method, ELNET.QR α_opt regression.

2.1. Quantile Regression

QR regression is broadly applied across a wide range of research areas. Koenker suggested a general approach to QR for longitudinal data (Koenker 2004). QR estimates the conditional median, or any other quantile, of the dependent variable given the predictor variables, and it can tackle unobserved heterogeneity effects. QR describes this relationship at different points of the conditional quantile distribution of the dependent variable, Q_{y|X}(τ), where the quantile (or percentile) τ takes values in 0 < τ < 1 (Ambark et al. 2023). The multiple linear regression model has the structure
y = Xβ_τ + ε
where y (n × 1) is the vector of the response variable, X (n × p) is the matrix of predictor variables, β_τ (p × 1) is the unknown vector of regression coefficients associated with the τ-th quantile, and ε (n × 1) is the vector of random errors, assumed normally distributed with zero mean, E(ε) = 0, and variance Var(ε) = σ² I_n.
The linear quantile regression model assumes:
Q_{y|X}(τ) = Xβ_τ;  β_τ = {β_{0τ}, β_{1τ}, …, β_{pτ}}
where β_τ is the vector of quantile coefficients. The τ-th quantile regression estimator then minimizes the following objective function (Davino et al. 2013):
β̂_τ = argmin_β Σ_{i=1}^{n} ρ_τ(y_i − x_i^T β)
where x_i^T is the i-th row of X and ρ_τ(v) is a loss function defined as follows:
ρ_τ(v) = v(τ − I{v < 0});  0 < τ < 1 and v ∈ ℝ
To combine quantile regression with regularization, Koenker suggested a penalized version, as follows:
β̂_τ = argmin_β Σ_{i=1}^{n} ρ_τ(y_i − x_i^T β) + P_λ(β)
where P_λ(β) is the penalty function and λ > 0 is the tuning parameter.
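The check (pinball) loss ρ_τ(v) = v(τ − I{v < 0}) above is simple to evaluate directly. The following is an illustrative Python/NumPy sketch, not the authors' implementation (their analysis is in R):

```python
import numpy as np

def quantile_loss(v, tau):
    """Pinball (check) loss rho_tau(v) = v * (tau - I{v < 0}).

    Positive residuals are weighted by tau, negative residuals by (tau - 1),
    so tau = 0.5 recovers half the absolute-error loss.
    """
    v = np.asarray(v, dtype=float)
    return v * (tau - (v < 0).astype(float))
```

For example, an over-prediction of 2 at τ = 0.25 costs 0.5, while an under-prediction of the same size costs 1.5, reflecting the asymmetric weighting of the quantile objective.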

2.2. Elastic Net Regression

ELNET regression was proposed to address the limitations of LASSO and to improve model interpretability and prediction accuracy by combining two penalties, namely the L1 penalty (LASSO) and the L2 penalty (RR) (Zou and Hastie 2005; Zou and Zhang 2009; Friedman et al. 2010; Lee et al. 2016). The ELNET estimator is given as follows:
β̂_ELNET = argmin_β Σ_{i=1}^{n} (y_i − x_i^T β)² + λ_1‖β‖_1 + λ_2‖β‖_2²
where ‖β‖_1 = Σ_{j=1}^{p} |β_j| is the L1-norm of β, ‖β‖_2² = Σ_{j=1}^{p} β_j² is the squared L2-norm of β, and λ_1, λ_2 > 0 are the tuning parameters, which control the strength of shrinkage of the predictor variables. The values of λ_1 and λ_2 depend on the dataset and are automatically selected using CV (Zou and Hastie 2005; Melkumova and Shatskikh 2017; Masselot et al. 2018; Al-Jawarneh and Ismail 2024). The best values of λ_1 and λ_2 can be defined as those minimizing the mean squared error (MSE) (Friedman et al. 2010; Lee et al. 2016).
By setting λ_1 = 2nλα and λ_2 = nλ(1 − α), Equation (7) becomes equivalent to the following (Haws et al. 2015; Al-Jawarneh et al. 2022):
β̂_ELNET = argmin_β (1/2n) Σ_{i=1}^{n} (y_i − x_i^T β)² + λ(α‖β‖_1 + ((1 − α)/2)‖β‖_2²)
where α is a regularization parameter between zero and one. The ELNET estimator reduces to the RR estimator when α = 0 and to the LASSO estimator when α = 1.
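To illustrate how α interpolates between the Ridge and LASSO penalties, the following Python sketch (for exposition only, not the authors' code) evaluates the combined penalty λ(α‖β‖_1 + ((1 − α)/2)‖β‖_2²):

```python
import numpy as np

def elastic_net_penalty(beta, lam, alpha):
    """Elastic-net penalty: lam * (alpha * ||beta||_1 + (1 - alpha)/2 * ||beta||_2^2).

    alpha = 1 gives the pure LASSO (L1) penalty; alpha = 0 gives the pure
    Ridge (L2) penalty; values in between mix the two.
    """
    beta = np.asarray(beta, dtype=float)
    l1 = np.abs(beta).sum()          # L1-norm of beta
    l2 = (beta ** 2).sum()           # squared L2-norm of beta
    return lam * (alpha * l1 + (1.0 - alpha) / 2.0 * l2)
```

For β = (1, −2) and λ = 1, the penalty is 3 at α = 1 (pure L1) and 2.5 at α = 0 (pure L2/2), showing the smooth trade-off the optimal α exploits.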

2.3. D-Fold Cross-Validation

The D-fold cross-validation (D-CV) method was proposed by Geisser (1975). The idea of D-CV is to split the dataset into D folds of nearly equal size. Then D − 1 folds are used as the training set for estimating the model, while the d-th fold is used as a test set to assess the model's predictive performance. This process repeats until every one of the D folds has served as the test set, and the mean prediction error over all folds is then calculated (van Houwelingen and Sauerbrei 2013; Gareth et al. 2013). The D-CV procedure is given in Algorithm 1 (Hastie et al. 2009; Melkumova and Shatskikh 2017; Hastie et al. 2015) and is repeated for each fold as shown in Figure 1.
Algorithm 1: D-fold Cross-Validation
1. Randomly split the whole dataset of size n into D folds of roughly equal size.
2. For each λ_s in a grid of S values, s = 1, …, S:
  • For d = 1, …, D:
    • Treat the remaining D − 1 folds as the training set and the d-th fold as the test set.
    • On the training set, use an estimation method to estimate the regression coefficients at the value λ_s, yielding the fitted function f̂_{d,λ_s}(z).
    • Calculate the prediction error (PE) on the test set:
      PE_{d,λ_s} = Σ_{i=1}^{n_d} (v_i − f̂_{d,λ_s}(z_i))²
  • For each λ_s, calculate the average of the D prediction errors over all folds:
    CV_{λ_s} = (1/D) Σ_{d=1}^{D} PE_{d,λ_s}
3. Choose the optimal λ that gives the minimum average CV error:
  λ_opt = argmin_{s=1,…,S} CV_{λ_s}
End.
Figure 1 describes the D-CV process, where the dataset is split into D folds. Each row represents one iteration and each column one fold. In the first iteration, the 1st fold is the test set and the remaining D − 1 folds form the training set; in the second iteration, the 2nd fold is the test set and the other folds form the training set; and so on, until every one of the D folds has served as the test set.
The size of each training set equals (D − 1)n/D observations. Increasing D decreases the bias of the fitted model but increases the variance, as well as the correlation among the fitted models, because of the overlap among the training sets (Gareth et al. 2013). Usually, D is chosen as 5 or 10; these values yield estimates that achieve an intermediate level of bias, suffering neither excessive bias nor high variance. Thus, D = 5 or 10 balances the bias-variance trade-off (Gareth et al. 2013; Kuhn and Johnson 2013; Al-Jawarneh and Ismail 2024).
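The D-CV procedure in Algorithm 1 can be sketched in a few lines of Python/NumPy. For concreteness, the sketch uses a closed-form ridge fit as a stand-in penalized estimator (an assumption for illustration; the paper's estimators are quantile-loss based) and squared prediction error on the held-out fold:

```python
import numpy as np

def ridge_fit(X, y, lam):
    # Closed-form ridge estimate, used here as a stand-in penalized estimator.
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def d_fold_cv(X, y, lambdas, D=5, rng=None):
    """Return (lambda_opt, cv_errors): the lambda with the smallest average
    D-fold prediction error, plus the CV error for each candidate lambda."""
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    folds = np.array_split(rng.permutation(n), D)  # D folds of roughly equal size
    cv_err = []
    for lam in lambdas:
        pe = []
        for d in range(D):
            test = folds[d]
            train = np.concatenate([folds[j] for j in range(D) if j != d])
            beta = ridge_fit(X[train], y[train], lam)          # fit on D-1 folds
            pe.append(np.mean((y[test] - X[test] @ beta) ** 2))  # PE on d-th fold
        cv_err.append(np.mean(pe))                               # CV_{lambda_s}
    return lambdas[int(np.argmin(cv_err))], np.asarray(cv_err)
```

On nearly noiseless simulated data the smallest candidate λ wins, as expected, since shrinkage only adds bias when the signal dominates.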

2.4. Proposed Penalized Quantile Regression Method

The Elastic-net quantile regression method based on D-CV (ELNET.QR) is presented to assess the significance of the predictor variables for the response variable and to reduce the prediction error of the final model through the optimal tuning parameters, as follows:
  • Apply the QR method at τ = (0.25, 0.50, 0.75) using all the variables:
    β̂_τ^QR = argmin_β Σ_{i=1}^{n} ρ_τ(y_i − x_i^T β)
  • Using the training set only, select the optimal parameters via the D-CV method with D = 10:
    • The regularization parameter α_opt from a sequence 0 < α < 1, where α represents the relative contribution of the L1 penalty versus the L2 penalty:
      α_opt = argmin_{k=1,…,K} CV_{α_k};  CV_{α_k} = (1/10) Σ_{d=1}^{10} PE_{α_k},  α_k ∈ (0, 1)
      where K is the number of candidate α values between zero and one; in this study, K = 50.
    • The tuning parameter λ_opt at α_opt:
      λ_opt = argmin_{s=1,…,S} CV_{α_opt,λ_s};  CV_{α_opt,λ_s} = (1/10) Σ_{d=1}^{10} MSE_{α_opt,λ_s}
  • Based on Equations (7) and (10), at α_opt and λ_opt, fit the ELNET penalized quantile regression:
    β̂_τ^{ELNET.QR} = argmin_β Σ_{i=1}^{n} ρ_τ(y_i − x_i^T β) + λ_opt(α_opt‖β‖_1 + ((1 − α_opt)/2)‖β‖_2²);
    ρ_τ(v) = v(τ − I{v < 0})
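The tuning-parameter search above amounts to minimizing a CV error surface over candidate (α, λ) pairs. The following Python sketch simplifies the paper's two-stage procedure into a joint grid search; `cv_error_fn` is assumed to be any user-supplied function returning the D-fold CV error for a given pair (it is a placeholder, not part of the paper's code):

```python
def select_alpha_lambda(cv_error_fn, alphas, lambdas):
    """Pick (alpha_opt, lambda_opt) minimizing a CV error surface.

    cv_error_fn(alpha, lam) -> float is assumed to compute the D-fold CV
    error of the penalized quantile regression at that (alpha, lam) pair.
    """
    best = None
    for a in alphas:
        for lam in lambdas:
            err = cv_error_fn(a, lam)
            if best is None or err < best[0]:
                best = (err, a, lam)   # keep the smallest CV error seen so far
    return best[1], best[2]
```

In the paper's sequential variant, the outer loop over α (K = 50 values) is run first, and the λ grid is then searched only at α_opt, which is cheaper than the full joint grid.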
Finally, the proposed method was compared with traditional methods. The performance of the proposed estimation method was tested using several well-known criteria, namely the residual sum of squares (RSS; Equation (14)), root mean square error (RMSE; Equation (15)), mean absolute error (MAE; Equation (16)), mean absolute percentage error (MAPE; Equation (17)), and mean absolute scaled error (MASE; Equation (18)).
RSS = Σ_{i=1}^{n} (y_i − ŷ_i)²
RMSE = √((1/n) Σ_{l=1}^{n} (y_l − ŷ_l)²)
MAE = (1/n) Σ_{l=1}^{n} |y_l − ŷ_l|
MAPE = (100%/n) Σ_{i=1}^{n} |(y_i − ŷ_i)/y_i|
MASE = [(1/n) Σ_{l=1}^{n} |y_l − ŷ_l|] / [(1/(n − 1)) Σ_{l=2}^{n} |y_l − y_{l−1}|]
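The five criteria in Equations (14)–(18) translate directly into code. A Python/NumPy sketch (illustrative; the authors' computations are in R):

```python
import numpy as np

def rss(y, yhat):
    """Residual sum of squares."""
    return float(np.sum((y - yhat) ** 2))

def rmse(y, yhat):
    """Root mean square error."""
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

def mae(y, yhat):
    """Mean absolute error."""
    return float(np.mean(np.abs(y - yhat)))

def mape(y, yhat):
    """Mean absolute percentage error (in percent); requires y != 0."""
    return float(100.0 * np.mean(np.abs((y - yhat) / y)))

def mase(y, yhat):
    """Mean absolute scaled error: MAE scaled by the mean absolute
    one-step naive-forecast error of the series itself."""
    scale = np.mean(np.abs(np.diff(y)))
    return float(np.mean(np.abs(y - yhat)) / scale)
```

Because MASE scales by the in-sample naive-forecast error, a value below 1 means the model beats the naive one-step forecast on average, which makes it a natural complement to the scale-dependent RMSE and MAE.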

3. Application

This section implements a numerical simulation experiment and a real-data application to show the capacity of the proposed methods. The analyses are performed in the open-source R 4.3.1 software using the hqreg package together with our own code for obtaining the optimal tuning parameter values for ELNET.QR regression.

3.1. Simulation Study

In this section, we present the results of the numerical simulation for eight methods: RR.QR; LASSO.QR; ELNET.QR at the optimal α_opt value; ELNET.QR at α = 0.25, α = 0.5, and α = 0.75; and the AdLASSO.QR method based on the minimum MSE (λ_min) and on the minimum MSE within one standard error (λ_1se) of weighted RR. We evaluate and illustrate the variable selection and prediction performance of these eight methods under a normal distribution. The simulation scenarios considered three QR levels, τ = 0.25, 0.5, 0.75, a sample size of n = 150, and 1000 iterations. 10-CV was applied to select the best tuning parameter values. The simulated data are split into two parts, 70% for training and 30% for testing, and the estimated models are then evaluated using the performance criteria.

3.2. Application Datasets

The European Union (EU) is an economic and political union of 27 countries. It operates an internal (or single) market, which allows the free movement of goods, capital, services, and people between member states. In recent years, the European Union has made significant, visible progress in implementing new energy consumption policies to shift toward low carbon emissions, while crude oil market prices have experienced important changes and are more volatile than the prices of other tradable commodities, with negative effects on investment.
The dataset of energy oil prices with the relevant influencing factors is included to evaluate the performance of the penalized regression methods. The response variable is the crude oil price (y), measured in local currency per barrel, while the predictor variables are the consumer price index, retail trade, the foreign exchange rate (Euro/USD), and interest rates, denoted x_1, …, x_4, respectively. The data are collected monthly from the beginning of 2000 to the end of 2022 so as to include the most affected periods, such as the financial crisis in 2008, COVID-19 in 2020, and the Russia-Ukraine war in 2022. The data for all variables are gathered from Bloomberg (2023). Several variables could be used to examine oil price changes, but we included those that are available at this frequency and that, according to economic theory, policy, and the literature, affect changes in the oil price.
Recent literature has included these variables as independent variables to measure their effects on crude oil prices (Kirikkaleli and Doğan 2021; Yilmaz and Altay 2016; Alsayed 2023; Amano and Van Norden 1998), finding that they significantly affect oil prices at various levels. Other studies have used advanced statistical methods, such as a multivariate adaptive regression splines model, to detect the effect of foreign exchange (USD-TRY), the credit default swap spread, global uncertainty, and global volatility on local-currency oil prices for the local economy of Turkey during the COVID-19 pandemic, using daily data from July 2019 to October 2020 (Kartal 2020). Doğrul and Soytas (2010) examined the relationship between oil prices, interest rates, economic activity, and unemployment in Turkey by applying the Toda–Yamamoto technique. Their findings support the volatility index as the most important factor influencing crude oil prices.
Descriptive statistics of our dataset are presented in Figure 2a–c. Crude oil prices faced three serious shocks: an exponential increase in 2008, the spread of the COVID-19 pandemic in 2020, and the recent oil price slump in 2022, which substantially raised economic uncertainty and geopolitical risk levels. The combination of these economic shocks is likely to initiate a long-term economic downturn and drive the European Union economy into the next recession.
The estimated model consists of four independent variables, x_1, …, x_4, and their interaction terms, x_5, …, x_10. The dataset is divided into two parts: 70% (192 cases) is used for training and the remaining 30% (83 cases) for testing. The whole dataset was made stationary by taking first differences and was then standardized before the analysis. The estimated model of interest is as follows.
y_t = α_t + β_{1t}x_1 + β_{2t}x_2 + β_{3t}x_3 + β_{4t}x_4 + β_{5t}x_5 + β_{6t}x_6 + β_{7t}x_7 + β_{8t}x_8 + β_{9t}x_9 + β_{10t}x_10 + ε_t
where y_t represents the oil price in Europe at time t; x_1 is the consumer price index; x_2 is retail trade; x_3 is the exchange rate; x_4 is interest rates; x_5 = x_1x_2, x_6 = x_1x_3, x_7 = x_1x_4, x_8 = x_2x_3, x_9 = x_2x_4, x_10 = x_3x_4; and ε_t is the error term.
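The interaction terms x_5, …, x_10 are all pairwise products of the four predictors. A small Python/NumPy sketch (illustrative only, not the authors' preprocessing code) that appends them to a design matrix:

```python
import numpy as np

def with_interactions(X):
    """Append all pairwise products x_i * x_j (i < j) to the design matrix.

    For p = 4 predictors this yields the six interaction columns
    x1*x2, x1*x3, x1*x4, x2*x3, x2*x4, x3*x4 (i.e., x5..x10 in the model).
    """
    n, p = X.shape
    cols = [X]
    for i in range(p):
        for j in range(i + 1, p):
            cols.append((X[:, i] * X[:, j])[:, None])
    return np.hstack(cols)
```

With four predictors the resulting matrix has 4 + C(4,2) = 10 columns, matching the model specification above.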

4. Results and Discussion

In this section, we provide the results of the numerical experiment and of the application based on the real dataset.

4.1. Simulation Results

Table 1 reports the average performance criteria, RSS, RMSE, MAE, MASE, and MAPE, for all regression methods used in this study. At τ = (0.25, 0.50, 0.75), the results show that the proposed regression method ELNET.QR α_opt, λ_min (determined by finding the optimal α value) has the smallest error on all of these criteria. Therefore, ELNET.QR α_opt, λ_min improves prediction accuracy by producing the smallest error values in terms of RSS, RMSE, MAE, MASE, and MAPE.

4.2. Application Results and Discussion

Figure 3 shows the RSS curve used to choose the optimal alpha (α_opt) for the three cases τ = 0.25, 0.50, 0.75. The y-axis represents the estimated RSS and the x-axis the alpha values. At τ = 0.25, the minimum RSS occurs at α_opt = 0.38, whereas at τ = 0.50 and 0.75 the minimum RSS occurs at α_opt = 0.02 in both cases. These results indicate that the optimal alpha reduces the RSS more than traditional approaches that fix alpha at points such as α = 0.25, 0.5, 0.75, or than the LASSO and Ridge methods.
Figure 4 shows the 10-CV estimate plots of ELNET.QR α_opt (α_opt = 0.38, 0.02, and 0.02) at τ = 0.25, 0.50, 0.75, respectively. In each plot, the red dotted line is the mean squared error (MSE) curve, with one-standard-error bands shown by the error bars. The y-axis represents the MSE and the x-axis the log(λ) values. The numbers along the top of each plot give the number of nonzero regression coefficients in the model at each log(λ) value. The first vertical dotted line from the right marks the point selected by the minimum-MSE (λ_min) rule, while the second marks the point selected by the minimum MSE plus one standard error (λ_1se) rule; these two lines indicate the numbers of nonzero regression coefficients selected under the λ_1se and λ_min rules. Increasing λ decreases the number of nonzero coefficients in the model. The selection of λ is therefore based on the optimal minimum MSE value.
Figure 5 illustrates the relationship between log(λ) and the selected nonzero coefficient estimates in ELNET.QR α_opt (0.38, 0.02, and 0.02) at the current λ, which represents the actual degrees of freedom. All methods perform regularization and variable selection. In each panel, from right to left, the coefficient estimates shrink toward zero as λ increases, with unnecessary coefficient estimates forced to exactly zero (i.e., as λ → ∞, the estimated coefficients → 0). For instance, at τ = 0.25, the ELNET.QR method with α_opt = 0.38 selected eight nonzero coefficients at λ_min and five nonzero coefficients at λ_1se, with different significance strengths.
Table 2 displays the RSS values and the number of selected variables (Num. of V.S.) of the proposed method compared with the previous methods on the testing datasets. Based on the RSS values, the proposed method ranks first among all methods used in this study: ELNET.QR α_opt = 0.38 at τ = 0.25 (λ_min = 0.0787; RSS = 84.426; Num. of V.S. = 8) and at τ = 0.5 (λ_min = 0.3392; RSS = 72.9008; Num. of V.S. = 10); at τ = 0.75 (λ_min = 0.2555; RSS = 97.6474; Num. of V.S. = 9), it also has the smallest RSS value.
Table 3 shows the prediction accuracy criteria, RMSE, MAE, MAPE, and MASE, for comparing the penalized QR regression methods. In all three cases, τ = 0.25, 0.5, 0.75, the proposed ELNET.QR α_opt method provides the smallest error values in terms of RMSE, MAE, MASE, and MAPE. For instance, at τ = 0.25 with λ_min = 0.0787 it ranks first with the smallest error value, and the same holds for τ = 0.5 at λ_min = 0.3392 and τ = 0.75 at λ_min = 0.2555. However, at τ = 0.25 and 0.5 it ranks differently in terms of MAPE, and at τ = 0.75 it ranks second in terms of MAE. Overall, ELNET.QR α_opt improves prediction accuracy by producing the smallest error values in terms of RSS, RMSE, MAE, and MAPE.
Generally, in this application, the proposed ELNET.QR α_opt at τ = 0.25, 0.50, 0.75 and λ_min performs best in all three cases of τ. Moreover, it shows that these predictors have great significance for the response variable. The ELNET.QR α_opt method proved the best at reducing the number of components and selecting predictor variables with high prediction accuracy. It deals with multicollinearity in all three cases of τ by retaining some of the variables and forcing the others to zero in the final model, whereas the RR.QR method, which retains all the predictor variables, loses reliability and accuracy in selection. The remaining methods, LASSO.QR and AdLASSO.QR, also deal with multicollinearity.
Based on the findings in the previous section, we rely on the ELNET.QR α_opt estimated coefficients to interpret the oil price model, as shown in Table 4, since this method is the most consistent in terms of RMSE, MAE, MAPE, and MASE. These results imply that changes in oil prices can be explained by the variables included in the analysis, with different significance strengths; in particular, the exchange rate (x_3) has the strongest effect on oil prices. As expected, there is a positive relationship between crude oil prices and the exchange rate, retail trade, interest rates, and the consumer price index; in other words, crude oil prices increase when the exchange rate and the other variables increase. In addition, the results reveal that the interaction terms play a very important role and have a mixed effect on oil price changes. These results are consistent with the literature. Oil prices in Europe trended upward over several periods, but were comparatively low from the COVID-19 pandemic until the start of the Russia-Ukraine war, and have been quite high since the war began (Krozer 2013; Borowski 2020; Balashova and Serletis 2021).

5. Conclusions

In this study, we proposed ELNET.QR based on selecting the optimal alpha value α_opt using cross-validation. The method identifies the relationship between the predictor variables and the response variable, improves the accuracy of model selection, and deals with heavy-tailed distributions, heterogeneity, and multicollinearity among the predictor variables by determining the optimal alpha value. Numerical experiments and a real time-series dataset were analysed. The results showed that the ELNET.QR α_opt method effectively selected the predictor variables that were most significant for the response variable, with reduced prediction error, at τ = 0.25, 0.5, 0.75, and that it selected the best-fitting model with high prediction accuracy compared to the other methods. The results also showed that not every alpha value can adequately represent the Elastic net; cross-validation is therefore the best way to choose the alpha value and use it as the optimal value for building the final model.
This method offers additional insights into the behaviour of crude oil prices, providing evidence of how oil prices were changed by the exchange rate, retail trade, interest rates, and the consumer price index during the economic shocks in the European Union; the findings reveal that the exchange rate has the strongest effect on crude oil price changes. In conclusion, this research indicates that the selected penalized QR method is well suited for modelling crude oil prices while considering the dynamic effects of the factors that influence the local currency and the global economy in the European Union. In light of these results, the European Union should stabilize the exchange rate so that the effect of these variables on oil prices is less pronounced. The main reason behind this recommendation is the focus on local factors, which are mostly or partially under the control of European policy.

Author Contributions

Conceptualization, Writing, original draft preparation, methodology, software, formal analysis: A.S.A.-J. and A.R.M.A. Methodology, review and editing H.N.A. Supervision: M.T.I., S.K.S., K.H.A. and G.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data used in this study are available from the Bloomberg website: https://www.bloomberg.com/professional/solution/bloomberg-terminal, accessed on 29 May 2024.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. Diagram of D-fold cross-validation (D-CV).
Figure 2. (a) The monthly crude oil prices in the European Union. (b) The consumer price index and retail trade in the European Union. (c) The monthly exchange rate and interest rates in the European Union.
Figure 3. 10-CV estimation for choosing the α_opt of ELNET.QR (τ = 0.25, 0.50, 0.75).
Figure 4. 10-CV estimation of the ELNET.QR α_opt at τ = 0.25, 0.50, 0.75.
Figure 5. Coefficient estimation of the ELNET.QR α_opt.
Table 1. Performance criteria of the simulation scenarios.

| Method | λ | RSS | RMSE | MAE | MAPE | MASE |
|---|---|---|---|---|---|---|
| **τ = 0.25** |  |  |  |  |  |  |
| RR.QR | λ_min | 35.5827 | 0.88624 | 0.71977 | 3.657265 | 0.794615 |
|  | λ_1se | 39.4718 | 0.93395 | 0.76183 | 4.024955 | 0.841051 |
| LASSO.QR | λ_min | 14.1830 | 0.56093 | 0.40053 | 1.746325 | 0.442176 |
|  | λ_1se | 15.3082 | 0.58278 | 0.43724 | 1.999109 | 0.482706 |
| AdLASSO.QR (RR.W. λ_min) | λ_min | 14.5157 | 0.56762 | 0.41579 | 2.082011 | 0.459031 |
|  | λ_1se | 16.0379 | 0.59633 | 0.45133 | 2.030926 | 0.498263 |
| AdLASSO.QR (RR.W. λ_1se) | λ_min | 14.7190 | 0.57034 | 0.41830 | 2.092005 | 0.461806 |
|  | λ_1se | 16.2224 | 0.59878 | 0.45343 | 2.044566 | 0.500584 |
| ELNET.QR α = 0.25 | λ_min | 19.0338 | 0.64701 | 0.50269 | 2.272781 | 0.554958 |
|  | λ_1se | 22.9179 | 0.71031 | 0.56536 | 2.504381 | 0.624143 |
| ELNET.QR α = 0.5 | λ_min | 14.7433 | 0.57177 | 0.41665 | 1.89485 | 0.459972 |
|  | λ_1se | 17.4372 | 0.62082 | 0.47837 | 2.131715 | 0.528107 |
| ELNET.QR α = 0.75 | λ_min | 14.2895 | 0.56306 | 0.40377 | 1.780631 | 0.445752 |
|  | λ_1se | 15.7379 | 0.59067 | 0.44563 | 2.044958 | 0.491972 |
| ELNET.QR α_opt | λ_min | 13.9615 | 0.55671 | 0.39654 | 1.776101 | 0.437778 |
|  | λ_1se | 15.7195 | 0.59049 | 0.44696 | 2.020386 | 0.493431 |
| **τ = 0.5** |  |  |  |  |  |  |
| RR.QR | λ_min | 30.8563 | 0.82567 | 0.69494 | 2.127862 | 0.76720 |
|  | λ_1se | 33.5667 | 0.86207 | 0.72893 | 2.019126 | 0.80472 |
| LASSO.QR | λ_min | 11.7178 | 0.50823 | 0.39718 | 2.592074 | 0.438483 |
|  | λ_1se | 13.6771 | 0.54948 | 0.45072 | 2.456135 | 0.497588 |
| AdLASSO.QR (RR.W. λ_min) | λ_min | 13.385 | 0.53733 | 0.44051 | 2.604391 | 0.486318 |
|  | λ_1se | 14.9137 | 0.56951 | 0.46914 | 2.488478 | 0.517925 |
| AdLASSO.QR (RR.W. λ_1se) | λ_min | 13.4163 | 0.53805 | 0.44168 | 2.602395 | 0.487607 |
|  | λ_1se | 14.9282 | 0.56976 | 0.46944 | 2.48853 | 0.518257 |
| ELNET.QR α = 0.25 | λ_min | 16.1670 | 0.59738 | 0.48819 | 2.668777 | 0.538961 |
|  | λ_1se | 19.3541 | 0.65371 | 0.53790 | 2.602162 | 0.593840 |
| ELNET.QR α = 0.5 | λ_min | 13.1976 | 0.53957 | 0.44388 | 2.665215 | 0.490041 |
|  | λ_1se | 15.5554 | 0.58596 | 0.47915 | 2.659132 | 0.528980 |
| ELNET.QR α = 0.75 | λ_min | 12.3296 | 0.52141 | 0.42639 | 2.639871 | 0.470737 |
|  | λ_1se | 14.3324 | 0.56247 | 0.46268 | 2.544796 | 0.510792 |
| ELNET.QR α_opt | λ_min | 10.3090 | 0.47689 | 0.37236 | 2.564913 | 0.411081 |
|  | λ_1se | 13.6800 | 0.54953 | 0.45087 | 2.515077 | 0.497758 |
| **τ = 0.75** |  |  |  |  |  |  |
| RR.QR | λ_min | 42.7458 | 0.96993 | 0.79680 | 6.288212 | 0.879659 |
|  | λ_1se | 46.8397 | 1.01652 | 0.83617 | 6.47382 | 0.923126 |
| LASSO.QR | λ_min | 12.9671 | 0.53623 | 0.45390 | 4.716719 | 0.501104 |
|  | λ_1se | 14.3718 | 0.5644 | 0.46955 | 4.696969 | 0.518382 |
| AdLASSO.QR (RR.W. λ_min) | λ_min | 14.4291 | 0.55778 | 0.47079 | 4.889332 | 0.519747 |
|  | λ_1se | 15.8043 | 0.58392 | 0.48687 | 4.910642 | 0.537499 |
| AdLASSO.QR (RR.W. λ_1se) | λ_min | 14.5222 | 0.55917 | 0.47169 | 4.892964 | 0.520746 |
|  | λ_1se | 15.8926 | 0.58537 | 0.48803 | 4.916813 | 0.538782 |
| ELNET.QR α = 0.25 | λ_min | 20.6517 | 0.67191 | 0.55133 | 5.073862 | 0.608665 |
|  | λ_1se | 23.6645 | 0.71873 | 0.58854 | 5.305833 | 0.649741 |
| ELNET.QR α = 0.5 | λ_min | 13.7992 | 0.55295 | 0.46196 | 4.620298 | 0.510000 |
|  | λ_1se | 15.1615 | 0.57941 | 0.47896 | 4.653668 | 0.528763 |
| ELNET.QR α = 0.75 | λ_min | 13.0885 | 0.53868 | 0.45467 | 4.68214 | 0.501955 |
|  | λ_1se | 14.5195 | 0.56725 | 0.47102 | 4.675978 | 0.520007 |
| ELNET.QR α_opt | λ_min | 12.8468 | 0.53376 | 0.45244 | 4.714046 | 0.499487 |
|  | λ_1se | 14.3309 | 0.56360 | 0.46864 | 4.683324 | 0.517377 |

Note: The bold number indicates the lowest values in favor of the superior performance method compared to the other methods.
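For reference, the performance criteria reported in the tables (RMSE, MAE, MAPE, MASE) can be computed as sketched below. This follows the common definitions of these measures; the exact conventions used in the study (in particular the MASE scaling by a naive one-step-lag forecast and the percentage base of MAPE) are assumptions, and `performance_criteria` is a hypothetical helper name:

```python
import numpy as np

def performance_criteria(y_true, y_pred):
    """Compute RMSE, MAE, MAPE and MASE for a vector of predictions.

    MASE scales the MAE by the in-sample MAE of a naive one-step-lag
    forecast (assumed convention, not restated in the paper).
    """
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    e = y_true - y_pred
    rmse = np.sqrt(np.mean(e ** 2))
    mae = np.mean(np.abs(e))
    mape = np.mean(np.abs(e / y_true))             # assumes y_true has no zeros
    mase = mae / np.mean(np.abs(np.diff(y_true)))  # naive-forecast scaling
    return rmse, mae, mape, mase
```

Lower values of all four criteria indicate a better-fitting model, which is how the comparisons in Tables 1 and 3 are read.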
Table 2. Selected number of variables and RSS error values.

| Method | λ | RSS | Num. of V.S. | V.S. |
|---|---|---|---|---|
| RR | λ_min = 0.041839 | 109.177 | 10 | x1, …, x10 |
|  | λ_1se = 0.74835 | 77.0141 | 10 | x1, …, x10 |
| LASSO | λ_min = 0.003994 | 120.091 | 9 | x1, …, x4, x6, …, x10 |
|  | λ_1se = 0.078399 | 82.3651 | 7 | x1, …, x4, x6, x8, x10 |
| ELNET α = 0.25 | λ_min = 0.013263 | 116.129 | 10 | x1, …, x10 |
|  | λ_1se = 0.260352 | 76.9288 | 7 | x1, …, x4, x6, x8, x10 |
| ELNET α = 0.5 | λ_min = 0.007987 | 117.951 | 9 | x1, …, x4, x6, …, x10 |
|  | λ_1se = 0.142868 | 79.5524 | 7 | x1, …, x4, x6, x8, x10 |
| ELNET α = 0.75 | λ_min = 0.005844 | 118.623 | 9 | x1, …, x4, x6, …, x10 |
|  | λ_1se = 0.104532 | 80.8656 | 7 | x1, …, x4, x6, x8, x10 |
| **τ = 0.25** |  |  |  |  |
| RR.QR | λ_min = 0.1322 | 103.907 | 10 | x1, …, x10 |
|  | λ_1se = 0.5480 | 92.046 | 10 | x1, …, x10 |
| LASSO.QR | λ_min = 0.0471 | 87.492 | 4 | x1, …, x4 |
|  | λ_1se = 0.0720 | 93.659 | 3 | x1, x2, x4 |
| AdLASSO.QR (RR.W. λ_min) | λ_min = 0.0008 | 100.139 | 4 | x1, x3, x4, x10 |
|  | λ_1se = 0.0028 | 100.797 | 3 | x1, x3, x4 |
| AdLASSO.QR (RR.W. λ_1se) | λ_min = 0.0003 | 96.542 | 4 | x1, …, x4 |
|  | λ_1se = 0.0012 | 99.369 | 3 | x1, x3, x4 |
| ELNET.QR α = 0.25 | λ_min = 0.0911 | 86.377 | 8 | x1, …, x6, x8, x10 |
|  | λ_1se = 0.2002 | 87.614 | 5 | x1, …, x4, x10 |
| ELNET.QR α = 0.5 | λ_min = 0.0317 | 100.762 | 9 | x1, …, x6, x8, x9, x10 |
|  | λ_1se = 0.1275 | 90.6738 | 4 | x1, …, x4 |
| ELNET.QR α = 0.75 | λ_min = 0.054 | 87.045 | 5 | x1, …, x4, x6 |
|  | λ_1se = 0.0931 | 92.967 | 4 | x1, …, x4 |
| ELNET.QR α_opt = 0.38 | λ_min = 0.0787 | 84.426 | 8 | x1, …, x6, x8, x10 |
|  | λ_1se = 0.1532 | 89.458 | 5 | x1, …, x4, x10 |
| **τ = 0.5** |  |  |  |  |
| RR.QR | λ_min = 0.0678 | 79.0412 | 10 | x1, …, x10 |
|  | λ_1se = 0.6563 | 74.8137 | 10 | x1, …, x10 |
| LASSO.QR | λ_min = 0.0228 | 77.2748 | 8 | x1, …, x4, x6, x8, x10 |
|  | λ_1se = 0.0531 | 75.0064 | 5 | x1, …, x4, x8 |
| AdLASSO.QR (RR.W. λ_min) | λ_min = 0.0006 | 79.3498 | 3 | x1, x3, x4 |
|  | λ_1se = 0.0035 | 77.0606 | 2 | x1, x3 |
| AdLASSO.QR (RR.W. λ_1se) | λ_min = 0.0001 | 73.2189 | 5 | x1, …, x4, x10 |
|  | λ_1se = 0.0009 | 80.3982 | 3 | x1, x3, x4 |
| ELNET.QR α = 0.25 | λ_min = 0.0390 | 76.283 | 9 | x1, …, x6, x8, x9, x10 |
|  | λ_1se = 0.1719 | 74.4287 | 8 | x1, …, x4, x6, x8, x10 |
| ELNET.QR α = 0.5 | λ_min = 0.0357 | 77.3792 | 9 | x1, …, x6, x8, x9, x10 |
|  | λ_1se = 0.1000 | 74.9740 | 6 | x1, …, x4, x8, x10 |
| ELNET.QR α = 0.75 | λ_min = 0.0277 | 77.5181 | 8 | x1, …, x4, x6, x8, x9, x10 |
|  | λ_1se = 0.0687 | 74.9286 | 6 | x1, …, x4, x8, x10 |
| ELNET.QR α_opt = 0.02 | λ_min = 0.3392 | 72.9008 | 10 | x1, …, x10 |
|  | λ_1se = 0.7450 | 74.9729 | 9 | x1, …, x4, x6, x7, x10 |
| **τ = 0.75** |  |  |  |  |
| RR.QR | λ_min = 0.0511 | 99.8799 | 10 | x1, …, x10 |
|  | λ_1se = 0.6298 | 99.3733 | 10 | x1, …, x10 |
| LASSO.QR | λ_min = 0.0053 | 108.9124 | 8 | x1, …, x4, x6, x8, …, x10 |
|  | λ_1se = 0.0354 | 101.6289 | 5 | x1, x3, x4, x6, x10 |
| AdLASSO.QR (RR.W. λ_min) | λ_min = 0.0004 | 113.2957 | 5 | x1, x3, x4, x6, x10 |
|  | λ_1se = 0.0011 | 118.7315 | 4 | x1, x3, x6, x10 |
| AdLASSO.QR (RR.W. λ_1se) | λ_min = 0.0001 | 100.7581 | 5 | x1, x3, x4, x6, x10 |
|  | λ_1se = 0.0003 | 105.5192 | 3 | x1, x3, x4 |
| ELNET.QR α = 0.25 | λ_min = 0.0204 | 103.2303 | 8 | x1, x4, x6, x8, x10 |
|  | λ_1se = 0.1334 | 102.9762 | 6 | x1, x3, x4, x6, x8, x10 |
| ELNET.QR α = 0.5 | λ_min = 0.0102 | 106.1962 | 8 | x1, x4, x6, x8, x10 |
|  | λ_1se = 0.0688 | 101.7987 | 5 | x1, x3, x4, x6, x10 |
| ELNET.QR α = 0.75 | λ_min = 0.0068 | 107.9645 | 8 | x1, x4, x6, x8, x10 |
|  | λ_1se = 0.0458 | 101.4287 | 5 | x1, x3, x4, x6, x10 |
| ELNET.QR α_opt = 0.02 | λ_min = 0.2555 | 97.6474 | 9 | x1, …, x6, x8, x10 |
|  | λ_1se = 0.8571 | 105.4114 | 8 | x1, x4, x6, x8, x10 |

Note: The bold number indicates the lowest values in favor of the superior performance method compared to the other methods.
Table 3. Performance criteria.

| Method | λ | RMSE | MAE | MAPE | MASE |
|---|---|---|---|---|---|
| RR | λ_min = 0.041839 | 1.1469 | 0.7806 | 1.7260 | 0.8729 |
|  | λ_1se = 0.74835 | 0.9633 | 0.7105 | 1.2235 | 0.7945 |
| LASSO | λ_min = 0.003994 | 1.2029 | 0.8069 | 1.8265 | 0.9022 |
|  | λ_1se = 0.078399 | 0.9962 | 0.7161 | 1.3324 | 0.8008 |
| ELNET α = 0.25 | λ_min = 0.013263 | 1.1829 | 0.7974 | 1.7923 | 0.8916 |
|  | λ_1se = 0.260352 | 0.9627 | 0.7046 | 1.2343 | 0.7879 |
| ELNET α = 0.5 | λ_min = 0.007987 | 1.1921 | 0.8018 | 1.8080 | 0.8966 |
|  | λ_1se = 0.142868 | 0.9790 | 0.7096 | 1.2885 | 0.7935 |
| ELNET α = 0.75 | λ_min = 0.005844 | 1.1955 | 0.8035 | 1.8136 | 0.8985 |
|  | λ_1se = 0.104532 | 0.9871 | 0.7129 | 1.3112 | 0.7972 |
| **τ = 0.25** |  |  |  |  |  |
| RR.QR | λ_min = 0.1322 | 1.1189 | 0.8876 | 2.537 | 0.9924 |
|  | λ_1se = 0.5480 | 1.0531 | 0.8610 | 2.727 | 0.9628 |
| LASSO.QR | λ_min = 0.0471 | 1.0267 | 0.8217 | 2.514 | 0.9187 |
|  | λ_1se = 0.0720 | 1.0623 | 0.8520 | 2.571 | 0.9527 |
| AdLASSO.QR (RR.W. λ_min) | λ_min = 0.0008 | 1.0984 | 0.8644 | 2.618 | 0.9667 |
|  | λ_1se = 0.0028 | 1.1020 | 0.8742 | 2.639 | 0.9776 |
| AdLASSO.QR (RR.W. λ_1se) | λ_min = 0.0003 | 1.0785 | 0.8527 | 2.704 | 0.9534 |
|  | λ_1se = 0.0012 | 1.0942 | 0.8661 | 2.671 | 0.9685 |
| ELNET.QR α = 0.25 | λ_min = 0.0911 | 1.0201 | 0.8259 | 2.4273 | 0.9236 |
|  | λ_1se = 0.2002 | 1.0274 | 0.8387 | 2.5879 | 0.9378 |
| ELNET.QR α = 0.5 | λ_min = 0.0317 | 1.1018 | 0.8779 | 2.5650 | 0.9817 |
|  | λ_1se = 0.1275 | 1.0452 | 0.8432 | 2.5592 | 0.9429 |
| ELNET.QR α = 0.75 | λ_min = 0.054 | 1.0241 | 0.8195 | 2.5257 | 0.9164 |
|  | λ_1se = 0.0931 | 1.0583 | 0.8500 | 2.5636 | 0.9505 |
| ELNET.QR α_opt = 0.38 | λ_min = 0.0787 | 1.0086 | 0.8180 | 2.444 | 0.9146 |
|  | λ_1se = 0.1532 | 1.0382 | 0.8404 | 2.565 | 0.9398 |
| **τ = 0.5** |  |  |  |  |  |
| RR.QR | λ_min = 0.0678 | 0.9759 | 0.7058 | 1.6486 | 0.7893 |
|  | λ_1se = 0.6563 | 0.9494 | 0.6876 | 1.2275 | 0.7689 |
| LASSO.QR | λ_min = 0.0228 | 0.9649 | 0.7037 | 1.6693 | 0.7869 |
|  | λ_1se = 0.0531 | 0.9506 | 0.6907 | 1.4299 | 0.7724 |
| AdLASSO.QR (RR.W. λ_min) | λ_min = 0.0006 | 0.9778 | 0.7109 | 1.7108 | 0.7949 |
|  | λ_1se = 0.0035 | 0.9636 | 0.7151 | 1.6833 | 0.7996 |
| AdLASSO.QR (RR.W. λ_1se) | λ_min = 0.0001 | 0.9392 | 0.6872 | 1.5984 | 0.7685 |
|  | λ_1se = 0.0009 | 0.9842 | 0.7099 | 1.3913 | 0.7938 |
| ELNET.QR α = 0.25 | λ_min = 0.0390 | 0.9590 | 0.6993 | 1.6403 | 0.7820 |
|  | λ_1se = 0.1719 | 0.9470 | 0.6876 | 1.3001 | 0.7689 |
| ELNET.QR α = 0.5 | λ_min = 0.0357 | 0.9655 | 0.7031 | 1.6494 | 0.7863 |
|  | λ_1se = 0.1000 | 0.9504 | 0.6895 | 1.3276 | 0.7710 |
| ELNET.QR α = 0.75 | λ_min = 0.0277 | 0.9664 | 0.7040 | 1.6612 | 0.7872 |
|  | λ_1se = 0.0687 | 0.9501 | 0.6901 | 1.3927 | 0.7717 |
| ELNET.QR α_opt = 0.02 | λ_min = 0.3392 | 0.9372 | 0.6827 | 1.3390 | 0.7634 |
|  | λ_1se = 0.7450 | 0.9504 | 0.6914 | 1.1955 | 0.7731 |
| **τ = 0.75** |  |  |  |  |  |
| RR.QR | λ_min = 0.0511 | 1.0970 | 0.7913 | 2.9823 | 0.8848 |
|  | λ_1se = 0.6298 | 1.0942 | 0.7899 | 2.9192 | 0.8833 |
| LASSO.QR | λ_min = 0.0053 | 1.1455 | 0.8323 | 3.1377 | 0.9307 |
|  | λ_1se = 0.0354 | 1.1065 | 0.7785 | 2.9182 | 0.8705 |
| AdLASSO.QR (RR.W. λ_min) | λ_min = 0.0004 | 1.1683 | 0.8644 | 3.1510 | 0.9665 |
|  | λ_1se = 0.0011 | 1.1960 | 0.9016 | 3.4228 | 1.0082 |
| AdLASSO.QR (RR.W. λ_1se) | λ_min = 0.0001 | 1.1018 | 0.7837 | 2.909 | 0.8764 |
|  | λ_1se = 0.0003 | 1.1275 | 0.8137 | 3.1028 | 0.9099 |
| ELNET.QR α = 0.25 | λ_min = 0.0204 | 1.1152 | 0.8057 | 3.0121 | 0.9010 |
|  | λ_1se = 0.1334 | 1.1139 | 0.7947 | 2.9758 | 0.8886 |
| ELNET.QR α = 0.5 | λ_min = 0.0102 | 1.1311 | 0.8191 | 3.0654 | 0.9159 |
|  | λ_1se = 0.0688 | 1.1075 | 0.7847 | 2.9390 | 0.8775 |
| ELNET.QR α = 0.75 | λ_min = 0.0068 | 1.1405 | 0.8278 | 3.1141 | 0.9256 |
|  | λ_1se = 0.0458 | 1.1055 | 0.7801 | 2.9212 | 0.8723 |
| ELNET.QR α_opt = 0.02 | λ_min = 0.2555 | 1.0847 | 0.7812 | 2.8845 | 0.8736 |
|  | λ_1se = 0.8571 | 1.1270 | 0.8156 | 3.0404 | 0.9120 |

Note: The bold number indicates the lowest values in favor of the superior performance method compared to the other methods.
Table 4. Coefficients estimation for the predictor variables by the ELNET.QR method (α_opt = 0.38 at τ = 0.25; α_opt = 0.02 at τ = 0.5 and τ = 0.75).

| Coefficient | τ = 0.25, λ_min | τ = 0.25, λ_1se | τ = 0.5, λ_min | τ = 0.5, λ_1se | τ = 0.75, λ_min | τ = 0.75, λ_1se |
|---|---|---|---|---|---|---|
| β̂_1 | 0.1686 | 0.1313 | 0.1545 | 0.0904 | 0.1054 | 0.0548 |
| β̂_2 | 0.1225 | 0.0415 | 0.1066 | 0.0614 | 0.0306 | 0.0184 |
| β̂_3 | 0.3038 | 0.2110 | 0.2072 | 0.1061 | 0.1421 | 0.0703 |
| β̂_4 | 0.1151 | 0.0948 | 0.1360 | 0.0879 | 0.0761 | 0.0487 |
| β̂_5 | 0.0150 | 0 | −0.0172 | 0 | −0.0277 | 0 |
| β̂_6 | 0.0529 | 0 | 0.0413 | 0.0123 | 0.0648 | 0.0158 |
| β̂_7 | 0 | 0 | 0.0033 | 0.0026 | 0 | 0 |
| β̂_8 | 0.0037 | 0 | 0.0448 | 0.0128 | 0.0489 | 0.0093 |
| β̂_9 | 0 | 0 | 0.0133 | 0.0013 | −0.0357 | −0.0048 |
| β̂_10 | 0.0802 | 0.0062 | 0.0862 | 0.0531 | 0.1022 | 0.0425 |
