Econometrics, Volume 4, Issue 3 (September 2016) – 9 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • Papers are published in both HTML and PDF form, with PDF as the official format. To view a paper in PDF, click the "PDF Full-text" link and open it with the free Adobe Reader.
Article
Econometric Information Recovery in Behavioral Networks
by George Judge
Econometrics 2016, 4(3), 38; https://doi.org/10.3390/econometrics4030038 - 14 Sep 2016
Cited by 3 | Viewed by 5944
Abstract
In this paper, we suggest an approach to recovering behavior-related, preference-choice network information from observational data. We model the process as a self-organized, behavior-based random exponential network-graph system. To address the unknown nature of the sampling model in recovering behavior-related network information, we use the Cressie-Read (CR) family of divergence measures and the corresponding information-theoretic entropy basis for estimation, inference, model evaluation, and prediction. Examples are included to clarify how entropy-based information-theoretic methods are directly applicable to recovering the behavioral network probabilities in this fundamentally underdetermined, ill-posed inverse problem.
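The Cressie-Read family mentioned in the abstract is a one-parameter class of power divergences that nests both Kullback-Leibler directions as limiting cases. As a rough, self-contained illustration (the function name, toy probabilities, and uniform reference distribution below are illustrative choices, not the paper's), a minimal sketch in Python:

```python
import numpy as np

def cressie_read(p, q, gamma):
    """Cressie-Read power divergence between probability vectors p and q.

    The limits gamma -> 0 and gamma -> -1 recover the two
    Kullback-Leibler divergences (maximum entropy / empirical likelihood).
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    if np.isclose(gamma, 0.0):          # limit: KL(p || q)
        return np.sum(p * np.log(p / q))
    if np.isclose(gamma, -1.0):         # limit: KL(q || p)
        return np.sum(q * np.log(q / p))
    return np.sum(p * ((p / q) ** gamma - 1.0)) / (gamma * (gamma + 1.0))

# Toy example: divergence of estimated edge probabilities from a uniform prior.
p_hat = np.array([0.5, 0.3, 0.2])   # hypothetical network edge probabilities
q_ref = np.ones(3) / 3              # uniform reference distribution
print(cressie_read(p_hat, q_ref, gamma=1.0))  # Pearson chi-square-type member
```

Varying gamma traces out the whole family, so the estimation criterion need not commit to a single sampling model in advance.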
Article
Generalized Fractional Processes with Long Memory and Time Dependent Volatility Revisited
by M. Shelton Peiris and Manabu Asai
Econometrics 2016, 4(3), 37; https://doi.org/10.3390/econometrics4030037 - 5 Sep 2016
Cited by 9 | Viewed by 7152
Abstract
In recent years, fractionally-differenced processes have received a great deal of attention due to their flexibility in financial applications with long memory. This paper revisits the class of generalized fractionally-differenced processes generated by Gegenbauer polynomials and the ARMA structure (GARMA), with both long memory and a time-dependent innovation variance. We establish the existence and uniqueness of second-order solutions. We also extend this family by allowing the innovations to follow GARCH and stochastic volatility (SV) processes. Under certain regularity conditions, we give asymptotic results for the approximate maximum likelihood estimator for the GARMA-GARCH model. We discuss a Monte Carlo likelihood method for the GARMA-SV model and investigate finite-sample properties via Monte Carlo experiments. Finally, we illustrate the usefulness of this approach using monthly inflation rates for France, Japan and the United States.
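In the GARMA class, the Gegenbauer filter (1 - 2uB + B^2)^d generates long memory with a spectral peak away from the origin, via the model phi(B)(1 - 2uB + B^2)^d (y_t - mu) = theta(B) eps_t. As a hedged sketch (the truncation length, parameter values, and constant innovation variance are simplifying assumptions of mine; the paper lets the variance change over time), one can simulate a basic GARMA(0,0) path from the standard Gegenbauer coefficient recurrence:

```python
import numpy as np

def gegenbauer_weights(d, u, n):
    """Coefficients C_j in (1 - 2uB + B^2)^(-d) = sum_j C_j B^j,
    from the three-term Gegenbauer recurrence."""
    c = np.zeros(n)
    c[0] = 1.0
    if n > 1:
        c[1] = 2.0 * d * u
    for j in range(2, n):
        c[j] = (2.0 * u * (j + d - 1.0) * c[j - 1]
                - (j + 2.0 * d - 2.0) * c[j - 2]) / j
    return c

def simulate_garma(n, d=0.3, u=0.8, sigma=1.0, trunc=2000, seed=0):
    """Truncated MA(infinity) simulation of a stationary GARMA(0,0) path
    (requires |u| < 1 and 0 < d < 1/2); sigma is held constant here,
    unlike the time-dependent innovation variance in the paper."""
    rng = np.random.default_rng(seed)
    c = gegenbauer_weights(d, u, trunc)
    eps = rng.normal(0.0, sigma, n + trunc)
    # y_t = sum_{j < trunc} C_j * eps_{t-j}
    return np.convolve(eps, c, mode="full")[trunc:trunc + n]

y = simulate_garma(500)
```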
Article
Nonparametric Regression with Common Shocks
by Eduardo A. Souza-Rodrigues
Econometrics 2016, 4(3), 36; https://doi.org/10.3390/econometrics4030036 - 1 Sep 2016
Cited by 1 | Viewed by 6392
Abstract
This paper considers a nonparametric regression model for cross-sectional data in the presence of common shocks. Common shocks are allowed to be very general in nature; they do not need to be finite dimensional with a known (small) number of factors. I investigate the properties of the Nadaraya-Watson kernel estimator and determine how general the common shocks can be while still obtaining meaningful kernel estimates. Restrictions on the common shocks are necessary because kernel estimators typically manipulate conditional densities, and conditional densities do not necessarily exist in the present case. By appealing to disintegration theory, I provide sufficient conditions for the existence of such conditional densities and show that the estimator converges in probability to the Kolmogorov conditional expectation given the sigma-field generated by the common shocks. I also establish the rate of convergence and the asymptotic distribution of the kernel estimator.
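For concreteness, the Nadaraya-Watson estimator studied here is a kernel-weighted average of outcomes, m_hat(x) = sum_i K_h(x - X_i) Y_i / sum_i K_h(x - X_i). The sketch below (Gaussian kernel, toy data, and the additive common shock are illustrative choices, not the paper's setup) shows in miniature why the estimator tracks the conditional expectation given the shock rather than the unconditional regression function:

```python
import numpy as np

def nadaraya_watson(x_grid, X, Y, h):
    """Nadaraya-Watson kernel regression estimate of E[Y | X = x]
    on x_grid, with a Gaussian kernel and bandwidth h."""
    # Pairwise scaled distances between grid points and observations
    u = (np.asarray(x_grid)[:, None] - np.asarray(X)[None, :]) / h
    w = np.exp(-0.5 * u ** 2)        # Gaussian kernel (constants cancel)
    return (w @ Y) / w.sum(axis=1)

# Toy cross-section with a common shock: every unit's outcome is shifted
# by the same draw 'a', so the shock does not average out across units.
rng = np.random.default_rng(1)
a = rng.normal()                     # common shock, shared by all units
X = rng.uniform(-2, 2, 400)
Y = np.sin(X) + a + 0.3 * rng.normal(size=400)
m_hat = nadaraya_watson(np.linspace(-2, 2, 50), X, Y, h=0.25)
# m_hat targets E[Y | X, shock] = sin(x) + a, not sin(x) alone.
```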
Editorial
Special Issues of Econometrics: Celebrated Econometricians
by Econometrics Editorial Office
Econometrics 2016, 4(3), 35; https://doi.org/10.3390/econometrics4030035 - 17 Aug 2016
Viewed by 4989
Abstract
Econometrics is pleased to announce the commissioning of a new series of Special Issues dedicated to celebrated econometricians of our time. [...]
Article
Jump Variation Estimation with Noisy High Frequency Financial Data via Wavelets
by Xin Zhang, Donggyu Kim and Yazhen Wang
Econometrics 2016, 4(3), 34; https://doi.org/10.3390/econometrics4030034 - 16 Aug 2016
Cited by 15 | Viewed by 7743
Abstract
This paper develops a method to improve the estimation of jump variation using high-frequency data in the presence of market microstructure noise. Accurate estimation of jump variation is in high demand, as it is an important component of volatility in finance for portfolio allocation, derivative pricing and risk management. The method is a two-step procedure of detection and estimation. In Step 1, we detect the jump locations by performing a wavelet transformation on the observed noisy price processes. Since wavelet coefficients are significantly larger at the jump locations than elsewhere, we calibrate the wavelet coefficients through a threshold and declare jump points where the absolute wavelet coefficients exceed it. In Step 2, for each jump location detected in Step 1, we compute two averages of the observed noisy prices, one before the detected jump location and one after it, and take their difference to estimate the jump variation. Theoretically, we show that the two-step procedure based on average realized volatility processes can achieve a convergence rate close to O_P(n^{-4/9}), which is better than the rate O_P(n^{-1/4}) for the procedure based on the original noisy process, where n is the sample size. Numerically, the method based on average realized volatility processes indeed performs better than that based on the price processes. Empirically, we study the distribution of jump variation using Dow Jones Industrial Average stocks and compare the results using the original price process and the average realized volatility processes.
(This article belongs to the Special Issue Financial High-Frequency Data)
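A heavily simplified, Haar-style sketch of the two-step idea follows (the window length k, threshold constant c, and robust scale estimate are assumptions of mine; the paper uses a full wavelet transform with a carefully calibrated threshold):

```python
import numpy as np

def detect_and_size_jumps(p, k=30, c=4.0):
    """Step 1: Haar-style detail coefficient d_i = mean of the next k
    prices minus mean of the previous k; flag |d_i| > c * robust scale.
    Step 2: the flagged coefficient is itself the before/after average
    difference, i.e., the jump-size estimate."""
    p = np.asarray(p, float)
    n = len(p)
    d = np.zeros(n)
    for i in range(k, n - k):
        d[i] = p[i:i + k].mean() - p[i - k:i].mean()
    scale = np.median(np.abs(d[k:n - k])) / 0.6745   # robust sigma of d
    jumps, i = [], k
    while i < n - k:
        if abs(d[i]) > c * scale:
            j = i + int(np.argmax(np.abs(d[i:i + k])))  # refine location
            jumps.append((j, d[j]))                     # (index, size)
            i = j + k                                   # skip past the jump
        else:
            i += 1
    return jumps

# Toy path: Brownian price increments plus one jump of size 0.5 at t = 500.
rng = np.random.default_rng(3)
p = np.cumsum(rng.normal(0.0, 0.01, 1000))
p[500:] += 0.5
print(detect_and_size_jumps(p))   # expect one jump near index 500, size ~0.5
```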
Editorial
Econometrics Best Paper Award 2016
by Kerry Patterson
Econometrics 2016, 4(3), 33; https://doi.org/10.3390/econometrics4030033 - 1 Aug 2016
Cited by 1 | Viewed by 5761
Abstract
Econometrics has had a distinguished start, publishing over 92 articles since 2013, with 76,475 downloads. [...]
Article
Measuring the Distance between Sets of ARMA Models
by Umberto Triacca
Econometrics 2016, 4(3), 32; https://doi.org/10.3390/econometrics4030032 - 15 Jul 2016
Cited by 13 | Viewed by 8316
Abstract
A distance between pairs of sets of autoregressive moving average (ARMA) processes is proposed, and its main properties are discussed. The paper also shows how the proposed distance finds application in time series analysis. In particular, it can be used to evaluate the distance between portfolios of ARMA models or the distance between vector autoregressive (VAR) models.
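Purely as a generic illustration of the idea, and not necessarily the metric constructed in the paper, one common way to compare ARMA processes is through their truncated AR(infinity) coefficient sequences, with a Hausdorff distance lifting the pairwise metric to sets of models; all names and parameter choices below are hypothetical:

```python
import numpy as np

def ar_inf_coeffs(phi, theta, trunc=50):
    """pi-weights of the AR(infinity) form of phi(B) y_t = theta(B) e_t,
    obtained by matching coefficients in theta(B) pi(B) = phi(B)."""
    phi = np.concatenate(([1.0], -np.asarray(phi, float)))    # 1 - phi_1 B - ...
    theta = np.concatenate(([1.0], np.asarray(theta, float))) # 1 + theta_1 B + ...
    pi = np.zeros(trunc)
    for j in range(trunc):
        acc = phi[j] if j < len(phi) else 0.0
        for i in range(1, min(j, len(theta) - 1) + 1):
            acc -= theta[i] * pi[j - i]
        pi[j] = acc
    return pi

def arma_distance(m1, m2, trunc=50):
    """Euclidean distance between truncated AR(infinity) representations."""
    return np.linalg.norm(ar_inf_coeffs(*m1, trunc) - ar_inf_coeffs(*m2, trunc))

def hausdorff(set1, set2):
    """Hausdorff distance between two finite sets of ARMA models."""
    d = lambda a, B: min(arma_distance(a, b) for b in B)
    return max(max(d(a, set2) for a in set1),
               max(d(b, set1) for b in set2))

# Two "portfolios" of models, each model given as (phi, theta) lists:
print(hausdorff([([0.5], []), ([0.6], [])], [([], [0.4])]))
```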
Article
Market Microstructure Effects on Firm Default Risk Evaluation
by Flavia Barsotti and Simona Sanfelici
Econometrics 2016, 4(3), 31; https://doi.org/10.3390/econometrics4030031 - 8 Jul 2016
Cited by 2 | Viewed by 7172
Abstract
Default probability is a fundamental variable determining the creditworthiness of a firm, and equity volatility estimation plays a key role in its evaluation. Assuming a structural credit risk modeling approach, we study the impact of choosing different non-parametric equity volatility estimators on default probability evaluation when market microstructure noise is considered. A general stochastic volatility framework with jumps for the underlying asset dynamics is defined inside a Merton-like structural model. To estimate the volatility risk component of a firm we use high-frequency equity data: market microstructure noise is introduced as a direct effect of observing noisy high-frequency equity prices. A Monte Carlo simulation analysis is conducted to (i) test the performance of alternative non-parametric equity volatility estimators in their ability to filter out the microstructure noise and back out the true unobservable asset volatility; and (ii) study the effects of different non-parametric estimation techniques on default probability evaluation. The impact of the non-parametric volatility estimators on risk evaluation is not negligible: a sensitivity analysis defined for alternative values of the leverage parameter and average jump size reveals that the characteristics of the dataset are crucial in determining the proper estimator to consider from a credit risk perspective.
(This article belongs to the Special Issue Financial High-Frequency Data)
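To see why the volatility input matters so much, recall the baseline Merton model, where the default probability is Phi(-d2) with d2 = [ln(V/D) + (mu - sigma^2/2)T] / (sigma sqrt(T)). A minimal sketch of that sensitivity (plain Merton, without the paper's stochastic volatility and jumps; all numbers below are illustrative):

```python
import math
from statistics import NormalDist

def merton_pd(V, D, mu, sigma, T=1.0):
    """Default probability P(V_T < D) in a basic Merton structural model,
    with asset value V following a geometric Brownian motion."""
    d2 = (math.log(V / D) + (mu - 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    return NormalDist().cdf(-d2)

# Sensitivity of PD to the volatility estimate: an estimator that fails to
# filter microstructure noise and overstates sigma overstates default risk.
for sigma_hat in (0.20, 0.25, 0.30):
    print(sigma_hat, round(merton_pd(V=100, D=70, mu=0.05, sigma=sigma_hat), 4))
```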
Article
Estimation of Gini Index within Pre-Specified Error Bound
by Bhargab Chattopadhyay and Shyamal Krishna De
Econometrics 2016, 4(3), 30; https://doi.org/10.3390/econometrics4030030 - 24 Jun 2016
Cited by 14 | Viewed by 6703
Abstract
The Gini index is a widely used measure of economic inequality. This article develops a theory and methodology for constructing a confidence interval for the Gini index with a specified confidence coefficient and a specified width, without assuming any specific distribution of the data. Fixed-sample-size methods cannot simultaneously achieve both a specified confidence coefficient and a fixed width. We develop a purely sequential procedure for interval estimation of the Gini index with a specified confidence coefficient and a specified margin of error. Optimality properties of the proposed method, namely first-order asymptotic efficiency and asymptotic consistency, are proved under mild moment assumptions on the distribution of the data.
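A minimal sketch of the two ingredients, a sample Gini index and a purely sequential stopping rule (the jackknife variance below stands in for the paper's U-statistics-based variance estimator, and the batch size is an arbitrary choice of mine):

```python
import numpy as np
from statistics import NormalDist

def gini(x):
    """Sample Gini index, n(n-1)-normalized: mean absolute pairwise
    difference over distinct pairs, divided by twice the sample mean."""
    x = np.sort(np.asarray(x, float))
    n = len(x)
    # Identity for sorted data: sum_{i,j} |x_i - x_j| = 2 * sum_i (2i - n + 1) x_(i)
    pairwise_sum = 2.0 * np.sum((2 * np.arange(n) - n + 1) * x)
    return pairwise_sum / (2.0 * n * (n - 1) * x.mean())

def sequential_gini_ci(draw, d=0.05, alpha=0.05, n0=50):
    """Purely sequential fixed-width interval: sample until the estimated
    margin of error z * sd_hat falls below d, then report G_hat +/- d."""
    z = NormalDist().inv_cdf(1 - alpha / 2)
    x = list(draw(n0))
    while True:
        n = len(x)
        loo = np.array([gini(np.delete(x, i)) for i in range(n)])  # jackknife
        var_hat = (n - 1) / n * np.sum((loo - loo.mean()) ** 2)
        if z * var_hat ** 0.5 <= d:       # margin-of-error check
            g = gini(x)
            return g - d, g + d
        x.extend(draw(max(1, n // 10)))   # otherwise draw more and re-check

rng = np.random.default_rng(7)
lo, hi = sequential_gini_ci(lambda k: rng.lognormal(0.0, 0.5, k))
```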