Doctoral dissertations
Permanent URI for this collection: https://pyxida.aueb.gr/handle/123456789/14
Browsing Doctoral dissertations by Supervisor "Dellaportas, Petros"
Now showing 1 - 5 of 5
Item
Bayesian model determination and nonlinear threshold volatility models
Petralias, Athanassios; Πετραλιάς, Αθανάσιος; Athens University of Economics and Business, Department of Statistics; Dellaportas, Petros; Ntzoufras, Ioannis
The purpose of this Thesis is to document an original contribution in the areas of model determination and volatility modeling. Model determination is the procedure that evaluates the ability of competing hypothesized models to describe a phenomenon under study. Volatility modeling, in the present context, involves developing models that can adequately describe the volatility process of a financial time series. In this Thesis we focus on the development of efficient algorithms for Bayesian model determination using Markov Chain Monte Carlo (MCMC), which are also used to develop a family of nonlinear flexible models for volatility. We propose a new method for Bayesian model determination that incorporates several desirable characteristics, resulting in better mixing for the MCMC chain and more precise estimates of the posterior density. The new method is compared with various existing methods in an extensive simulation study, as well as in more complex model selection problems based on linear regression, with both simulated and real data comprising 300 to 1000 variables. The method produces rather promising results, outperforming several other existing algorithms in most of the analyzed cases. Furthermore, the method is applied to gene selection using logistic regression, with a well-known dataset of 3226 genes. The problem lies in identifying the genes related to the presence of a specific form of breast cancer. The new method again proves to be more efficient when compared to an existing Population MCMC sampler, while we extend the findings of previous medical studies on this issue. We present a new class of flexible threshold models for volatility.
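As a generic illustration of Bayesian model determination by MCMC (not the sampler developed in the thesis), a minimal Metropolis search over variable-inclusion indicators for a linear regression might look as follows; the BIC-style approximation to the model posterior, the simulated data, and all names are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: only the first two of five candidate predictors matter.
n, p = 200, 5
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(size=n)

def log_score(gamma):
    """BIC-type approximation to the log model posterior (flat model prior)."""
    k = int(gamma.sum())
    if k == 0:
        rss = np.sum((y - y.mean()) ** 2)
    else:
        Xg = np.column_stack([np.ones(n), X[:, gamma.astype(bool)]])
        beta, *_ = np.linalg.lstsq(Xg, y, rcond=None)
        rss = np.sum((y - Xg @ beta) ** 2)
    return -0.5 * n * np.log(rss / n) - 0.5 * (k + 1) * np.log(n)

# Metropolis over models: propose flipping one inclusion indicator at a time.
gamma = np.zeros(p)
visits = np.zeros(p)
for it in range(2000):
    j = rng.integers(p)
    prop = gamma.copy()
    prop[j] = 1 - prop[j]
    if np.log(rng.uniform()) < log_score(prop) - log_score(gamma):
        gamma = prop
    visits += gamma

incl_prob = visits / 2000   # estimated posterior inclusion probabilities
```

The chain visits models in proportion to their (approximate) posterior probability, so the visit frequencies estimate inclusion probabilities; richer move types and better mixing are precisely what the thesis's method targets.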
In these models the variables included, as well as the number and location of the threshold points, are estimated, while the exogenous variables are allowed to be observed on lower frequencies than the dependent variable. To estimate these models we use the new method for Bayesian model determination, enriched with new move types, the use of which is validated through additional simulations. Furthermore, we propose a comparative model based on splines, where the number and location of the spline knots are related to a set of exogenous variables. The new models are applied to estimate and predict the variance of the Euro-dollar exchange rate, using as exogenous variables a set of U.S. macroeconomic announcements. The results indicate that the threshold models can provide significantly better estimates and projections than the spline model and typical conditional volatility models, while the most important macroeconomic announcements are identified. The threshold models are then generalised to the multivariate case. Under the proposed methodology, only the estimation of the univariate variances is required, together with a rather small collection of regression coefficients. This greatly simplifies inference, while the model is found to perform rather well in terms of predictability. A detailed review of both the available algorithms for Bayesian model determination and nonlinear models for financial time series is also included in this Thesis. We illustrate how the existing methods for model determination are embedded into a common general scheme, and we discuss the properties and advantages each method has to offer. The main argument presented is that there is no globally best or preferable method; rather, their relative performance and applicability depend on the dataset and problem of interest.
With respect to the nonlinear models for financial time series and volatility, we present in a unified manner the main parametric and nonparametric classes of these models, and we also review event studies analyzing the effect of news announcements on volatility.

Item
Bayesian modelling of high dimensional financial data using latent Gaussian models
Alexopoulos, Angelis N.; Αλεξόπουλος, Αγγελής Ν.; Athens University of Economics and Business, Department of Statistics; Dellaportas, Petros; Papaspiliopoulos, Omiros
The present thesis deals with the problem of developing statistical methodology for modelling and inference of high dimensional financial data. The motivation of our research was the identification of infrequent and extreme movements, which are called jumps, in the prices of the 600 stocks of the Euro STOXX index. This is known in the financial and statistical literature as the problem of separating jumps from the volatility of the underlying process which is assumed for the evolution of the stock prices. The main contribution of the thesis is the modelling and the development of methods for inference on the characteristics of the jumps across multiple stocks, as well as across the time horizon. Following the Bayesian paradigm, we use prior information in order to model a known characteristic of financial crises, namely that jumps in stock prices tend to occur clustered in time and to affect several markets within a short period of time. An improvement in the prediction of future stock prices has been achieved. The proposed model combines the stochastic volatility (SV) model with a multivariate jump process and belongs to the very broad class of latent Gaussian models.
Bayesian inference for latent Gaussian models relies on a Markov chain Monte Carlo (MCMC) algorithm which alternates between sampling from the distribution of the latent states of the model conditional on the parameters and the observations, and sampling from the distribution of the parameters of the model conditional on the latent states and the observations. In the case of SV models with jumps, sampling the latent volatility process of the model is not a new problem. Over the last few years several methods have been proposed for separating the jumps from the volatility process, but there is no satisfactory solution yet, since sampling from a high dimensional nonlinear and non-Gaussian distribution is required. In the present thesis we propose a Metropolis-Hastings algorithm in which we sample the whole path of the volatility process of the model without using any approximation. We compare the resulting MCMC algorithm with existing algorithms. We apply our proposed methodology on univariate SV models with jumps in order to identify jumps in the stock prices of the real dataset that motivated our research. To model the propagation of the jumps across stocks and across time, we combine the SV model with a doubly stochastic Poisson process, also known as a Cox process. The intensity of the jumps in the Poisson process is modelled using a dynamic factor model. Furthermore, we develop an MCMC algorithm to conduct Bayesian inference for the parameters and the latent states of the proposed model. We test the proposed methods on simulated data and we apply them to our real dataset. We compare the predictions of future stock prices using the proposed model with the predictions obtained using existing models.
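The alternating latent-state sampling described above can be sketched, for a plain SV model without jumps, with a generic single-site random-walk Metropolis sweep over the log-volatility path; this is the simple baseline that joint path-sampling schemes like the one proposed in the thesis improve upon, and all parameter values here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a short stochastic-volatility (SV) series with known parameters.
T, mu, phi, sig = 300, -1.0, 0.95, 0.2
h = np.empty(T)
h[0] = mu
for t in range(1, T):
    h[t] = mu + phi * (h[t - 1] - mu) + sig * rng.normal()
y = np.exp(h / 2) * rng.normal(size=T)

def log_cond(t, ht, hs):
    """Log full conditional of h_t given its neighbours and y_t."""
    lp = -0.5 * ht - 0.5 * y[t] ** 2 * np.exp(-ht)          # observation density
    m = mu + phi * (hs[t - 1] - mu) if t > 0 else mu
    lp -= 0.5 * (ht - m) ** 2 / sig ** 2                    # transition from t-1
    if t < T - 1:
        lp -= 0.5 * (hs[t + 1] - mu - phi * (ht - mu)) ** 2 / sig ** 2
    return lp

# Single-site random-walk Metropolis sweeps over the latent volatility path,
# accumulating a posterior-mean estimate over the final 100 sweeps.
hs, hbar = np.zeros(T), np.zeros(T)
for sweep in range(300):
    for t in range(T):
        prop = hs[t] + 0.5 * rng.normal()
        if np.log(rng.uniform()) < log_cond(t, prop, hs) - log_cond(t, hs[t], hs):
            hs[t] = prop
    if sweep >= 200:
        hbar += hs / 100
```

Because h is highly persistent, single-site updates mix slowly along the path; that slow mixing is the motivation for sampling the whole volatility path at once.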
The proposed model provides better predictions of future stock prices, and this is an indication of a predictable part of the jump process of SV models. The MCMC algorithm that is implemented in order to conduct Bayesian inference for the aforementioned models is also employed in a demographic application. More precisely, within the context of latent Gaussian models we present a novel approach to model and predict mortality rates of individuals.

Item
An econometric analysis of high-frequency financial data (12/09/2021)
Lamprinakou, Fiori; Λαμπρινάκου, Φιόρη; Athens University of Economics and Business, Department of Statistics; Papaspiliopoulos, Omiros; Demiris, Nikolaos; Pedeli, Xanthi; Papastamoulis, Panagiotis; Tsionas, Mike; Damien, Paul; Dellaportas, Petros
We present and compare observation driven and parameter driven models for predicting integer price changes of high-frequency financial data. We explore Bayesian inference via Markov chain Monte Carlo (MCMC) and sequential Monte Carlo (SMC) for the observation driven activity-direction-size (ADS) model, introduced by Rydberg and Shephard [1998a, 2003]. We extend the ADS model by proposing a parameter driven model and use a Bernoulli generalized linear model (GLM) with a latent process in the mean. We propose a new decomposition model that uses trade intervals and is applied to data that allow three possible tick movements: one tick up, one tick down, or no price change. We model each component sequentially using a Binomial generalized linear autoregressive moving average (GLARMA) model, as well as a GLM with a latent process in the mean. We perform a simulation study to investigate the effectiveness of the proposed parameter driven models using different algorithms within a Bayesian framework.
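The parameter driven component mentioned above, a Bernoulli GLM with a latent process in the mean, can be illustrated through its data-generating process; the AR(1) form of the latent state, the logit link, and all parameter values below are illustrative assumptions, not the thesis's fitted specification:

```python
import numpy as np

rng = np.random.default_rng(2)

# Parameter-driven Bernoulli GLM: a latent AR(1) state drives the logit of
# the probability of a price move at each transaction (illustrative values).
T, phi, sig, beta0 = 1000, 0.9, 0.3, -1.0
alpha = np.empty(T)
alpha[0] = 0.0
for t in range(1, T):
    alpha[t] = phi * alpha[t - 1] + sig * rng.normal()

logit = beta0 + alpha
prob = 1.0 / (1.0 + np.exp(-logit))      # inverse-logit link
moves = rng.binomial(1, prob)            # 1 = price change, 0 = no change
```

Unlike an observation driven model, where the mean is a function of past observations, here the serial dependence enters only through the unobserved state alpha, which is why inference requires integrating out the latent process (e.g. by MCMC or SMC).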
We illustrate the analysis by modelling the transaction-by-transaction data of the E-mini Standard and Poor's (S&P) 500 index futures contract traded on the Chicago Mercantile Exchange's Globex platform between May 16th 2011 and May 24th 2011. In order to assess the predictive performance, we compare the mean square error (MSE) and mean absolute error (MAE) criteria, as well as four scalar performance measures, namely accuracy, sensitivity, precision and specificity, derived from the confusion matrix.

Item
High dimensional time-varying covariance matrices with applications in finance (10-07-2011)
Plataniotis, Anastasios; Πλατανιώτης, Αναστάσιος; Athens University of Economics and Business, Department of Statistics; Dellaportas, Petros
The scope of this Thesis is to provide an original contribution in the area of multivariate volatility modeling. Multivariate volatility modeling, in the present context, involves developing models that can adequately describe the covariance matrix process of multivariate financial time series. The development of efficient algorithms for Bayesian model estimation using Markov Chain Monte Carlo (MCMC) and nested Laplace approximations is our main objective, in order to provide parsimonious and flexible volatility models. A detailed review of univariate volatility models for financial time series is first introduced in this Thesis. We illustrate the historical background of each model proposed and discuss its properties and advantages, as well as comment on the several estimation methods that have emerged. We also provide a comparative analysis via a small simulation example for the dominant models in the literature. Continuing the review from the univariate models, we move on to the multivariate case and extensively present competing models for covariance matrices.
The main argument presented is that currently no model is able to fully capture the dynamics of higher dimensional covariance matrices; their relative performance and applicability depend on the dataset and problem of interest. Problems are mainly due to the positive definiteness constraints required by most models, as well as the lack of interpretability of the model parameters in terms of the characteristics of the financial series. In addition, model development so far has focused mostly on parameter estimation and in-sample fit; it is our goal to examine the out-of-sample fit perspective of these models. We conclude the review section by proposing some small improvements for existing models that will lead towards more efficient parameter estimates, faster estimation methods and more accurate forecasts. Subsequently, a new class of multivariate models for volatility is introduced. The new model is based on the spectral decomposition of the time-changing covariance matrix and the incorporation of autocorrelation modeling of the time-changing elements. In these models we allow a priori all the elements of the covariance matrix to be time-changing as independent autoregressive processes, and then for any given dataset we update our prior information and decide on the number of time-changing elements. Theoretical properties of the new model are presented along with a series of estimation methods, Bayesian and classical. We conclude that, in order to estimate these models, one may use an MCMC method for portfolios whose covariance matrix is of small dimension. For higher dimensions, due to the curse of dimensionality, we propose a nested Laplace approximation approach that provides results much faster with a small loss in accuracy.
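The construction sketched above, a spectral decomposition with autoregressive dynamics on the time-changing elements, can be illustrated in a minimal form: fixed eigenvectors with log-eigenvalues following independent AR(1) processes, which keeps every covariance matrix positive definite by construction. The dimensions and parameter values are illustrative, not those of the thesis:

```python
import numpy as np

rng = np.random.default_rng(3)

# Fixed orthonormal eigenvector matrix P (here generated at random via QR);
# time-varying log-eigenvalues follow independent AR(1) processes, so each
# Sigma_t = P * diag(exp(loglam_t)) * P' is positive definite automatically.
P, _ = np.linalg.qr(rng.normal(size=(3, 3)))

T, phi, sig = 50, 0.9, 0.1
loglam = np.zeros((T, 3))
for t in range(1, T):
    loglam[t] = phi * loglam[t - 1] + sig * rng.normal(size=3)

sigmas = [P @ np.diag(np.exp(loglam[t])) @ P.T for t in range(T)]
```

Working on log-eigenvalues sidesteps the positive definiteness constraint that complicates most multivariate volatility models, and each autoregressive element is individually interpretable; deciding how many elements actually need to be time-changing is the model-determination step described in the abstract.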
Once the new model is proposed along with the estimation methods, we compare its performance against competing models on simulated and real datasets; we also examine its performance in small portfolios of fewer than 5 assets, as well as in the high dimensional case of up to 100 assets. Results indicate that the new model provides significantly better estimates and projections than current models in the majority of example datasets. We believe that even small improvements in forecasting are of significant importance in the finance industry. In addition, the new model allows for parameter interpretability and parsimony, which is of huge importance due to the curse of dimensionality. Simplifying inference and prediction for multivariate volatility models was our initial goal and inspiration. It is our hope that we have made a small step in that direction, and that a new path for studying multivariate financial data series has been revealed. We conclude by providing some proposals for future research that we hope may encourage further work on this class of models.

Item
On variance reduction for Markov chain Monte Carlo (02-2012)
Tsourti, Zoi; Τσούρτη, Ζωή; Athens University of Economics and Business, Department of Statistics; Dellaportas, Petros; Kontoyannis, Ioannis
In the present thesis we are concerned with appropriate variance reduction methods for specific classes of Markov Chain Monte Carlo (MCMC) algorithms. The variance reduction method of main interest here is that of control variates. More particularly, we focus on control variates of the form U = G − PG, for an arbitrary function G, where PG stands for the one-step-ahead conditional expectation, as proposed by Henderson (1997). A key issue for the efficient implementation of control variates is the appropriate estimation of the corresponding coefficients.
In the case of Markov chains, this involves the solution of the Poisson equation for the function of initial interest, which in most cases is intractable. Dellaportas & Kontoyiannis (2012) have further elaborated on this issue and have proven optimality results for the case of reversible Markov chains, avoiding the explicit solution of this equation. In this context, we concentrate on the implementation of those results for the Metropolis-Hastings (MH) algorithm, a popular MCMC technique. In the case of MH, the main issue of concern is the assessment of one-step-ahead conditional expectations, since these are not usually available in closed form. The main contribution of this thesis is the development and evaluation of appropriate techniques for dealing with the use of the above type of control variates in the MH setting. The basic approach suggested is the use of the Monte Carlo method for estimating one-step-ahead conditional expectations as empirical means. In the case of MH this is a straightforward task requiring minimal additional analytical effort. However, it is rather computationally demanding and, hence, alternative methods are also suggested. These include importance sampling of the available data resulting from the algorithm (that is, the initially proposed or finally accepted values), additional application of the notion of control variates for the estimation of the PG's, and parallel exploitation of the values that are produced within an MH algorithm but not included in the resulting Markov chain (hybrid strategy). The ultimate purpose is the establishment of a genuinely efficient strategy, that is, a strategy where the variance reduction attained outweighs the additional computational cost imposed. The applicability and efficiency of the methods are illustrated through a series of diverse applications.
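Henderson's control variate U = G − PG can be demonstrated on a toy chain where PG is available in closed form: for an AR(1) process with G(x) = x, the one-step-ahead expectation is PG(x) = rho·x, so U = (1 − rho)·x has zero stationary mean. This is only a sketch of the general idea; in the MH setting studied in the thesis, PG itself must be estimated. All values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

# AR(1) chain X_{t+1} = rho*X_t + eps. For G(x) = x the one-step conditional
# expectation is PG(x) = rho*x, hence U = G - PG = (1 - rho)*x, which has
# zero mean in stationarity and serves as a control variate for E[X].
rho, T, reps = 0.8, 2000, 50

def estimates():
    x = np.empty(T)
    x[0] = 0.0
    for t in range(1, T):
        x[t] = rho * x[t - 1] + rng.normal()
    u = (1.0 - rho) * x
    theta = np.cov(x, u)[0, 1] / np.var(u)   # estimated optimal coefficient
    return x.mean(), (x - theta * u).mean()  # plain vs controlled estimator

plain, cv = zip(*(estimates() for _ in range(reps)))
```

Here the reduction is essentially exact because the target function lies in the span of U; in realistic MH applications the gain depends on the choice of G and on how accurately PG and the coefficient are estimated, which is exactly the trade-off between variance reduction and computational cost that the thesis investigates.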