Master Thesis Defense: Aycan Adrian Corum

FAST ALGORITHMS FOR SMOOTH AND MONOTONE COVARIANCE MATRIX ESTIMATION

 

Aycan Adrian Corum

 

Electronics Engineering Master's Thesis, 2012

 

Thesis Jury

Asst. Prof. Dr. Müjdat Çetin (Thesis Supervisor), Prof. Dr. İlker Birbil, Assoc. Prof. Dr. Kerem Bülbül, Asst. Prof. Dr. Semih Onur Sezer, Assoc. Prof. Dr. Koray Deniz Şimşek

 

Date & Time: July 5th 2012 – 16:40, Place: FENS G032

 

ABSTRACT

In this thesis the problem of interest is covariance matrix estimation from a limited number of high-dimensional independent identically distributed (i.i.d.) multivariate samples, when the random variables of interest have a natural spatial indexing along a low-dimensional manifold, e.g., along a line. We consider this problem within the setting of financial risk management.

 

It is well known that the sample covariance matrix estimate is fraught with peril in high-dimensional settings with scarce data. In particular, the inconsistency of its eigenvalue spectrum has grave implications for financial risk management when the sample covariance estimate is used within the Markowitz portfolio optimization framework. A variety of approaches have been developed to improve covariance estimates by exploiting knowledge of structure in the data, including low-rank models (principal component and factor analysis), sparse inverse covariance models, and parametric models. All of these models have been very successful in their domains of application, but in general they reduce the required number of samples by imposing very strict structure.
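The eigenvalue inconsistency mentioned above is easy to observe numerically. The following sketch (not from the thesis; an illustrative example with an identity true covariance) draws fewer samples than twice the dimension and shows how far the sample eigenvalue spectrum spreads from the true spectrum:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 100, 120              # dimension comparable to the sample size
# True covariance: identity, so every true eigenvalue equals 1.
X = rng.standard_normal((n, p))       # n i.i.d. samples from N(0, I_p)
S = np.cov(X, rowvar=False)           # sample covariance estimate

eig = np.linalg.eigvalsh(S)
# The sample spectrum spreads well below and above 1, even though
# every true eigenvalue is exactly 1.
print(f"sample eigenvalue range: [{eig.min():.2f}, {eig.max():.2f}]")
```

In a Markowitz setting, the severely underestimated small eigenvalues are especially harmful, since portfolio weights depend on the inverse covariance matrix.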

 

In this thesis we instead make use of another formulation, which investigates a different prior for random variables indexed along a low-dimensional manifold: it assumes that the covariance matrix is monotone and smooth with respect to this indexing. This is a natural assumption for a variety of problems, including interest-rate risk modeling in financial engineering. The original formulation poses the estimation problem in a convex-optimization framework, and the resulting semidefinite program (SDP) is solved by an interior-point method (IPM). However, solving the SDP via an IPM can become unduly computationally expensive for large covariance matrices, as it involves computing the Hessian.
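A quick back-of-the-envelope calculation (illustrative, not from the thesis) shows why the IPM Hessian becomes a bottleneck: a symmetric p-by-p covariance matrix has p(p+1)/2 free entries, and a dense Hessian over those variables has the square of that many entries, so storage alone grows on the order of p^4:

```python
# Growth of the decision-variable count and the dense IPM Hessian
# for a symmetric p x p covariance matrix.
for p in (50, 100, 200, 400):
    m = p * (p + 1) // 2        # free entries of a symmetric matrix
    hessian_entries = m * m     # dense m x m Hessian
    print(f"p = {p:4d}: variables = {m:7d}, Hessian entries = {hessian_entries:.2e}")
```

First-order methods avoid forming this Hessian altogether, which is the computational motivation for the solvers developed next.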

 

Motivated by this observation, this thesis develops highly efficient first-order solvers for smooth and monotone covariance matrix estimation. We propose two types of first-order solvers: one based on projected gradients, and one based on recently developed optimal first-order methods. Given these numerical algorithms, we present a comprehensive experimental analysis. We first demonstrate the benefits of imposing smoothness and monotonicity constraints in covariance matrix estimation in a number of scenarios, including cases with limited, missing, and asynchronous data. We then demonstrate the potential computational benefits offered by first-order methods through a detailed comparison with solving the problem via IPMs.
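To give a flavor of the projected-gradient approach, here is a minimal sketch under simplifying assumptions: the thesis's constraint set also encodes smoothness and monotonicity, but for illustration we project only onto the positive-semidefinite (PSD) cone, and we use a simple least-squares fit to the sample covariance as the objective. The PSD projection by eigenvalue clipping is the standard Euclidean projection; everything else here is a hypothetical stand-in for the full formulation:

```python
import numpy as np

def project_psd(A):
    """Euclidean projection of a symmetric matrix onto the PSD cone:
    clip negative eigenvalues to zero and reassemble."""
    w, V = np.linalg.eigh(A)
    return (V * np.clip(w, 0.0, None)) @ V.T

def projected_gradient(S, step=0.5, iters=200):
    """Minimize ||X - S||_F^2 over PSD matrices by projected gradient
    descent (illustrative objective; the thesis's constraint set is
    richer, adding smoothness and monotonicity)."""
    X = np.zeros_like(S)
    for _ in range(iters):
        grad = 2.0 * (X - S)              # gradient of the quadratic fit term
        X = project_psd(X - step * grad)  # gradient step, then projection
    return X

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))
S = (A + A.T) / 2                 # symmetric but generally indefinite "data"
X = projected_gradient(S)
print(np.linalg.eigvalsh(X).min())  # non-negative: the estimate is PSD
```

Each iteration costs one eigendecomposition, i.e., O(p^3) work, with no Hessian to form or factor, which is the source of the first-order methods' scalability advantage over IPMs.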