The degree of dependence between asset returns during normal times can change dramatically during crisis times. For risk assessment, it is key to use models flexible enough to capture such features; a useful model must also remain applicable in high dimensions.
By conditioning on extreme occurrences of an indicator of interest (e.g., the VIX), one can model the joint distribution of asset returns in periods of high market stress. Under mild assumptions, this extends to losses of portfolios with hundreds of positions.
After the financial crisis of 2007–08, stronger regulations were implemented through Basel III. Within this framework, it becomes increasingly important to accurately estimate value-at-risk and expected shortfall. Recent advances in multivariate extreme value theory can tackle such issues.
Climate change means that an increasing amount of energy is available at the Earth's surface, which in particular implies floods of larger amplitude that last longer. In the UK, the population is particularly vulnerable to floods: more than £500 million is spent each year on flood defences across the country, and flooding is responsible for more than £1 billion in annual damage costs [source: UK parliament].
Environmental hazards often spread spatially; an appropriate model should reproduce spatial patterns and a dependence structure that changes with distance. In time, the duration of flooding events is essential for accurate estimation of return levels.
Extreme value theory provides a principled method for supplying decision-makers with estimates of river levels for a given return period, up to a conservative 10,000-year return period in particularly vulnerable regions.
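As a minimal illustration of what a return-level estimate is, the sketch below computes the T-year return level implied by a generalised extreme value (GEV) fit to annual maxima. The parameter values are made up for illustration and are not taken from any analysis of mine.

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """T-year return level of a GEV(mu, sigma, xi) fitted to annual maxima:
    the level exceeded once every T years on average."""
    y = -math.log(1.0 - 1.0 / T)  # -log of the annual non-exceedance probability
    if abs(xi) < 1e-9:            # Gumbel limit as xi -> 0
        return mu - sigma * math.log(y)
    return mu + sigma / xi * (y ** (-xi) - 1.0)

# Illustrative (hypothetical) parameters for an annual maximum river level in metres.
z100 = gev_return_level(mu=4.0, sigma=0.5, xi=0.1, T=100)
z10000 = gev_return_level(mu=4.0, sigma=0.5, xi=0.1, T=10000)
```

With a positive shape parameter xi, the return level grows without bound in T, which is why very long return periods such as 10,000 years remain meaningful but carry large uncertainty.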
My current research deals with extending and refining our understanding of rare events and how they are connected. This yields more accurate assessment of the risk of damaging events, as it is crucial to report different degrees of dependence between pairs of events at different levels of severity/magnitude.
Classical statistics is primarily concerned with predicting average behaviour. When it comes to predicting extreme events, however, the mean is no longer of interest and the focus shifts to the maximum. Further extensions of the theory consider excesses of a high threshold. Excesses can provide more information than, e.g., yearly maxima, but they imply a more involved inference that requires more careful attention.
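The peaks-over-threshold idea can be sketched in a few lines: collect the excesses above a high threshold and fit a generalised Pareto distribution (GPD) to them. This toy version uses simulated exponential data and a simple method-of-moments fit (valid for xi < 1/2), not the inference scheme from my own work.

```python
import random

def gpd_fit_moments(excesses):
    """Method-of-moments estimates (sigma, xi) for a generalised Pareto
    distribution fitted to threshold excesses (valid when xi < 1/2)."""
    n = len(excesses)
    m = sum(excesses) / n
    v = sum((x - m) ** 2 for x in excesses) / (n - 1)
    r = m * m / v
    xi = 0.5 * (1.0 - r)          # shape: tail heaviness
    sigma = 0.5 * m * (r + 1.0)   # scale
    return sigma, xi

random.seed(42)
data = [random.expovariate(1.0) for _ in range(20000)]  # toy "daily" series
u = 2.0                                    # a high threshold
excesses = [x - u for x in data if x > u]  # peaks over the threshold
sigma_hat, xi_hat = gpd_fit_moments(excesses)
# Excesses of exponential data are again exponential, so xi ~ 0 and sigma ~ 1.
```

In practice the threshold choice itself is delicate, and excesses arrive in dependent clusters rather than independently; this is exactly where the extra care mentioned above is needed.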
Excesses of a threshold typically occur in clusters, as is the case for flooding events or large negative returns in the stock market, which tend to last several days. Relatively little has been done to model and understand the temporal dependence of extremes in time series, even though doing so improves the estimation of risks and return levels.
I use a complex Bayesian framework to fit the distributions of unknown shape that arise in my research. Typical non-parametric Bayesian models are Dirichlet process mixtures, which are very flexible: these mixtures are distributions over distributions, so they quantify uncertainty about distribution functions rather than about model parameters.
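The random discrete distributions underlying a Dirichlet process can be simulated with the stick-breaking construction; the short sketch below (a generic textbook construction, not my model) draws a truncated set of weights from DP(alpha, G0).

```python
import random

def stick_breaking(alpha, n_atoms, rng):
    """Truncated stick-breaking weights of a Dirichlet process DP(alpha, G0):
    w_k = v_k * prod_{j<k} (1 - v_j), with v_k ~ Beta(1, alpha)."""
    weights, remaining = [], 1.0
    for _ in range(n_atoms):
        v = rng.betavariate(1.0, alpha)
        weights.append(v * remaining)  # break off a piece of the remaining stick
        remaining *= 1.0 - v
    return weights

rng = random.Random(1)
w = stick_breaking(alpha=2.0, n_atoms=200, rng=rng)
# The weights decay quickly and sum to (nearly) one; pairing each w_k with an
# atom drawn from a base measure G0 yields one random discrete distribution,
# and mixing over such draws gives a Dirichlet process mixture.
```

Smaller values of alpha concentrate the mass on a few atoms, while larger values spread it out; this is what controls the effective number of mixture components.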
The Bayesian approach to statistics is particularly well-suited when expert knowledge is available for the data being analysed. Despite criticism of its subjectivity, the Bayesian approach is very natural and provides a measure of uncertainty that incorporates prior knowledge, re-shaped by the additional information provided by the data. It is complementary to the frequentist approach, which quantifies uncertainty under the hypothesis that an experiment can be repeated infinitely many times.
The non-parametric Bayesian approach is necessary for the type of problems I deal with in my research and could not be efficiently replaced by a frequentist model.
My current research focuses on developing a coherent and flexible framework for modelling extremes in time and potentially in space. For this, I use a semi-parametric Bayesian approach and develop computationally intensive algorithms involving hierarchical Markov chain Monte Carlo methods with adaptive schemes. Probability and statistical theory are key in supporting the approach.
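To give a flavour of what an adaptive scheme does, here is a generic random-walk Metropolis sampler whose proposal scale is tuned on the fly towards a target acceptance rate. This is a standard textbook scheme sketched for a toy standard-normal target, not the hierarchical samplers from my thesis.

```python
import math
import random

def adaptive_rw_metropolis(logpost, x0, n_iter, rng, target_acc=0.44):
    """Random-walk Metropolis with diminishing Robbins-Monro adaptation of
    the proposal scale towards a target acceptance rate."""
    x, scale, chain = x0, 1.0, []
    lp = logpost(x)
    for i in range(1, n_iter + 1):
        prop = x + scale * rng.gauss(0.0, 1.0)
        lp_prop = logpost(prop)
        acc = math.exp(min(0.0, lp_prop - lp))  # acceptance probability
        if rng.random() < acc:
            x, lp = prop, lp_prop
        # diminishing adaptation (step size -> 0) preserves ergodicity
        scale *= math.exp((acc - target_acc) / i ** 0.6)
        chain.append(x)
    return chain

rng = random.Random(0)
# Toy target: standard normal, log-density -x^2/2 up to a constant.
chain = adaptive_rw_metropolis(lambda t: -0.5 * t * t, 5.0, 5000, rng)
```

In hierarchical models each block of parameters gets its own adapted scale, which is what makes such samplers practical without hand-tuning.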
I have worked on several specific research projects, some of which are available on Infoscience, the EPFL database for technical reports and publications. These include projects related to finance, such as:
I have also worked on projects related to social behaviour and environmental issues, such as:
I graduated from EPFL in 2013 with a Master's in Mathematical Engineering. I recently completed a PhD in Mathematics at EPFL, co-supervised at Lancaster University, UK. More details here and there.
During my PhD, I benefitted from advanced and specialised courses and workshops, among which:
At the Bachelor's level, I attended courses ranging from advanced mathematical analysis to discrete optimisation, while my Master's focused on probability and statistics, specifically:
The original research and new methodology were disseminated at various international conferences and workshops, among which the Joint Statistical Meetings (JSM, Boston, MA, 2014), the International Statistical Meeting (ISM, Rio, 2015), and the Extreme Value Analysis conference (EVA, Ann Arbor, MI, 2015 and EVA, Delft, Netherlands, 2017).