*Applications for the PhD projects below can be submitted at any time. Once a suitable candidate has been appointed to a project, no further applications for it will be considered.*

For all PhD project enquiries contact: Mark Cardy

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

Stochastic optimization methods are critical to assist portfolio managers with the asset allocation problems they face. When the number of possible assets is high, simulation-based methods are the numerical methods of choice. The purpose of this project is to improve the Regression Monte Carlo numerical algorithm used by CSIRO for portfolio allocation, and to adapt it to similar stochastic optimization problems faced by portfolio managers, such as dynamic hedge ratio computation and optimal portfolio liquidation under time constraints.

Keywords: Stochastic control, Least-squares Monte Carlo, computational finance, portfolio allocation, dynamic hedge ratio, portfolio liquidation

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

Life-cycle retirement funds require long-term adaptive investment decisions to meet the consumption targets of retirees over time. Several dynamic variables can affect the financial needs of retirees over time, such as health condition or house prices, among many others. Stochastic optimisation methods can help retirees mitigate risk over time, for example through the optimal dynamic purchase of annuities, health insurance, and/or reverse mortgages. For public and private retirement funds, these tools can help them understand the financial decisions of retirees and assess their long-term sustainability. This project will consist of setting up an efficient and time-consistent simulation-based computational method for solving such long-term stochastic optimisation problems with social discounting, with key dynamic information and sustainable targets taken into account.

Keywords: Stochastic control, long-term portfolio allocation, life-cycle, behavioural finance

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

The increasing reliance on cyberspace for personal, professional, commercial, industrial and infrastructure activities poses serious cybersecurity challenges for individuals, business and government. There is the risk of individuals’ privacy being compromised through hacks of their personal accounts or of business and government databases; the theft of intellectual property can also compromise business and government activities, resulting in economic loss (National Science and Technology Council, 2016). This work will concentrate on developing risk assessment methods for quantifying cybersecurity risks for different organisations, both government and private. The research activity has two main aims:

**Evaluation for cybersecurity risk assessment.** The aim is to identify what should be evaluated when undertaking cybersecurity risk assessment, what data should be collected for this evaluation, and how cybersecurity risk should be measured and quantified. This will require determining cyberspace vulnerabilities and the threats posed by adversaries, and how the likelihood and consequences of cybersecurity events should be measured and quantified. This work will use evidence-based methods, and hence will be based on real-life data and examples.

**Quantify costs and savings associated with mitigation strategies.** The goal is to develop risk mitigation strategies and to quantify their value. This will require determining the costs associated with successful cybersecurity breaches under no, or only existing, risk mitigation strategies; the costs of introducing new risk mitigation strategies; and the reduction in costs associated with successful or unsuccessful cybersecurity breaches once the new strategies are in place. This work will be evidence-based and will involve experimental designs to ascertain which new risk mitigation strategies work best and the conditions under which they work best.

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

Through enabling technologies such as wireless communication (WiFi) and smartphones, the digital revolution has well and truly taken over our day-to-day lives. Whether it’s planning and maintaining our social lives via social media platforms, doing some online shopping or streaming live sports and TV to relax after work, information and digital technologies have become integral to our way of life. Indeed, the average Australian household is predicted to have 24 devices connected to the internet by 2019, while 50% of small and medium enterprises will receive payments online [1]. Whilst this digital disruption has provided the opportunity for innovation in our economy, there are also inherent risks for individuals, the private sector and for governments. We generate and store immense quantities of data each day, most of which we would like to keep private and secure. However, this data is valued by other parties, leading to the persistent threat of cybercrime estimated to cost global economies approximately 1% of GDP each year [1]. Thus, some form of insurance for cyber risk management would provide great benefit to the global economy. Nevertheless, there has been little research into quantitative models for cyber-insurance [2]. This work will focus on developing quantitative valuation models for cybersecurity risk management, in both corporate and private applications. More specifically, the work can be split into two mini-projects:

**Develop a quantitative pricing model for cyber-insurance.** The aim is to develop a pricing framework in a fashion analogous to current financial-industry practice. This will require identifying risks, modelling them as stochastic processes, and valuing possible risk mitigation strategies. The work will utilise real-world data when available to ensure real-world applicability.

**Develop a model for determining resource allocations to combat cyber-crime.** The aim is to address the challenge of allocating limited resources to different risk-mitigation strategies to minimise the expected future losses of an organisation. This work will build on the valuation model developed in the first mini-project, and will again utilise real-world data when possible.
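As a toy illustration of the first mini-project, the sketch below simulates annual cyber losses under a compound Poisson model (Poisson breach counts, lognormal breach costs) and applies the expected-value premium principle with a safety loading. All parameter values are hypothetical placeholders, not calibrated to any real data:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_annual_losses(n_sims, freq_lambda, sev_mean_log, sev_sd_log):
    """Simulate total annual cyber losses under a compound Poisson model:
    breach counts are Poisson, individual breach costs are lognormal."""
    counts = rng.poisson(freq_lambda, size=n_sims)
    return np.array([rng.lognormal(sev_mean_log, sev_sd_log, size=n).sum()
                     for n in counts])

# Hypothetical parameters: 0.8 breaches/year on average, lognormal severities.
losses = simulate_annual_losses(100_000, 0.8, 10.0, 1.5)

# Expected-value premium principle with a 30% safety loading.
premium = 1.3 * losses.mean()

# A tail measure a regulator might also require (99.5% VaR, Solvency II style).
var_995 = np.quantile(losses, 0.995)
```

Valuing a mitigation strategy would then amount to re-running the simulation with reduced frequency or severity parameters and comparing the resulting premiums.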

[1]. Commonwealth of Australia, Department of the Prime Minister and Cabinet, Australia’s Cybersecurity Strategy (2016), accessed online: https://cybersecuritystrategy.dpmc.gov.au

[2]. Tøndel, I.A., Meland, P.H., Omerovic, A., Gjære, E.A. and Solhaug, B., *Using Cyber-Insurance as a Risk Management Strategy: Knowledge Gaps and Recommendations for Further Research* (2015), Report No. SINTEF A27298, SINTEF ICT, Norway

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

The Least-Squares Monte Carlo algorithm is broadly used to price Bermudan options. It relies on an approximation of the continuation value by orthogonal projection onto a linear subspace spanned by chosen basis functions. A key ingredient in the success of this method is the fact that the orthogonal projection onto a linear space can be computed directly by a matrix inversion. The project is to perform the projection onto a non-linear subset, i.e. to approximate the value function with a set of functions that depend non-linearly on a finite number of parameters. One key to the success of this approach is efficiently solving a complex optimization problem, similar to those encountered in neural networks.
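The linear-subspace version of the algorithm can be sketched in a few lines. Below is a minimal Longstaff-Schwartz pricer for a Bermudan put under Black-Scholes dynamics, using an ordinary polynomial basis; all contract and model parameters are illustrative, not taken from the project:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical contract/model parameters for a Bermudan put.
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
n_steps, n_paths, degree = 50, 50_000, 3
dt = T / n_steps
disc = np.exp(-r * dt)

# Simulate geometric Brownian motion paths (columns are exercise dates).
z = rng.standard_normal((n_paths, n_steps))
S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt
                          + sigma * np.sqrt(dt) * z, axis=1))

# Backward induction: regress discounted continuation values on a polynomial
# basis of the spot price (the linear-subspace projection described above).
cashflow = np.maximum(K - S[:, -1], 0.0)
for t in range(n_steps - 2, -1, -1):
    cashflow *= disc
    itm = K - S[:, t] > 0.0            # regress on in-the-money paths only
    if itm.sum() > degree:
        coeffs = np.polyfit(S[itm, t], cashflow[itm], degree)
        continuation = np.polyval(coeffs, S[itm, t])
        exercise = K - S[itm, t]
        ex_now = exercise > continuation   # exercise where immediate payoff wins
        idx = np.where(itm)[0][ex_now]
        cashflow[idx] = exercise[ex_now]

price = disc * cashflow.mean()
```

The non-linear variant targeted by the project would replace the `np.polyfit` step with a non-linear least-squares fit, which is exactly where the neural-network-style optimisation problem arises.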

Keywords: Least-Square Monte-Carlo, Stochastic Calculus, Black-Scholes equation, Computational Optimization.

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

The celebrated Dupire formula allows one to infer the local volatility from the knowledge of option prices for all strikes and maturities. Since option prices are available only at discrete maturities in real markets, this involves some way of interpolating option prices to non-listed maturities, which in turn often leads to unstable local volatilities. The project is to adopt an alternative approach inspired by Optimal Transport that directly finds a local volatility surface compatible with option prices given at discrete maturities. The project consists of a theoretical study followed by the numerical implementation of the model.
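For orientation, Dupire's formula with zero dividends reads sigma_loc(K, T)^2 = (∂C/∂T + r K ∂C/∂K) / (½ K² ∂²C/∂K²). The sketch below evaluates it by finite differences on a synthetic Black-Scholes call surface with constant volatility, where it should recover that constant; the instability mentioned above appears as soon as the surface is known only at a few discrete maturities and must be interpolated:

```python
import math

# Synthetic market: a Black-Scholes call surface with constant volatility.
S0, r, true_vol = 100.0, 0.03, 0.25

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(K, T):
    """Black-Scholes call price with constant volatility true_vol."""
    sd = true_vol * math.sqrt(T)
    d1 = (math.log(S0 / K) + (r + 0.5 * true_vol**2) * T) / sd
    d2 = d1 - sd
    return S0 * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# Dupire local volatility at (K, T) via central finite differences.
K, T, h, dT = 100.0, 1.0, 0.5, 1e-4
dC_dT = (bs_call(K, T + dT) - bs_call(K, T - dT)) / (2 * dT)
dC_dK = (bs_call(K + h, T) - bs_call(K - h, T)) / (2 * h)
d2C_dK2 = (bs_call(K + h, T) - 2.0 * bs_call(K, T) + bs_call(K - h, T)) / h**2

local_vol = math.sqrt((dC_dT + r * K * dC_dK) / (0.5 * K**2 * d2C_dK2))
# local_vol should be close to true_vol = 0.25 on this synthetic surface.
```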

Keywords: Local volatility models, Optimal Transport, Stochastic Control, Nonlinear PDE’s.

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

Electricity smart meter technology is increasingly being deployed in residential and commercial buildings. The technology facilitates the collection of energy usage data at much finer temporal scales than was previously possible. These smart meters collect energy usage information at half hourly intervals, resulting in over 35 billion half hourly observations per year across all households in the state of Victoria, Australia. Using the geographical data of the households also makes it possible to identify spatial and spatio-temporal patterns. This project will concentrate on research into spatial and spatio-temporal visualisation and inferential methods for cognostics of spatially distributed big time series electricity usage data. The research activity has three main aims:

**Cognostics methods for electricity usage time series data with spatial and spatio-temporal structure.** The aim is to test existing metrics for time series data (e.g. Fulcher, Little, and Jones, 2013) to ascertain whether they provide spatially useful diagnostic statistics once the spatial dimension of the spatially distributed big time series data is accounted for; if not, to propose new metrics that do, and then to extend the methods to provide useful spatio-temporal diagnostic statistics. The methods will need to account for spatially distributed explanatory variables such as demographics, building size and material, and household behavioural patterns. The methods developed will rely on parallel processing using multiple multi-core computers and platforms such as Hadoop, Spark or Tessera.

**Develop visualisation methods for spatial and spatio-temporal cognostics for spatially distributed big time series.** The goal is to provide visualisation methods for supporting the analysis: to find anomalies (possibly errors in the data, or unusual uses), to explore patterns such as clusters of behaviour, and to summarise the behaviour (e.g. Javed et al., 2010). Cognostics provide efficient numerical summaries that can be understood better with plots of the time series, in the spatial context where the data arises. The new challenge for visualisation is handling the volume of data supplied by smart meters, especially to provide interactive graphics (e.g. Cheng et al., 2016). In addition, providing visual explanations helps decision makers understand patterns or changes in patterns. There is a need to develop visualisation tools that can describe associations between different variables (clustering in spatial location, demography, behaviour or building type) that can change over time (e.g. Wickham et al., 2012). New methods are needed to visualise changing spatio-temporal patterns in an intuitive and interactive way.

**Develop inferential methods for spatially distributed large time series electricity usage data.** It is important to determine whether any identified clusters or patterns are indeed statistically meaningful. That is, what confidence do we have that the identified patterns actually exist and are not a random artefact? As the electricity energy usage data will likely exhibit complex multi-seasonality, serial correlation and spatial correlation, making comparisons against permuted observations assumed to be independent will not be appropriate. The permutations against which comparisons are made will need to respect the multi-seasonality, building on the seasonal block bootstrap (Dudek *et al.*, 2014), the serial correlation (Kreiss and Lahiri, 2012) and the spatial correlation (García-Soidán, Menezes and Rubinos, 2014).
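A minimal sketch of the seasonal resampling idea, assuming the simplest variant in which whole seasonal cycles (days of 48 half-hours) are resampled so that each replicate preserves the within-day profile; the seasonal block bootstrap of Dudek et al. (2014) is more general than this:

```python
import numpy as np

rng = np.random.default_rng(1)

def seasonal_block_bootstrap(series, period, n_resamples):
    """Resample whole seasonal cycles (e.g. days of 48 half-hours) so each
    bootstrap replicate preserves the within-period seasonal shape and the
    serial correlation inside each block."""
    n_blocks = len(series) // period
    blocks = series[: n_blocks * period].reshape(n_blocks, period)
    out = []
    for _ in range(n_resamples):
        picks = rng.integers(0, n_blocks, size=n_blocks)
        out.append(blocks[picks].ravel())
    return np.array(out)

# Synthetic half-hourly usage: a daily sinusoidal cycle plus noise, 100 days.
t = np.arange(100 * 48)
usage = 1.0 + 0.5 * np.sin(2 * np.pi * t / 48) + rng.normal(0, 0.1, t.size)

reps = seasonal_block_bootstrap(usage, period=48, n_resamples=200)

# The daily profile survives resampling: the mean over replicates at each
# half-hour tracks the original seasonal mean.
seasonal_mean = reps.reshape(200, 100, 48).mean(axis=(0, 1))
```

A cluster statistic computed on each replicate would then give a null distribution that respects the seasonality, against which the observed statistic can be compared.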

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

In the state of Victoria, Australia, it is state government policy that all households have a smart meter fitted (Victoria State Government, 2015), which has resulted in 97% of the approximately 2.1 million households in Victoria having a smart meter installed in their home. These smart meters collect energy usage information at half hourly intervals, resulting in over 35 billion half hourly observations per year across all households in the state. The introduction of smart meters affords the opportunity to better model and understand residential and business energy usage patterns between months, between days and within days, something that is not possible using only quarterly energy usage information. This is an emerging area of research (Taieb *et al*., 2015, and 2016) but there are considerable challenges as the computational price of processing this extra data can be immense. This project will concentrate on undertaking research to tackle this computational challenge and to link the large smart meter energy usage data set with other smaller datasets, such as demographic datasets, building size and material, behavioural and customer billing information. This research activity has two main aims:

**Develop scalable methods to analyse a large number of time series.** In the first instance the aim will be to develop methods that can analyse many time series. Dimension reduction methods such as functional data analysis (Ramsay and Silverman, 2005), probabilistic methods for approximate matrix decompositions (Halko, Martinsson and Tropp, 2011), state-space approaches that do not require the use of large matrices (Jones, 1993), and sparse matrix approaches (e.g. Furrer, Genton and Nychka, 2006) will be explored. The methodology developed will rely on parallel processing using multiple multi-core computers and platforms such as Hadoop, Spark or Tessera. The methods will be able to link to other data sets that contain explanatory variables, such as demographic datasets, building size and material, behavioural data and customer billing information, for inferential purposes.

**Extend the scalable method developed to analyse a large number of time series to work in real-time or near real-time.** Increasingly, decision makers want to be able to make decisions as new data is acquired or soon after its acquisition. The methods developed will need to be extended to work in real-time or near real-time. This will help with identifying any changes that may require a management response and with forecasting household and business energy usage behaviour. New statistical machine learning approaches (Hastie, Tibshirani, and Friedman, 2009) will be developed or modified to monitor the time series for changes. Two approaches could be adopted: one is to update model parameters as new data is collected; the other is to update model parameters only when a shift in parameter values is detected. Evaluating which approach is computationally feasible will be an important component of this work.

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

Using modelling methods developed by Hyndman and Ullah, this research will examine the ability of the superannuation system to fund retirement given a fixed retirement age of 65, in conjunction with the state pension system. Incorporating longevity forecasts from the Hyndman and Ullah model, we will consider the sustainability of the superannuation system in its current form. Secondly, we will consider the question of what the appropriate retirement age is, given increasing longevity, cost implications and the level of lifestyle required in retirement.

The project would require the use of national death and birth data available from the Human Mortality and Human Fertility Databases in order to forecast working and retired populations. With data from the Department of Human Services, we would also be able to consider appropriate retirement age decisions for different sub-population cohorts, for example considering affordability by gender, socio-economic status or geographical location.

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

Recent decades have witnessed extraordinary improvements in human lifespan. The fact that we are expected to live longer is welcome news, but not one without certain associated risks. From the perspective of an individual, longevity risk is the risk that one might outlive one’s financial resources. At the societal or governmental level, longevity risk is the risk that guaranteed pension benefits might become underfunded, in part due to a sudden increase in human longevity. From the perspective of insurance companies and private or corporate pension plan providers, there is a non-negligible risk that current policyholders and retirees might, on average, live longer than anticipated, making organizations contractually obligated to pay higher aggregate benefits. The grave financial implications of longevity risk can be clearly seen in the IMF report [1]. Therefore, correct modeling and management of longevity risk is of extraordinary importance for the financial stability of all of the parties involved.

[1] Oppers, S., et al. “The financial impact of longevity risk.” *Global Financial Stability Report. International Monetary Fund* (2012).

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

The celebrated Black-Scholes formula was derived under the assumption of constant volatility in stocks. In spite of evidence that this parameter is not constant, the formula is widely used by the markets. It is therefore natural to ask whether a model for the stock price exists such that the Black-Scholes formula holds while the volatility is non-constant. This project attempts to answer the more general question of the existence of alternative models in the theory of stochastic processes and its applications, in particular finance. It will continue similar work on the construction of certain processes (in particular self-similar Markov martingales) with given marginals by Fan, Hamza and Klebaner.

[1] Fan, Jie Yen, Kais Hamza, and Fima Klebaner. “Mimicking self-similar processes.” *Bernoulli* 21.3 (2015): 1341-1360.

[2] Hamza, Kais, and Fima C. Klebaner. “A family of non-Gaussian martingales with Gaussian marginals.” *Journal of Applied Mathematics and Stochastic Analysis*, Volume 2007 (2007), Article ID 92723, 19 pages.

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

Health is a time-dependent process. While medical progress is ensuring an increase in overall life expectancy, ageing populations are exposed to more health problems: the longer people live, the more likely they are to suffer various illnesses. This positive ageing trend has enormous implications for overall resources (financial, environmental, economic), and thus improvements to current models for statistical extremes, for financial and actuarial valuations, and to mathematical models for population dynamics are needed to better understand and predict the implications of current health and ageing trends. The best-developed and most important mathematical models for populations are based on branching processes; financial and actuarial risk analysis is based on probabilistic models, while realistic forecasts are primarily based on statistics and extreme value theory. This project is part of a much wider programme to develop probabilistic and statistical modelling of health and ageing.

[1] Hamza, Kais, Peter Jagers, and Fima C. Klebaner. “On the establishment, persistence, and inevitable extinction of populations.” *Journal of mathematical biology* 72.4 (2016): 797-820.

[2] Hamza, Kais, Peter Jagers, and Fima C. Klebaner. “The age structure of population-dependent general branching processes in environments with a high carrying capacity.” *Proceedings of the Steklov Institute of Mathematics* 282.1 (2013): 90-105.

[3] Jagers, Peter, and Fima C. Klebaner. “Population-size-dependent and age-dependent branching processes.” *Stochastic Processes and their Applications* 87.2 (2000): 235-254.

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

In the field of actuarial science, modelling mortality rates has always been of the utmost importance: the actuarial profession was established around the mathematical principles of predicting and pricing life expectancies. In the late 20th and early 21st centuries, accurately modelling mortality rates has become a key concern of actuaries.

The intersection of actuarial work and financial product development has recently come under increased attention as banks attempt to develop methods to transfer longevity risk from the traditional pension scheme / insurance company / state government to the financial sector more generally. This has taken the form of longevity future and forward contracts, longevity swap contracts and, to a lesser extent, longevity bonds. These instruments, similar in structure to the equivalent interest rate risk products, are designed to provide a way to transfer longevity risk from the holder (pension schemes, life insurance companies, governments etc.) to the banks and capital markets.

This project will consider the feasibility of designing and implementing a European welfare bond to increase the stability of European countries’ state finances as baby boomers reach retirement, healthcare costs increase, unemployment rates fluctuate and social costs vary from country to country.

The research project will broadly cover:

- The identification and modelling of causes of increased welfare costs (retirement costs, unemployment costs, disability costs etc).
- The design of a bond like product that will smooth those costs by transferring some of the risk from individual countries to the issuer of the European bond (e.g. the European central bank).

The project requires a strong mathematical background and a keen interest to work in the interface of financial mathematics and actuarial science. Applicants should have either a background in Mathematical or Actuarial/Financial Sciences or in relevant fields. A combined education in these fields would be an advantage.

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

A Gaussian process treats all sample observations as a single draw from a multivariate normal distribution, with each observation following a univariate marginal normal distribution. This model has drawn great attention in the general academic literature, and offers substantial potential for realistic application to financial asset management because of its high predictive accuracy. So far, this model has not been studied in depth within the financial mathematics community. The project has the potential to generate research outcomes of high impact, and importantly, the research output can be incorporated into portfolio selection processes in the asset management industry.

Three potential stages for this project:

- An analytical solution is derived for single period optimization.
- The model is extended to multi-period (with transaction cost).
- A numerical method is developed to solve general problems.
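For the first stage, the unconstrained single-period mean-variance solution is w* = (1/γ) Σ⁻¹ μ, where μ and Σ would come from the Gaussian-process predictive distribution of next-period asset returns. The sketch below computes these weights; the numbers are hypothetical placeholders standing in for GP outputs:

```python
import numpy as np

# Stage 1 sketch: unconstrained single-period mean-variance weights
#   w* = (1/gamma) * Sigma^{-1} mu,
# where mu and Sigma stand in for the Gaussian-process predictive mean and
# covariance of next-period asset returns (placeholder values below).

gamma = 5.0                              # risk-aversion coefficient
mu = np.array([0.04, 0.06, 0.05])        # hypothetical GP predictive means
Sigma = np.array([[0.04, 0.01, 0.00],    # hypothetical GP predictive covariance
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.0625]])

w = np.linalg.solve(Sigma, mu) / gamma   # optimal unconstrained weights
expected_return = w @ mu
variance = w @ Sigma @ w
```

The multi-period and transaction-cost extensions of the later stages would replace this closed form with a dynamic programming or numerical optimisation step.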

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

Long-term risks such as climate change call into question the resilience of Australian agriculture. Difficult, expensive, and possibly irreversible changes to farming practices will have to be implemented in the face of uncertainty, such as adapting, moving, or abandoning altogether some types of crop. Stochastic optimisation tools such as the real option approach are suitable for assisting decision making under uncertainty, regarding both the type and the timing of changes to be implemented by farmers for resilience. The aim of this project is to identify the main unintended effects of climate uncertainty on agricultural systems, identify the set of mitigation options available to farmers, model the decision-making process under uncertainty mathematically, and finally solve it numerically, so as to be able to propose mitigation policies towards greater resilience of agricultural systems nationwide.

Keywords: Real option, climate change, adaptation, food and water, stochastic programming

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

The phase-type model has been an important probabilistic tool in the analysis of complex stochastic system evolution. The model describes the lifetime distribution of an underlying Markov process making transitions within predetermined states/pools before moving to an absorbing state. In insurance and superannuation, the states may represent the status of an insured individual.

The goal of this project is twofold. First, we want to investigate the use of the conditional phase-type model proposed in [1] and [2] for the calculation of pension and health insurance premiums, taking into account heterogeneity among insured individuals and taking advantage of available data; we expect to see different results between the two models. Secondly, we will look at the possibility of building a GUI platform (in R) that applies the model to real data.
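A minimal numerical sketch of an (unconditional) phase-type lifetime, using a hypothetical three-state disability model: survival is P(lifetime > t) = α exp(Tt) 1 and the mean lifetime is −α T⁻¹ 1. The transition intensities below are illustrative only, not from [1] or [2]:

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical 3-state model: {healthy, ill, severely ill} plus an absorbing
# "dead" state. T holds transition intensities among the transient states;
# each row's (negative) sum gives the exit rate to the absorbing state.
alpha = np.array([1.0, 0.0, 0.0])     # everyone starts healthy
T = np.array([[-0.30,  0.20,  0.05],
              [ 0.10, -0.50,  0.30],
              [ 0.00,  0.05, -0.60]])
ones = np.ones(3)

def survival(t):
    """P(lifetime > t) = alpha @ expm(T t) @ 1."""
    return float(alpha @ expm(T * t) @ ones)

# E[lifetime] = -alpha @ T^{-1} @ 1.
mean_lifetime = float(-alpha @ np.linalg.solve(T, ones))
```

A premium calculation would then discount benefit payments against this survival curve; the conditional model of [1] additionally conditions the phase distribution on observed covariates.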

[1] Surya, B.A. (2016). Markov chains mixture process and its phase-type distributions. Working paper. Victoria University of Wellington, New Zealand.

[2] Zadeh, A.H., Jones, B.L. and Stanford, D.A. (2014). The use of phase-type models for disability insurance calculations, Scand. Actuar. J., 8, 714-728.

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

The age of big data is upon us. Every day we generate vast quantities of data; the question is how to utilise it. From the perspective of property and casualty insurance this question is truly acute. For example, car insurance companies in the USA, Canada and Australia have recently gained access to data generated by telematics devices. As Allianz Australia [1] describes, telematics devices installed in motor vehicles allow remote and real-time tracking of a vehicle’s location, speed, acceleration, deceleration and many other parameters of driving. How to use this data to understand driving behaviour patterns, and how to integrate other sources of data such as local weather and traffic conditions or the driver’s health status at every point in time (available through wearable consumer electronics), is an active area of investigation in the nascent field of telematics insurance. Advanced data analytics techniques such as machine learning, deep learning and clustering can help answer this question. The purpose of this PhD project is to develop predictive analytics algorithms for sophisticated real-time classification of drivers according to their driving patterns and likelihood of accidents. Real-time pricing implications would also be investigated: dynamic reductions in premiums might have significant appeal to many drivers as compensation for access to their relevant real-time data. From the perspective of the insurance company, the ability to assess risk in a more sophisticated way is highly welcome, especially in the context of new capital adequacy rules.

[1] https://www.allianz.com.au/car-insurance/news/use-of-telematics-in-vehicles

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

Mine extraction involves major uncertainties that affect the estimated value of mining projects, most notably grade quality and metal price. Mathematical programming is the standard tool used in the mining industry for valuing mines, as it can accommodate large-scale problems involving a large number of physical constraints. However, this approach is ill-suited to dealing with uncertainty, as its anticipative nature (i.e. assuming exact prior knowledge of the future) can produce overestimated, unattainable valuations that mask the true risk taken by mining companies. By contrast, the alternative real option (stochastic control) approach provides by definition accurate, non-anticipative policies. The aim of this PhD research project is to adapt and improve the latest methodologies in the real option literature for solving realistic mining valuation and operation problems. The objective is to develop and demonstrate the value added by using the real option approach for efficient and sustainable mining projects.

Keywords: Mining, real option, stochastic control, mathematical programming, geostatistics, mathematical finance, simulation

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

**Supervisory Team**:

Athanasios A. Pantelous, Colin O’Hare (Monash University & RiskLab, Australia),

Tim J. Boonen (University of Amsterdam, The Netherlands)

**Proposal**

There is a classic actuarial approach to the pricing of insurance risk. In particular, the expected value premium principle and the risk-based principle have gained particular academic and practitioner interest (see, for instance, Kaas et al. 2008). These principles require an approximation of the underlying distribution of insurance risk (claim sizes and timings), but they do not take into account the other insurers in the market. In the economic theory of industrial organization, competition, between alternative insurers in our case, may drive prices down, as observed in the well-known Bertrand competition. In this project, we aim to model the competition of multiple insurers in a market where the insurance risk is observed and stochastic and all insurers aim to optimize a mean-variance objective function. In particular, we determine the prices as the Nash equilibria of the market. Focusing initially on a two-period model, we wish to see whether competition indeed leads to lower prices and more insured risk for the insurer. This will allow us to simulate the distribution of the net asset value (also called basic own funds in Solvency II regulation) of the insurance companies. We also hypothesize that diversification of insurance policies yields an equilibrium in which one insurer attracts almost all policyholders. If the insured risk is too high, there is an interest for the policyholders (and thus the regulator) to introduce competition constraints (such as one preventing a monopoly). This trade-off between regulation and diversification is non-trivial.

Our approach extends the deterministic approaches of Taylor (1986) and Wu and Pantelous (2017), where there is no uncertainty in the pay-out of insurance policies; in a deterministic setting, the (risk-based) regulation is essentially irrelevant. Next, we aim to extend our results to a dynamic continuous-time setting. In this setting, open-loop Nash equilibria are studied to determine the premium profile, extending the approach of Boonen et al. (2018) to the case of stochastic insurance risk. This part is computationally more advanced than the first, but it may allow us to understand what pricing profiles can be expected in stochastic insurance markets. For instance, Boonen et al. (2018) provide a deterministic example with premium cycles. Premium cycles are well-studied empirically, and exist in some insurance markets.

**Bibliography**

- Boonen, T.J., Pantelous, A.A., Wu, R. Non-cooperative dynamic games for general insurance markets, *Insurance: Mathematics and Economics*, 78 (2018), 123-135.
- Kaas, R., Goovaerts, M., Dhaene, J. and Denuit, M. *Modern actuarial risk theory: using R* (Vol. 128). Springer Science & Business Media, 2008.
- Taylor, G.C. Underwriting strategy in a competitive insurance environment. *Insurance: Mathematics and Economics*, 5 (1986), 59-77.
- Wu, R., Pantelous, A.A. Potential games with aggregation in non-cooperative general insurance markets, *ASTIN Bulletin*, 34(1) (2017), 269-302.

Commencement Date: Anytime 2018, 3 year scholarship (Monash University or Data61 CSIRO)

**Supervisory Team:**

Athanasios A. Pantelous and Colin O’Hare (Monash University & RiskLab, Australia),

Tim J. Boonen (University of Amsterdam, The Netherlands)

**Proposal**

Climate change is an important underlying risk factor that might have an impact on the insurance industry, pension funds, the financial sector (as a significant proportion of the financial markets is driven by pension funds), governmental agencies, and decision and policy makers. In the insurance industry, strong market competition has boosted the demand for a competitive premium, where competition drives premiums down. The insurance premium has a substantial impact on actuarial reserving calculations and on the implementations by the regulatory authorities.

In this project, we have the following objectives. First, we want to identify and categorize the major risk segments for life and non-life insurers, and understand the way these risk segments are affected by climate change. Following this, the impact of uncertainty on the various parameters involved in the applied model will be examined. Secondly, we will model premium dynamics via differential games, and study the insurers’ equilibrium premium dynamics in a competitive market. In this regard, not only will different tools from optimal control, mathematical programming and economic theory be applied to determine the equilibrium premium strategies, but we will also consider different market conditions. Finally, we will check which of the parameters involved in the model are most sensitive to climate change, focusing on how uncertainty in these parameters may impact insurance equilibrium pricing and reserving. Additionally, we will demonstrate the impact of these projections on various financial calculations, and will provide a number of ways of quantifying, both graphically and numerically, the model risk in such calculations.

For the purpose of our study, data from the Australian and EU insurance markets will be considered. We focus on temperature changes as our proxy to climate change, while extensions can be obtained by including other important climate change factors such as (lack of) rainfall. A large number of simulation case studies will be conducted. In particular, an objective of our study is to quantify the sensitivity of the profit of the insurers towards the uncertainty of the long-term trend of climate change.

**Selected References**:

- Bobb, J.F., Peng, R.D., Bell, M.L., Dominici, F. Heat-related mortality and adaptation to heat in the United States. *Environmental Health Perspectives*, 122 (8) (2014), 811-816.
- Boonen, T.J., Pantelous, A.A., Wu, R. Non-cooperative dynamic games for general insurance markets, *Insurance: Mathematics and Economics*, 78 (2018), 123-135.
- Seklecka, M., Pantelous, A.A., O’Hare, C. Mortality effects of temperature changes in the United Kingdom, *Journal of Forecasting*, 36 (2017), 824-841.
- Vardoulakis, S., Dear, K., Hajat, S., Heaviside, C., Eggen, B., McMichael, A.J. Comparative assessment of the effects of climate change on heat- and cold-related mortality in the United Kingdom and Australia. *Environmental Health Perspectives*, 122 (12) (2014), 1285-1293.
- Wu, R., Pantelous, A.A. Potential games with aggregation in non-cooperative general insurance markets, *ASTIN Bulletin*, 34(1) (2017), 269-302.