Actuarial Models meet the cloud – a perfect marriage?
This article discusses the benefits of migrating actuarial models to the cloud.
Editorial Comment from The Digital Insurer: In this guest article Dennis Stanley, from Milliman, shares the results being achieved by an early adopter of cloud-based computing for actuarial models. It seems inevitable that this type of intensive, calculation-dense computing, with its large peaks and troughs in demand for processors, can and should migrate to cloud-based solutions.
The cost of actuarial modelling has been rising
In the early 1980s, life insurance company actuarial models migrated from the mainframe computer to the personal computer. This transition eliminated the need for a controlled environment data centre and corporate IT professionals to manage the actuarial computing infrastructure. The annual operating cost of this actuarial computing infrastructure was perhaps $5,000 (US).
Over time, the actuarial models became increasingly compute-intensive. The actuarial department was typically the first in line to acquire the most powerful machines produced by the hardware vendors. The annual operating cost of this actuarial computing infrastructure grew to say $50,000.
During the past 15 years, the actuarial models have migrated from processing a single best estimate scenario to processing thousands of stochastic scenarios. This huge increase in calculation intensity has led life insurance companies to invest in clusters of compute-intensive servers with 500 or more computing cores. This type of actuarial computing infrastructure requires a controlled environment data centre that is managed by corporate IT professionals. The annual operating cost for this actuarial computing infrastructure has grown to $500,000 or more.
Peak Capacity – Actuarial models do not have a constant demand for processing power
The IT infrastructure to support actuarial models has come full circle over the past 30 years. Back in the mainframe days, the actuarial models were consuming excess capacity. Today, the actuarial models frequently require special purpose hardware that is significantly underutilised, because the demand for running the actuarial models comes in seasonal peaks.
The following diagram illustrates that significant cost savings can be realised if the actuarial computing infrastructure is more fully utilised.
Over the past three years, Milliman and a large UK life insurance company have collaborated to eliminate this degree of underutilised IT infrastructure by migrating the actuarial models to Microsoft Windows Azure. Using the Windows Azure platform, the company routinely meets peak demand by scaling their actuarial computing infrastructure from 100 cores to 12,000 cores. This is achieved by an authorised technician “flipping the switch”.
By using Windows Azure, the life insurance company shifts the utilisation management responsibility to Microsoft. However, when you consider that Microsoft has millions of processing cores hosted in eight data centres around the globe, you quickly realise high utilisation is much easier to achieve with scale.
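The utilisation argument above can be made concrete with a back-of-the-envelope calculation. The sketch below is illustrative only: the on-premises cost and core count come from the figures quoted earlier in this article, while the utilisation rate and cloud price per core-hour are hypothetical assumptions.

```python
# Illustrative comparison of an owned cluster vs pay-per-use cloud.
# Values marked "assumption" are hypothetical, not from the article.

OWNED_ANNUAL_COST = 500_000      # article's estimate for a ~500-core cluster
OWNED_CORES = 500
HOURS_PER_YEAR = 24 * 365

# Suppose the cluster is busy only during seasonal reporting peaks,
# say 15% of the year (assumption).
utilisation = 0.15
busy_core_hours = OWNED_CORES * HOURS_PER_YEAR * utilisation

# Effective cost per *useful* core-hour on owned hardware:
owned_cost_per_core_hour = OWNED_ANNUAL_COST / busy_core_hours

# A pay-per-use price of $0.08 per core-hour (assumption) is
# charged only for the hours actually consumed:
CLOUD_PRICE = 0.08
cloud_annual_cost = busy_core_hours * CLOUD_PRICE

print(f"owned: ${owned_cost_per_core_hour:.2f} per useful core-hour")
print(f"cloud: ${cloud_annual_cost:,.0f} per year for the same work")
```

At low utilisation the owned cluster costs far more per useful core-hour than the pay-per-use alternative; as utilisation rises toward 100%, the gap narrows, which is exactly why a provider pooling demand across many customers can run the same hardware more economically.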
Testing massive capacity
In September 2012, Milliman collaborated with Microsoft to test a compute-intensive actuarial model using Windows Azure. Key statistics/parameters for this test included the following:
- Processing launched from a single data centre and the work distributed to six global data centres
- 5,380 jobs
- 1,000 economic scenarios for each job
- 45,500 computing cores
- 20 hour run time
- 260 terabytes of input data transfer
- 260 terabytes of output data transfer
Scenario for the Milliman Microsoft “massive” test (diagram)
There were two important results for this test. First, the success rate was 100%. Of course, this is what you would expect from a mission critical IT application. Second, the same 5,380 jobs could be processed in 200 hours using 4,550 computing cores: roughly 910,000 core-hours of work in either case. This linear scalability means that actuarial models are ideally suited for cloud computing.
The cost to access cloud compute resource to process this job would be in the region of US$75,000. Traditional infrastructure with a US$1 million annualised cost of ownership would process this job in approximately 30 days. Thus the cloud provides a cost-effective option to process large-scale, time-sensitive jobs.
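The linear-scalability claim can be checked directly from the test figures: total work in core-hours is the same however the job is sliced. A minimal sketch using the numbers quoted above (the pro-rata traditional cost is an illustrative derivation, not a figure from the article):

```python
# Core-hours are conserved under linear scaling: the same 5,380 jobs
# take 20 hours on 45,500 cores or 200 hours on 4,550 cores.
fast_core_hours = 45_500 * 20    # 910,000 core-hours
slow_core_hours = 4_550 * 200    # 910,000 core-hours
assert fast_core_hours == slow_core_hours

# Implied cloud price from the article's ~US$75,000 estimate:
cloud_cost = 75_000
price_per_core_hour = cloud_cost / fast_core_hours   # ~$0.08

# Traditional infrastructure: ~US$1m per year, ~30 days for this job,
# so roughly 30/365 of the annual cost is consumed (illustrative):
traditional_cost_for_job = 1_000_000 * 30 / 365
print(f"~${price_per_core_hour:.3f} per core-hour; "
      f"traditional pro-rata cost ~${traditional_cost_for_job:,.0f}")
```

On these figures the cloud run costs about the same as 30 days of the traditional cluster, but finishes in 20 hours rather than a month, which is the real point of the comparison.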
Increasing computing demands from actuarial models
Numerous initiatives are underway that will result in actuarial models requiring increasing computing capacity. The common thread for these initiatives is the transition from deterministic models to stochastic models. These initiatives include the following:
- Enterprise risk management and economic capital models
- European Union Solvency II capital models
- United States principles based statutory reserves
- International Accounting Standards Board IFRS 4 Phase 2 valuation
- Financial Accounting Standards Board’s version of IFRS for US GAAP
The frequency that the actuarial models are run is also increasing. Today, the industry best practice is quarterly financial reporting based upon a complete run of the actuarial models.
But is the cloud safe?
A Microsoft executive recently spoke at a Milliman client seminar. He summarised the top six challenges/questions he receives from insurance companies considering the cloud:
Of course, each company will need to conduct its own cloud security assessment. However, we believe the overarching question should be “what steps must we take to be comfortable with cloud security?”.
While the cost savings associated with migrating actuarial models to the cloud are significant, the true value is flexibility. Suppose the financial reporting cycle has a 5-day window for the actuarial model processing. At the end of day 2, the company recognises that there is bad input data and a restart is necessary. With a cloud solution the company has the option to increase the amount of compute resource and still achieve the 5-day window.
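Assuming the linear scalability demonstrated in the massive test, the flexibility benefit can be quantified: a run restarted at the end of day 2 must fit five days of work into the three days remaining, so the core count scales up in proportion. A hypothetical sketch (the baseline core count is an assumption for illustration):

```python
# Hypothetical restart scenario: a 5-day reporting window, with the
# full model run restarted from scratch at the end of day 2.
baseline_cores = 1_000          # assumption: cores sized for a 5-day run
window_days = 5
days_remaining = window_days - 2   # restart leaves 3 days

# Total work is fixed (core-days), so under linear scaling the
# restarted run needs window/remaining times as many cores.
required_cores = baseline_cores * window_days / days_remaining
print(f"scale up from {baseline_cores} to {required_cores:.0f} cores")
# With elastic cloud capacity this is a configuration change; with a
# fixed on-premises cluster, the reporting deadline is simply missed.
```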
Over the next 5 years, Milliman expects that most large life insurance companies will have migrated their compute-intensive actuarial models to the cloud.