A good many years ago, I ran a finance company with a specialist insurance agency. There was sufficient volume of business to consider forming a captive insurance company. This would increase margins by reducing the number of intermediaries between customer and underwriter, and give us direct access to the wholesale insurance market.

The captive operation required an insurance licence, and the key requirement for obtaining the licence was a financial model proving the concept to the regulator. I'm going to talk about this model, because it used some unusual techniques and the insurance industry itself (the Lloyd's market) had some fascinating aspects which emerged during my research.

*Dealing with probability and uncertainty*

The first difference between insurance and normal businesses is that margins can't be assumed. Usually, cost of sales can be estimated fairly accurately. In the insurance business, claims are the cost of sales, and claims are inherently uncertain. This unpredictability is the reason we need insurance in the first place.

To deal with the uncertainty, I assumed a portfolio of assets at risk, estimated the average loss rate and assumed a normal distribution of claims above and below the average. I then used a random number generator to select a loss profile for each iteration of the model. This is a simulation technique that I have seldom found a need for since.

There were some weaknesses in the loss estimating process because of the surprising lack of statistical data - more on this later. However, the average loss rate could be inferred from the premium rates, which were long established, and the regulator seemed happy with the assumed standard deviation.

So far so good - the claim costs were calculated for each iteration and the premium income came direct from the assumed portfolio of assets at risk.
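The iteration step described above can be sketched in a few lines of modern code (the original model was a spreadsheet). All figures here are hypothetical placeholders - the actual portfolio size, premium rate and standard deviation are not reproduced in this article:

```python
import random

# Hypothetical placeholder figures; the real portfolio, rates and
# standard deviation are not given in the article.
PORTFOLIO_VALUE = 100_000_000   # total assets at risk
PREMIUM_RATE = 0.01             # long-established premium rate
AVG_LOSS_RATE = 0.006           # average loss rate, inferred from premium rates
LOSS_RATE_SD = 0.002            # assumed standard deviation of the loss rate

def simulate_year(rng):
    """One iteration: premium income comes straight from the portfolio,
    and claims are drawn from a normal distribution around the average."""
    premium = PORTFOLIO_VALUE * PREMIUM_RATE
    loss_rate = max(0.0, rng.gauss(AVG_LOSS_RATE, LOSS_RATE_SD))  # no negative claims
    claims = PORTFOLIO_VALUE * loss_rate
    return {"premium": premium, "claims": claims, "result": premium - claims}

rng = random.Random(42)
years = [simulate_year(rng) for _ in range(50)]
```

Each recalculation of the spreadsheet corresponded to one call of `simulate_year` here: a fresh random loss profile against a fixed premium income.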

*Insurance - the value proposition*

It's useful at this point to step back and assess the insurance industry as one would for any new business.

First, what need is being addressed? Clearly, people would like to eliminate any possibility of a really serious or catastrophic loss, even one of very low probability. Peace of mind is something people will pay for.

Second, what value proposition is being delivered? The insurance market is a risk pooling mechanism that gives individuals access to the average loss rate for the whole risk pool. People can choose to pay something related to the average loss every year, rather than risk losing their entire asset in any particular year.

This pooling mechanism is costly to set up and manage, and good profits are essential to its continued existence in the private sector. Therefore, premiums will always exceed losses, and this fact raises other issues.

*When to insure?*

Since premiums always exceed losses, whether to insure at all is a question large businesses need to address, and answering it gives further insights into the insurance industry.

There are some conditions that need to be present for insurance to make financial sense for the customer:

- The potential loss should be serious - threatening the existence or profitability of the business
- The probability of the event should be low

One large company, however, had three premises that were critical to around 30% of its total revenues. Losing this revenue would have been very serious, so the risk was insured.

*Reinsurance*

Back to the model.

We have our revenues and our uncertain cost of sales - profits in some years but unsustainable losses in others. To ensure its long-term viability, the captive insurance company has to plug into the pooling mechanism of the insurance market and spread the risks it assumes. This is what the regulator was checking into and is what the model had to demonstrate.

Risk sharing was achieved through a suite of reinsurance treaties, which I will describe below. This is the wholesale insurance market - insurance companies insuring each other. It is a whole new world of specialist policies, concepts and language that the ordinary business person never has reason to see.

**Quota Share** reinsurance - the reinsurer takes a percentage of each and every claim - up to a specified maximum for any individual claim - and receives the same percentage of the premium, less a generous discount for cost of business acquisition and administration. This is the working level of reinsurance and is simple to model.

**Risk XL** (excess of loss) reinsurance, in three bands. The reinsurer pays every individual loss between upper and lower limits (after earlier reinsurance has paid out). The reinsurance premium is a minimum percentage of total premium income, increased if needed so that the reinsurance premium is at least the average burning cost (claims) over a stated period of years, plus a margin for costs and profit.

There were separate specialist Risk XL underwriters for the different risk levels. The First XL reinsurer expected losses every year and had the lowest averaging period. The higher level XL specialists had longer averaging periods and expected few or very few claims.
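As a sketch of how the two treaties interact on a single claim - the percentage, per-claim cap and layer limits below are hypothetical illustrations, not the actual treaty terms:

```python
def quota_share(claim, pct=0.5, per_claim_cap=500_000):
    """Quota Share: the reinsurer pays a fixed percentage of every claim,
    up to a specified maximum for any individual claim."""
    return min(claim * pct, per_claim_cap)

def risk_xl(claim_after_qs, layers):
    """Risk XL: each layer pays the slice of the remaining claim that falls
    between its lower and upper limits."""
    return [min(max(claim_after_qs - lower, 0), upper - lower)
            for lower, upper in layers]

# Hypothetical bands: the first XL attaches low and sees claims most years,
# the higher layers attach where claims are rare.
LAYERS = [(100_000, 500_000), (500_000, 2_000_000), (2_000_000, 10_000_000)]

claim = 1_200_000
qs_recovery = quota_share(claim)        # capped at 500,000
remaining = claim - qs_recovery         # 700,000 flows into the XL programme
xl_recoveries = risk_xl(remaining, LAYERS)
retained = remaining - sum(xl_recoveries)  # what the company keeps net
```

With these placeholder terms, the company's net retention on any large claim is whatever falls below the first XL attachment point after the quota share has paid.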

As I started my research, I expected to find plenty of statistical evidence for the premium ratings. However, I found none. The underwriters simply used loss experience gained over many years, and a cost-plus-margin approach to pricing.

Modelling Risk XL required many years of claims history and I achieved this by using the Escalator Table, which I describe later.

Quota Share and Risk XL were the main reinsurance treaties. There were several others to cover various eventualities.

**Catastrophe** reinsurance covered claims where a single event, like a fire or a plane crash, wiped out a number of insured assets at the same time. The premium principle was the same as for Risk XL.

**Stop Loss** reinsurance covered a situation where total claims exceeded total premium income by a small margin, and covered losses from this point up to say 150% of total premium income. The reinsurance premium was again a percentage of total premium income.

A proper set of reinsurance treaties should prevent the insurance company running out of cash no matter what losses emerge.
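The Stop Loss treaty is simple to express in code. The attachment point below is a hypothetical reading of "a small margin" above premium income; the 150% exhaustion point comes from the description above:

```python
def stop_loss_recovery(total_claims, total_premium,
                       attach_pct=1.05, exhaust_pct=1.50):
    """Stop Loss: pays aggregate losses from just above total premium income
    (here assumed 105%) up to 150% of total premium income."""
    attach = total_premium * attach_pct
    exhaust = total_premium * exhaust_pct
    return min(max(total_claims - attach, 0.0), exhaust - attach)

# A year where claims reach 130% of premium: the treaty picks up the
# slice between 105% and 130%.
recovery = stop_loss_recovery(total_claims=1_300_000, total_premium=1_000_000)
```

Below the attachment point the treaty pays nothing; beyond the exhaustion point the company is on its own again, which is why the whole suite of treaties has to work together.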

Again, I expected to find detailed statistical workings being used by underwriters to prove their reinsurance strategies. Instead, I was offered a single rule of thumb, which was clearly the result of centuries of experience in the Lloyd's insurance market, dating back to the days of marine policies written in Lloyd's coffee house. The rule was: "never allow any single retained loss to exceed 4% of your total premium income". Remarkably simple. When I tested it with my model, it proved reliable. If I allowed a situation to exist where one loss could exceed 4% of premium income, then sooner or later, with enough iterations, the company became insolvent. Otherwise, it never did.
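The rule itself is trivially checkable; the figures below are hypothetical illustrations:

```python
def passes_four_percent_rule(max_retained_single_loss, total_premium_income):
    """The Lloyd's rule of thumb: never allow any single retained loss
    to exceed 4% of total premium income."""
    return max_retained_single_loss <= 0.04 * total_premium_income

# If, say, the largest possible net retention on one claim were 100,000,
# the rule would demand at least 2.5m of premium income.
ok = passes_four_percent_rule(100_000, 2_500_000)
```

The interesting part is not the arithmetic but the empirical finding: in the simulation, breaching this single constraint was eventually fatal, and respecting it never was.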

*The Escalator Table*

To run the model over many iterations (years), I used what I will term an Escalator Table, since a down escalator gives a visual analogy of how it works.

When my model was complete - with administration costs, profit and loss account, balance sheet, cash flow, reinsurance calculations, etc. - I set up a table with columns for all the main figures, including cash balances and reinsurance claims.

The top row of the table picked up these figures every time the model was recalculated. Every recalculation produced a new claims scenario and a new profit or loss and cash flow. The top row of the table was the top step of the escalator.

I then copied the whole table and pasted it down one row. A new recalculation produced a new top row. Repeating this gave a further year's experience, with each earlier year moving further down the escalator.

I used a 50-year escalator. When the initial year reached the bottom, it disappeared when overwritten. I created a simple macro to repeat the process 50 times and make a completely new 50-year simulation. In the technology of the day, the simulations would take several minutes to work through.

The reinsurance premiums were calculated using the escalator table figures for the claims history. The cash and profit figures for the balance sheet picked up their opening figures from the previous year in the table.
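The escalator mechanics translate naturally into a rolling buffer. This sketch (all figures hypothetical, and the reinsurance simplified to a single XL layer) combines the escalator with burning-cost rating: each pass adds a new top row, pushes history down, drops the oldest year off the bottom, and prices the treaty from the recent rows:

```python
import random
from collections import deque

HISTORY_YEARS = 50        # length of the escalator
AVERAGING_PERIOD = 5      # burning-cost window for the XL layer (assumed)
XL_ATTACHMENT = 800_000   # claims above this are ceded to the reinsurer (assumed)

# Each row holds one year's figures; appendleft() pushes old rows down,
# and deque(maxlen=...) drops the oldest off the bottom, like the spreadsheet.
escalator = deque(maxlen=HISTORY_YEARS)

def xl_premium(total_premium, min_pct=0.02, margin=1.25):
    """Burning-cost rating: a minimum percentage of premium income, raised to
    the recent average of layer claims plus a margin when that is higher."""
    recent = [row["xl_claims"] for row in list(escalator)[:AVERAGING_PERIOD]]
    burning_cost = sum(recent) / len(recent) if recent else 0.0
    return max(total_premium * min_pct, burning_cost * margin)

rng = random.Random(1)
cash = 500_000  # opening cash balance (assumed)
for _ in range(HISTORY_YEARS):
    premium = 1_000_000
    claims = max(0.0, rng.gauss(600_000, 200_000))
    xl_claims = max(claims - XL_ATTACHMENT, 0.0)   # recovered from the reinsurer
    ri_premium = xl_premium(premium)
    cash += premium - (claims - xl_claims) - ri_premium  # opening figure from prior row
    escalator.appendleft({"claims": claims, "xl_claims": xl_claims,
                          "ri_premium": ri_premium, "cash": cash})
```

A large loss shows up as a spike in `xl_claims`, inflates the burning-cost average, and raises `ri_premium` for the following years until the spike rolls out of the averaging window - exactly the behaviour described below.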

It was fascinating to watch the years roll out when the simulation was running. Large losses would appear, the reinsurance costs would go up and the profits would be affected. Then the reinsurance claims would drop out of the average calculation and things would improve again. When properly set up, the cash balances would stay healthy.

The model was ultimately audited by one of the large accounting firms, the application went in and the insurance licence was obtained.