Special Classification

Territorial Ratemaking

Challenges

  • Location is highly correlated with other rating variables \(\implies\) use multivariate techniques
  • Territories are often defined as such small areas that the data in each territory may not be credible

Steps

  1. Establish territorial boundaries
    • Define the basic geographic unit (e.g., zip codes)
  2. Determine indicated rates for each territory (preferably using GLMs, to account for the correlation of geography with other rating variables)
    • Estimate the geographic systematic risk, distinguish it from:
      • random noise
      • systematic risk of other correlated non-geographic rating variables

Spatial Smoothing Techniques

GLMs can incorporate geo-physical variables (e.g., average rainfall) and geo-demographic variables (e.g., population density).

These explain part of the geographic variation in loss experience. The variation they leave unexplained is captured by creating a new residual variable.

Spatial smoothing techniques are applied to this residual variable to smooth results across neighboring geographic areas.

  1. Distance Based (see the sketch after this list)
    • Current geographic unit's data is credibility-weighted with data from other geographic units
    • Weights diminish with distance
    • ADV: Easy to understand and implement
    • DIS
      • Assumes distance has the same impact for urban and rural risks
      • Doesn't consider physical boundaries (e.g. rivers)
    • \(\implies\) Best suited for weather perils, since weather isn't constrained by physical boundaries (air distance)
  2. Adjacency Based
    • As above, the current unit's data is credibility-weighted with data from adjacent units
    • Weights diminish with each wider ring of adjacency
    • ADV
      • Better reflects urban vs. rural differences
      • Better accounts for physical boundaries
    • \(\implies\) Best suited for socio-demographic perils (e.g. theft)
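A minimal sketch of distance-based smoothing, assuming unit centroids and GLM residuals as inputs; the exponential decay and the `scale` parameter are illustrative choices, not from the source:

```python
import numpy as np

def distance_smooth(coords, residuals, scale=10.0):
    """Credibility-weight each unit's residual with nearby units' residuals."""
    coords = np.asarray(coords, dtype=float)
    residuals = np.asarray(residuals, dtype=float)
    # Pairwise "air distances" between geographic-unit centroids.
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    # Weights diminish with distance; each unit keeps full weight on itself.
    weights = np.exp(-dist / scale)
    # Smoothed residual = weighted average of all units' residuals.
    return (weights @ residuals) / weights.sum(axis=1)

# Three units: the first two are close, the third is remote, so the first
# two are pulled toward each other far more than toward the third.
print(distance_smooth([(0, 0), (5, 0), (50, 0)], [0.10, -0.20, 0.30]))
```

An adjacency-based version would replace `dist` with ring counts from a neighbor graph, so units borrow only from their (nearby, similar) neighbors.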

Balance Over- and Under-smoothing

  • Over-smoothing: using unrelated experience from distant areas
  • Under-smoothing: giving sparse local data too much credibility

Basic geographic units can be grouped into territories by:
- Quantile methods
- Similarity methods


Increased Limits Ratemaking

  • We are basically computing \(\dfrac{\text{Increased Limits Losses}}{\text{Basic Limits Losses}}\), where the numerator and denominator are each \(\text{Freq}\times \text{Sev}\times \text{ULAE loading}\)

  • When we compute \(LAS(250k) = LAS(100k)+P(X>100k)\times LAS(150k\text{ xs }100k)\):

    • \(P(X>100k)\) answers "what portion of losses exceeded 100k, among policies whose limits were above 100k?" (only those policies can observe the excess)
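A quick sketch verifying that layering identity on hypothetical ground-up losses (all assumed to come from policies with limits of at least 250k, so the excess layer is observable):

```python
import numpy as np

losses = np.array([20_000, 80_000, 120_000, 260_000, 400_000])  # hypothetical

def las(x, limit):
    """Limited average severity: mean of losses capped at `limit`."""
    return np.minimum(x, limit).mean()

p_excess = (losses > 100_000).mean()            # P(X > 100k)
layer = losses[losses > 100_000] - 100_000      # portions entering the layer
las_layer = np.minimum(layer, 150_000).mean()   # LAS(150k xs 100k)

# LAS(250k) = LAS(100k) + P(X > 100k) * LAS(150k xs 100k)
assert np.isclose(las(losses, 250_000),
                  las(losses, 100_000) + p_excess * las_layer)
```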

ILF vs GLM

  • The ILF approach assumes frequency is the same for all limits; a GLM does not require that assumption.
  • GLM results may be counter-intuitive.
    • If the GLM indication is lower for 250k than for 100k, it says "charge a lower premium for a higher limit" \(\implies\) nonsense
    • This can happen when insureds who choose the 250k limit happen to have lower frequency

Deductible Pricing

Deductible applies to Loss & ALAE

  • When the loss distribution is given, the indicated deductible relativity is the excess ratio:
\[ \text{Ind. Rel} = \text{Excess Ratio}(D) = 1- LER(D) \]
  • The loss elimination ratio is the portion of ground-up loss & LAE eliminated by the deductible:
\[ LER(D) = \dfrac{\text{L\&LAE below D}}{\text{Ground-Up L\&LAE}} \]
  • Given only a loss distribution, the LER can still be computed: treat frequency as counts of losses, cap each loss at \(D\) for the numerator, and use total ground-up losses in the denominator (see the sketch after the table below).

  • Watch the base… learn to read the table. Each row is a different coverage type (policy deductible); each column restates the losses net of a given deductible.

| Policy Deductible | Net Reported Losses | Net Reported Losses Assuming a $250 Deductible | Net Reported Losses Assuming a $500 Deductible |
|---|---|---|---|
| Full Coverage | $680,000 | $590,000 | $525,000 |
| $250 | $2,900,000 | $2,900,000 | $2,600,000 |
| $500 | $5,200,000 | N/A | $5,200,000 |
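As referenced above, a minimal sketch of computing \(LER(D)\) directly from a discrete loss distribution; the loss sizes and probabilities are hypothetical:

```python
# Hypothetical discrete ground-up loss distribution.
sizes = [100, 300, 1_000, 5_000]   # loss sizes
probs = [0.40, 0.30, 0.20, 0.10]   # probability of each size, given a loss

def ler(deductible):
    """Loss elimination ratio: losses eliminated by D / ground-up losses.

    Frequency cancels, so per-loss expected values are enough.
    """
    eliminated = sum(p * min(s, deductible) for s, p in zip(sizes, probs))
    ground_up = sum(p * s for s, p in zip(sizes, probs))
    return eliminated / ground_up

print(ler(250))       # LER(250)
print(1 - ler(250))   # excess ratio = indicated deductible relativity
```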

Solve to learn

  • 2003 Exam 5 - Q38

True Ground-up unknown

Notation:

  • \(L_{D}:\) Losses when deductible is \(D\),
  • \(L_{B}:\) Losses when base deductible is \(B\) (where \(B\) can be \(=0\), meaning ground-up losses)
\[ LER(D,\text{with base }B) = \dfrac{\sum(L_{B} - L_{D})}{\sum L_{B}} \]

So, note that if \(B=0\) then it is simply \(LER(D)\).
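For example, using the table above with the $250-deductible policies as the base:

\[ LER(500,\text{ base }250) = \dfrac{2{,}900{,}000 - 2{,}600{,}000}{2{,}900{,}000} \approx 10.3\% \]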


Work Comp Size of Risk

  • Fixed expenses as a % of premium should decrease as the size of the policy increases.
  • This ensures that small risks are not undercharged and large risks are not overcharged for fixed expenses.

Component #1: Expense Constant (Expense Fee), accounts for expenses that don't vary by the size of risk

Component #2: Premium Discount, applies a % discount to larger policies to recognize that expenses are a lower % of the premium.

Premium Discount

  • Truly variable provisions (e.g., taxes and profit) remain a constant % of premium across all size ranges… e.g., 3% and 5% throughout.
  • Sum all the expense percentages within each premium size range (the total expense % decreases as the ranges get larger)
  • Expense reduction = smallest range's total expense % \(-\) this range's total expense %
  • Discount % = \(\dfrac{\text{Expense Reduction}}{1- \text{Truly variable }\%}\)
  • Premium discount = Standard premium in each range \(\times\) that range's Discount %
    • e.g., for a policy with $400,000 in standard premium,
    • multiply the standard premium falling in each range by that range's discount percentage,
    • and sum these to get the premium discount (see the sketch below).
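A small sketch of that calculation; the premium ranges and expense percentages are hypothetical, with taxes and profit held constant at 3% + 5% = 8%:

```python
# Hypothetical premium size ranges: (range start, range end, total expense %).
# Total expense % decreases as the ranges get larger.
ranges = [
    (0,       10_000,    0.25),
    (10_000,  200_000,   0.20),
    (200_000, 1_750_000, 0.15),
]
TRULY_VARIABLE = 0.03 + 0.05   # taxes + profit, constant across ranges
BASE_EXPENSE = ranges[0][2]    # total expense % of the smallest range

def premium_discount(standard_premium):
    """Sum each range's discount over the premium falling in that range."""
    discount = 0.0
    for lo, hi, expense in ranges:
        in_range = max(0.0, min(standard_premium, hi) - lo)
        # Discount % = expense reduction / (1 - truly variable %).
        discount_pct = (BASE_EXPENSE - expense) / (1 - TRULY_VARIABLE)
        discount += in_range * discount_pct
    return discount

print(premium_discount(400_000))  # e.g., the $400,000 policy above
```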

Loss Constant

Reasons why smaller risks generally show higher loss ratios

  • Less sophisticated safety/training programs \(\implies\) higher frequency/severity
  • They often aren't impacted by (don't qualify for) experience rating, so they have less incentive to mitigate losses

Equate the loss ratios of the small risks to that of the large risks

\[ \dfrac{\text{Small Risk Losses}}{\text{Small Risk Premium} + \#\text{Small Policies} \times \text{LC}} = \text{Large Risk Loss Ratio} \]

e.g., with $63,000 of small-risk losses, $75,000 of small-risk premium, 100 small policies, and a large-risk loss ratio of 72%:

\[ \dfrac{63{,}000}{75{,}000 + 100 \times LC} = 72\% \implies 100 \times LC = \dfrac{63{,}000}{0.72} - 75{,}000 = 12{,}500 \implies LC = 125 \]

Insurance to Value

Coverage Amt = Face Amt

\[ ITV = \dfrac{\text{Coverage Amt. Selected}}{\text{Replacement Cost}} = \dfrac{\text{Insurance}}{\text{Value}} \]

We talk about "rate per $100" (of coverage) and its calculation in some of the problems. Basically, this is the rate that would be charged if there were no expenses and we didn't need profit.
- In other words, how much you should pay per unit of coverage: \(\dfrac{\text{Premium}}{\text{Coverage}}\)

Sometimes we deal with "percentage of total value". This is simple: the numerator (expected losses) and the denominator (coverage) share the same total value, which cancels, leaving percentages to work with.

  • Conditional probability answers "given that a loss has occurred, what is the probability it is of this size?"
    • The frequency is yet to be multiplied in
  • Unconditional probability answers "what is the probability that a loss of this size occurs at all?"

    • Unconditional = Conditional probability \(\times\) Frequency
  • 50% ITV
  • Replacement cost: $200,000
  • Loss freq: 3%

Spreadsheet formulas (reading the cells as: B6 = replacement cost, B7 = frequency, C11:C12 = probabilities of the partial-loss sizes, D11:D12 = losses as a % of value, C19 = expected severity):

Expected Severity `=B6*(SUMPRODUCT(C11:C12,D11:D12)+(1-SUM(C11:C12))*0.5)` — partial losses, plus the remaining probability mass paying the 50% coverage cap
Pure Prem rate per $100 `=(C19*B7)/(B6*0.5/100)` — expected loss over coverage (50% of replacement cost), per $100
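A Python rendering of the same logic under those assumed cell meanings (the partial-loss probabilities and sizes are hypothetical):

```python
replacement_cost = 200_000     # B6
frequency = 0.03               # B7
itv = 0.50                     # coverage purchased = 50% of replacement cost
coverage = itv * replacement_cost

# Hypothetical partial-loss distribution (the C11:C12 / D11:D12 cells):
probs = [0.60, 0.25]           # probability a loss is of each partial size
pct_of_value = [0.10, 0.30]    # loss as a % of replacement cost
p_capped = 1 - sum(probs)      # remaining losses reach (or exceed) the cap

# Expected severity of the payment, capped at the coverage amount.
expected_severity = replacement_cost * (
    sum(p * pct for p, pct in zip(probs, pct_of_value)) + p_capped * itv
)

# Pure premium rate per $100 of coverage.
rate_per_100 = (expected_severity * frequency) / (coverage / 100)
print(expected_severity, rate_per_100)
```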

Example

\(\text{Claim Sev.} \sim U(0,\text{Replacement Cost})\)

The real crux: compare H2 (home 2) and H3.
- H2 has a replacement cost of 200k and has purchased coverage of 200k
- Total expected severity = $100k
- H3 has a replacement cost of 250k and has purchased coverage of 200k
- Total expected severity = $120k (see the derivation after this list)
- Now, look at the expected loss per $1,000 of coverage, with a frequency of 1% for both:
- H2: \(\dfrac{1\% \times 100{,}000}{200{,}000 / 1{,}000} = \dfrac{1{,}000}{200}= 5\)
- H3: \(\dfrac{1\% \times 120{,}000}{200{,}000 / 1{,}000} = \dfrac{1{,}200}{200}= 6\)
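The $120k figure follows from capping H3's uniform severity at the coverage amount:

\[ E[\min(X, 200{,}000)] = \int_0^{200{,}000} \frac{x}{250{,}000}\,dx + 200{,}000 \times P(X > 200{,}000) = 80{,}000 + 200{,}000 \times 0.2 = 120{,}000 \]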

On paper, we would only know that the coverage amount is 200k. But the real difference comes from the fact that the claim severity distribution is based on the replacement costs, which differ between the two houses. As such, we should account for the fact that a house has not been covered to its full value.

In general, the indicated rate per $1,000 of coverage will be higher for underinsured homes as long as partial losses are possible.¹

Solving:

  • Instead of severities in dollar amounts, we may be given losses as a % of value; sumproduct to get the average % loss, multiply by frequency, and divide by \(\dfrac{ITV}{1{,}000}\) to get the pure premium rate per $1,000 of coverage.

Theoretical answers:

  • The rate per unit of coverage for an underinsured house would be higher, since losses are more likely to reach the coverage amount (the replacement cost exceeds the coverage purchased).

ITV Initiatives

  • Guaranteed Replacement Costs
  • Coinsurance

Coinsurance

\(ITV = \dfrac{\text{Insurance}}{\text{Value}}\)

  • The denominator is set by a threshold: e.g., an 80% coinsurance requirement, meaning we can insure to 80% of the replacement cost and not incur any penalty.
  • Insurance is the coverage amount purchased, which can exceed 80% of replacement cost; thus for correctly insured houses the ratio would be \(\geq 1\)

Apportionment (penalty) factor, \(a = \min(ITV,1) = \min\left(\dfrac{\text{Coverage Amt.}}{\text{Req. Coinsurance}\times \text{Repl. Cost}},\ 1\right)\)

Indemnity payment (BEFORE APPLICATION OF DEDUCTIBLE),

\[ I = \min(a \times \text{Loss Amt}, \text{Coverage Amt}) \]
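A minimal sketch with hypothetical numbers: an 80% coinsurance requirement, a $100,000 replacement cost, and $60,000 of coverage purchased:

```python
def indemnity(loss, coverage, replacement_cost, req_coinsurance=0.80):
    """Indemnity payment (before deductible) under a coinsurance clause."""
    # Apportionment factor: penalizes insuring below the required level.
    a = min(coverage / (req_coinsurance * replacement_cost), 1.0)
    return min(a * loss, coverage)

# a = 60,000 / (0.8 * 100,000) = 0.75, so a $40,000 loss pays only $30,000.
print(indemnity(40_000, 60_000, 100_000))  # 30000.0
```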

¹ Not all losses are total losses.