
Looking under the hood: has anyone validated the numbers you rely on in making decisions?

The senior underwriter was obviously nervous as he shook my hand and motioned me to a chair. "I'm told you're pretty good with computers," he began. "Do you know anything about FORTRAN?"

"Sure," I replied. "Back in graduate school I learned FORTRAN programming and became pretty good at it. Why?"

"A couple of years ago, in 1981, we hired a summer intern to help us out with pricing, which involves a lot of tedious calculations. Turned out he knew a lot about computers, and wrote us a FORTRAN program that was really slick. Of course, we didn't just take his results on faith. We checked them against our hand calculations on dozens of underwriting submissions, and his results agreed with ours every time."

"Terrific," I interjected. "It must have saved you a lot of time."

"It did," he agreed. "We got kind of lazy, and started using his program all the time, so that we hardly ever did hand calculations any more. Some of our junior people probably don't even know how; they just rely on the program. But now I'm wondering whether using his program was really a good idea after all."

"Why?" I asked. "What's the problem?"

"Our business is pretty competitive. Other firms usually come up with prices close to ours. When we first started using his program, we'd typically win some accounts and lose others, as usual. But lately we're winning almost all the time. That bothered me, so I decided to check the prices from his program against hand calculations, like we'd done before. Turns out his program now gives us prices that are too low. I can't figure out why, and I'm hoping that you can."

"I'll see what I can do," I responded. "I'll need his program and an example of your manual calculations."

Two days later I had the answer. Apparently the intern hadn't realized that interest rates could change, so his program hard-coded a 14% assumed interest rate. That rate was no longer realistic, and the heavier discounting it implied now produced prices too low relative to competitors. We fixed the problem by making the assumed interest rate an input variable, but we couldn't undo the damage already done.
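The column doesn't show the intern's FORTRAN, but the fix it describes, turning a hard-coded assumption into an input, can be sketched in a few lines of Python. The loss figures and rates below are hypothetical, chosen only to show why a stale 14% rate yields a price that is too low:

```python
def price_policy(expected_losses, discount_rate):
    """Price a policy as the present value of its expected future losses.

    expected_losses: hypothetical payouts by year (year 1, year 2, ...)
    discount_rate:   now an input, not a constant buried in the code
    """
    return sum(loss / (1 + discount_rate) ** (year + 1)
               for year, loss in enumerate(expected_losses))

losses = [100.0, 100.0, 100.0]  # illustrative payouts over three years

# The intern's program, in effect, always did this:
stale_price = price_policy(losses, 0.14)    # rates had since fallen

# With the rate as an input, the price tracks current conditions:
current_price = price_policy(losses, 0.06)

# Discounting at a too-high 14% shrinks future losses more,
# so the stale price undercuts the corrected one.
assert stale_price < current_price
```

The design point is the same one the fix embodied: any assumption that can change over time belongs in the program's inputs, where users can see it and update it, not frozen into the source code.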

I was reminded of this incident a few years later at another firm, when the head of portfolio management asked me to take on a similar assignment. "We have this investment portfolio whose performance has simply been magnificent," he began. "Trouble is, one client keeps asking how we did it, and we simply don't know." He described all the analyses they had done, to no avail. "We're stumped, and we're hoping you can figure out what's going on."

Days later, I too was stumped. I had scrutinized the portfolio and its performance, and found nothing amiss. There was only one thing left to check, and I couldn't imagine its being wrong.

The next morning I met with the programmer who had implemented the performance calculations. "Your numbers make no sense," I said. "Did you use the pricing formulas that the bond department sent you?"

"No," he replied. "Their formulas were really long and they were in a hurry, so I shortened them."

I couldn't believe my ears! "You shortened them?"

"Yeah," he responded. "This was a rush job."

As it turned out, the effect of his changes was to make the portfolio's calculated performance considerably higher than its actual performance. Mystery solved.
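The column never says what the programmer's shortcut actually was, but a toy example shows how easily a "shortened" pricing formula can bias valuations, and hence measured performance, upward. Both functions and all the parameters below are hypothetical illustrations, not the firm's actual formulas:

```python
def bond_price(face, coupon_rate, yield_rate, years):
    """Standard price: each coupon and the face value discounted with compounding."""
    coupon = face * coupon_rate
    pv_coupons = sum(coupon / (1 + yield_rate) ** t
                     for t in range(1, years + 1))
    return pv_coupons + face / (1 + yield_rate) ** years

def shortened_bond_price(face, coupon_rate, yield_rate, years):
    """A hypothetical 'shortcut': lump all cash flows together and
    discount them linearly, skipping compounding entirely."""
    coupon = face * coupon_rate
    return (coupon * years + face) / (1 + yield_rate * years)

# A 10-year, 5% coupon bond valued at an 8% yield:
exact    = bond_price(100.0, 0.05, 0.08, 10)           # about 79.87
shortcut = shortened_bond_price(100.0, 0.05, 0.08, 10)  # about 83.33

# The shortcut overstates the bond's value, so any performance
# figure computed from it is inflated in the same direction.
assert shortcut > exact
```

A discrepancy of a few percent per bond, compounded across a portfolio and across reporting periods, is exactly the kind of quiet, systematic bias that only a check against independent calculations will catch.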

These two incidents--and many others like them--are significant for three reasons:

* First, all of us depend heavily on computer programs and databases for the numbers we use to make decisions.

* Second, decisions based on faulty data or wrong numbers can have devastating consequences. Both incidents just described reduced profits, whether through underpricing or through the loss of client fees. Notably, both situations could have been far worse.

* Third, both problems could have been prevented by hiring a third party to look "under the hood" and check the programs before they were used. This is seldom done since the cost typically seems excessive. So in practice, problems are discovered, if at all, only when errors become highly visible.

The problem I've described is becoming more severe. Encouraged by rating agencies, many firms have purchased or created sophisticated software to guide Enterprise Risk Management decisions, but have failed to look "under the hood" and validate the numbers it produces. Does this describe your firm?

William H. Panning, a Best's Review columnist, is executive vice president at Willis Re Inc. He can be reached at bill.panning@willis.com.
COPYRIGHT 2007 A.M. Best Company, Inc.
No portion of this article can be reproduced without the express written permission from the copyright holder.
Copyright 2007, Gale Group. All rights reserved. Gale Group is a Thomson Corporation Company.

Title Annotation: Property/Casualty
Author: Panning, William H.
Publication: Best's Review
Date: Jan 1, 2007
Words: 752