Friday, February 25, 2011

Computerless Risk

by Rick Nason, PhD, CFA

Partner, RSD Solutions Inc.

www.rsdsolutions.com

info@rsdsolutions.com

 

I was teaching a course in the Bahamas recently when the power went out.  It does not happen that often, but it happens.  In fact, the power is more likely to go out when I am at home in Halifax – thank you, Mr. Winter Storm. 

The issue with the power is that I was giving a lecture on model building in Excel.  How do you teach computer model building if there is no power for your computer (and your laptop battery is dead and the overhead projector will not project)?  And yes, for those of you who are curious, we had already covered the theory. 

It got me to thinking – how well would your risk management system work without computers?  How well would you be able to model your risk exposures?  More importantly, how would you understand the trends in your risk exposure? 

Now to go to the next level of questions, how well do the people in your organization understand the risk model data that your system currently spits out?  Do they know it well enough to “game it”?  Do they know it well enough to know where its weak points are?  Do they know it well enough to know when it will provide non-intuitive or incorrect results?  Do people in your organization still have intuition about risk exposures – or is it all computer driven? 

I fully understand that your organization likely has redundant systems, and so an isolated computer glitch will not change things.  But why not also have redundant people systems – that is, people who can back up the computer if it is down or producing irrational results?  Why not simply have the best computer of all working for your risk department – the human brain? 

By the way – our class went on quite well even without power.  Amazing thing, the human brain.  No power cord!

Thursday, February 24, 2011

The (Forgetful) Dismal Scientists or “Yesterday’s logic is illogical today”

by Michael Arbow MBA

Partner, RSD Solutions Inc.

www.rsdsolutions.com

info@rsdsolutions.com

 

As the US Federal Reserve Chairman’s own version of the QE2 sets sail, commodity prices continue their upward move. The upward march in oil was initially caused by the increase in demand for product as world oil consumption moved from about 88 million barrels/day last year towards an expected 90 million this year. More recently, Brent crude has been pushed above $100 because of increased uncertainty about continued delivery from the Middle East.

Now those of us with long memories, say 12 months, will recall the dismal scientists, as economists are affectionately (?) known, predicting that the next recession would start when oil traded above $100 USD/bbl. Interestingly, the Street has not renewed its talk of this outcome, focusing instead on the renewed and continued strength of the US economy. It is strange that all the arguments Street economists gave about the effects of triple-digit oil and $4.00/gallon gasoline 12 short months ago no longer gain attention.

This blog raises two points:

  1. It takes a paradigm shift for yesterday’s logic to become illogical or to have reduced impact. Effective risk management must be dynamic, but not at the expense of forgetting the past.
  2. It looks like price volatility will continue in 2011, and downside risk is starting to appear. For corporate risk managers: what are you doing now that the likelihood of downside economic risk is increasing, or are you seeking safety in the crowd?

In fairness, I caught the following (taken from the Globe and Mail) just prior to posting:

"The International Energy Agency’s (IEA) executive director Nobuo Tanaka said prices above $100 per barrel for the rest of the year could drag the global economy back into a repeat of the 2008 economic crisis."

Wednesday, February 23, 2011

The Flawed Risk Question

by Rick Nason, PhD, CFA

Partner, RSD Solutions Inc.

www.rsdsolutions.com

info@rsdsolutions.com

 

I am currently teaching an Enterprise Risk Management course to senior people in the MBA Financial Services program at Dalhousie.  I recently got back one of the first sets of assignments and was marking them on a plane ride to visit one of RSD’s clients. 

The marking was going quite well until I came to one student’s answer that gave me pause.  The student started their response to the question by stating – “This question is flawed.”  Interesting, I thought.  All of the other students had answered the question without any such trouble.  The point is that the student was correct.  I had asked a quite reasonable academic question, but it was flawed in that it made some implicit assumptions.  Although it was a good question, it was not the right question.  It was not an impactful question.  It was not a question that would challenge the student to the proper degree.  It was not a question that would lead to the core of the issue.  The question was flawed. 

How often do we ask a flawed question?  In risk, how often do we ask a flawed question that produces correct but flawed answers as a result of the flaw in the question? 

The student got a good mark for their response.

Tuesday, February 22, 2011

Turing Risk Test

by Rick Nason, PhD, CFA

Partner, RSD Solutions Inc.

www.rsdsolutions.com

info@rsdsolutions.com

 

I am currently reading the book “The Man Who Invented the Computer”, by Jane Smiley. (http://tinyurl.com/4vo9oua)  I highly recommend it. It is a thrilling read – even if you are not a computer geek (or a geek in general). 

Reading the book reminded me of my early days in academia (the early 1980s), when we debated where computers were going to go. The big debate we had was about the Turing Test. The Turing test involves having someone simultaneously ask questions of a computer and of a human. If the questioner cannot tell the responses of the human from the responses of the computer, then the computer has “passed” the test. 

At the time the “test” was proposed, the thinking was that the computer would give the weaker responses and thus it was up to computer designers to develop more elaborate and sophisticated computers. 

The question I would like to propose is the Turing Risk Test. It goes like this – if you read a risk analysis of your organization produced by a computer, and a risk analysis of your organization produced by a person (or a team of people), would you be able to tell the difference? Which would be the more sophisticated analysis? Which would be the more useful analysis?
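
For readers who want to make the thought experiment concrete, here is a minimal sketch (in Python, with entirely hypothetical placeholder analyses) of how such a blind comparison could be run: the two write-ups are shown in random order with no labels, and the test simply records whether a reviewer can pick out the computer-generated one.

    import random

    def turing_risk_test(computer_analysis: str, human_analysis: str) -> bool:
        """Show two risk analyses in random order, unlabeled, and ask the
        reviewer to guess which one the computer produced.
        Returns True if the reviewer guesses correctly."""
        pair = [("computer", computer_analysis), ("human", human_analysis)]
        random.shuffle(pair)  # hide the source by randomizing the order

        for label, (_, text) in zip("AB", pair):
            print(f"--- Analysis {label} ---\n{text}\n")

        guess = input("Which analysis was computer-generated, A or B? ").strip().upper()
        actual = "A" if pair[0][0] == "computer" else "B"
        return guess == actual

    # Hypothetical usage: repeat the test across many analyses and reviewers;
    # if reviewers score no better than the 50% a coin flip would give,
    # the computer has "passed" the Turing Risk Test.
    if __name__ == "__main__":
        identified = turing_risk_test(
            "VaR is within approved limits; no action required.",  # placeholder
            "The numbers look fine, but the concentration in one "
            "counterparty makes me uneasy.",                       # placeholder
        )
        print("Reviewer identified the machine:", identified)

Run it over enough samples and the interesting question is not whether the machine passes, but which analysis you would rather act on.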

 

Monday, February 21, 2011

Studying Risk

by Rick Nason, PhD, CFA

Partner, RSD Solutions Inc

www.rsdsolutions.com

info@rsdsolutions.com 

 

As a professor I get to observe students and their studying habits all of the time.  One of the more interesting traits that comes across is the diligence with which students study the parts of the material that they already know, while almost completely ignoring the material that they do not know as well – or do not understand at all. 

Is this a human trait?  Do we naturally go to the aspects or practices that we know – over the aspects that we know we should know better?  Is this your risk department – sticking to the tools and techniques that were useful twenty years ago, but which admittedly are past their prime?  Is the unwillingness to learn new methodologies an inborn trait?  Is this smart?  Is this smart risk management?

Sunday, February 20, 2011

Pentagon Political Risk Model: $125 million. Reliability: Ummm,...

by Michael Arbow MBA

Partner, RSD Solutions Inc.

www.rsdsolutions.com

info@rsdsolutions.com

 

Over the past 4 decades the US Pentagon (Department of Defense) has spent untold millions attempting to build a predictive political risk model, the latest iteration of which cost around $125 million USD.  Despite the vast amounts of data gathered from world political experts and the complexity of the algorithms, the model failed to predict the recent events in Egypt.  Luckily, the direct impact on the US in this particular instance is minimal – for now. 

Rick Nason, a partner at RSD, has pointed out that risk can be simple, complicated, or complex.  When people and human emotions are involved, it seems that complex risk management wins the day.  Is your organization operating under a “tick the box” risk management system?  Alternatively, has your organization put its faith in algorithms? 

For a link to “Pentagon’s Prediction Software Didn’t Spot Egypt Unrest” from Wired Magazine click:

http://tinyurl.com/6jc3zzc