Competitive Intelligence

Tactical, Operational & Strategic Analysis of Markets, Competitors & Industries

Are we at the limits of quantitative forecasting?

A polemic for us competitive intelligence types...

The financial blog ZeroHedge points out that JP Morgan has now set aside a $3 billion reserve to hedge against the potentially faulty judgments of its quantitative analysts.

For those of us in the world of largely qualitative analysis, this is a fairly unprecedented move, one that cuts against the grain of most schools of modern managerial thought. After all, wouldn't you say that numbers, spreadsheets, ATTRACTIVE PIE CHARTS, no matter how fallacious, still carry more weight with most executives than incisive, effective, insightful analysis? Or, in the words of our dear colleague August Jackson, fake numbers will trump real insight almost every time.

Surely, as intelligence mavens, we're not against hard numbers, but we should be able to analyze the assumptions behind those numbers. Speaking of which, the ZeroHedge article pulls a shocking statistic out of the history of the subprime debacle. Check out what the quantitative models predicted subprime losses would be, as opposed to the actual losses, which turned out to be roughly a factor of 100 greater. Holy cats...

Given all the fake numbers in forecasts lately, what do you think the future holds for quantitative versus qualitative analysis?


Replies to This Discussion

That was a key learning in MBA school. You can use numbers to tell your story; they are easy enough to manipulate, just like words. I think forecasting that uses the combination of both is here to stay! You just need good analysts who can read between the lines, in both the numbers and the words. Unfortunately, we can't do anything about deception. We can only hope that those who lie get exposed and punished accordingly.
Hi Eric,

I think it is the exaggeration that has caused reactions, not the use of structure and some math by itself. When the American banking system nearly collapsed in 2007-09, it was largely due to an overreliance on calculations as a steering tool. All banks were watching the Value at Risk (VaR) figure. The only major bank that would have survived without aid, JP Morgan Chase, was also the only bank that relied more on common-sense rules of thumb: do not keep too much of anything, and be sure that what you do have will bring some future income. The over-mathematization of the banking and finance sector was partly encouraged by my fellow economists (many of whom now write books about why what they previously supported had to be wrong) and partly by the industries themselves, who saw profit opportunities in obscuring their own profession. The Black-Scholes model in option pricing is a good example. I remember warning against this in a paper written for the Milan Stock Exchange as early as 1993.
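For reference, the Black-Scholes price of a European call is a compact formula whose apparent precision rests entirely on assumptions such as constant volatility and log-normally distributed returns. A minimal Python sketch, with purely illustrative inputs:

import math

def black_scholes_call(spot, strike, rate, volatility, maturity):
    # Black-Scholes price of a European call option; inputs are annualized
    # and 'maturity' is in years. The formula assumes constant volatility and
    # log-normal returns, exactly the kind of assumption questioned above.
    def norm_cdf(x):
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    d1 = (math.log(spot / strike) + (rate + 0.5 * volatility ** 2) * maturity) / (
        volatility * math.sqrt(maturity))
    d2 = d1 - volatility * math.sqrt(maturity)
    return spot * norm_cdf(d1) - strike * math.exp(-rate * maturity) * norm_cdf(d2)

# Illustrative numbers only: a one-year at-the-money call at 20% volatility.
print(round(black_scholes_call(spot=100, strike=100, rate=0.02, volatility=0.20, maturity=1.0), 2))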

Klaus
Absolutely, Klaus, it's not a question of getting rid of numbers entirely. That would most certainly be a return to pure management by intuition, a practice that would no doubt result in a reinforcement of superstition and poorly examined assumptions - worse decisions and autocracy would abound.

It's funny, I was speaking with the chair of medicine at a hospital the other day who reminded me that the era of evidence-based medicine is only five to ten years old. That is to say, doctors were previously making decisions based more on anecdotes shared among colleagues than on global meta-analyses of data, and the practice of letting numbers come first is barely older than the new millennium!

The point is not to get rid of numbers, which would obviously make intelligence less valuable, bridges less stable, and accounting almost impossible. That said, we can take the financial crisis as an example of how purely quantitative analysis can often serve as false comfort for risky decisions, convincing us that things are more certain than they really are.
I think that great term of yours, "over-mathematization", makes a good point, Klaus. Numbers, being concrete and compelling almost by definition, create ludic fallacies: they tempt us to believe we can fully capture statistically what are really stochastically determined phenomena. For more philosophical arguments along these lines, see N. N. Taleb. Intelligence inhabits the gap between induction and deduction, that is, abduction: "the art of the possible", or what could be, rather than interpreting what is assumed to be true.
Hi Eric

The use of mathematical techniques is widespread in the management of investment funds.
As Klaus said, in Europe it is very normal to use VaR techniques for managing investment funds; these are used to determine the maximum variation (3 sigma) in the revaluation of the fund's shares. In other cases a very simple analysis uses an ANOVA test, while in others complex techniques like genetic algorithms or neural networks are used.
For example, a group of very interesting investment funds based on numerical techniques is managed by the French company AXA (AXA quant); they use very complex algorithms with excellent results.
In general, the use of these methods only serves to classify which stocks a fund could buy and which it should buy, not to manage the fund as a whole.
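To make the 3-sigma idea concrete, here is a minimal parametric sketch in Python; the return series is hypothetical, and the normality assumption behind a 3-sigma threshold is precisely the kind of simplification being questioned in this thread:

import statistics

def three_sigma_loss_threshold(daily_returns, sigmas=3.0):
    # Parametric "maximum variation" estimate: mean daily return minus three
    # standard deviations. Assumes roughly normal returns, which real fund
    # returns often are not.
    mu = statistics.mean(daily_returns)
    sigma = statistics.stdev(daily_returns)
    return mu - sigmas * sigma

# Purely illustrative daily returns for a hypothetical fund.
returns = [0.004, -0.002, 0.001, 0.003, -0.005, 0.002, -0.001, 0.000, 0.002, -0.003]
print(f"3-sigma daily loss threshold: {three_sigma_loss_threshold(returns):.2%}")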
Yet we cannot be replaced by computers!!
Fortunately!!

In fact, in my opinion, the best approach should combine both qualitative and quantitative analysis. And the trick to obtaining relevant results and a reliable analysis rests on three steps:

- choosing the best numbers to start with,
- deciding how to operate on them, and, later on,
- reading the results properly.

Wise human intervention in these three phases will determine the success of the process. Failing in any one of them will drive us to a wrong conclusion, even though the math operations were perfect.

Interesting topic Eric. Thanks. See you in Washington.
Blame not the hammer for its misuse by the carpenter - blame the carpenter.

As has been stated, quantitative results in general, and visualized results in particular, are exceedingly influential in human decision-making and play into a number of our built-in wetware biases and heuristics.

Consequently, it is morally and ethically incumbent upon all analysts to explain their quantitative results in context and to explain the methodology and limitations of the results so that decision-makers can self-determine the information value of the results for their own purposes. Basically, all our data is a statistic with some associated uncertainty, and so for our results to be fully believable and understandable, we need to provide an understanding of that uncertainty, whether with confidence intervals, best-case/expected-case/worst-case scenarios, etc.
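A minimal sketch of what "providing the uncertainty alongside the result" can look like in practice; the figures and the normal-approximation 95% interval are purely illustrative:

import statistics

def mean_with_confidence(samples, z=1.96):
    # Sample mean with an approximate 95% confidence interval using a normal
    # approximation; with few samples a t-based interval would be more honest,
    # which is itself a limitation worth stating to the decision-maker.
    mean = statistics.mean(samples)
    half_width = z * statistics.stdev(samples) / len(samples) ** 0.5
    return mean, mean - half_width, mean + half_width

# Hypothetical market-share estimates (%) from several independent sources.
estimates = [12.1, 13.4, 11.8, 12.9, 12.5, 13.1]
mean, low, high = mean_with_confidence(estimates)
print(f"Best estimate: {mean:.1f}% (95% CI roughly {low:.1f}% to {high:.1f}%)")
print(f"Worst / expected / best case: {min(estimates)} / {mean:.1f} / {max(estimates)}")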

Decision-makers actually prefer to be just that: people who get to make decisions about the options and information they are provided. By providing them the context of the results along with their limitations and sensitivities, we empower them to make choices and then perhaps take actions based upon those choices.

The analyses, the data, and the results continue to provide the additional information value (along with associated uncertainties) that they have in the past. Perhaps some of the exogenous variables have peaked (while others have waned), thus limiting the predictive power of some models and techniques. But if these had been properly presented in terms of their limitations and uncertainties, then the impact of large-scale changes in the external environment would certainly have been understood by audiences.

As Intelligence "carpenters", I believe we have failed to properly apply the quantitative hammer by becoming enamored with the significant impact such approaches had on convincing decision-makers of our narratives -- and so we emphasized the results and diminished or failed to explain the limitations. I'd say it's a failure to properly understand our role in the decision-making ecosystem more than a failure of techniques, methods, or data.

But, as usual, that's probably just me.

Max Nelson
Well put Max - thanks - hey, you're in DC this week?
Spot on Max! You are not alone! So, that makes at least two of us...

Who knows, maybe there are...

Question: how willing is the decision-maker to be fully aware of the 'full picture'? I believe you are assuming they are willing, but then again, "that's probably just me" :-)

Miguel DF
I know that I am a little late on posting to this thread, but because I come from the accounting/finance function, I can't help but comment on this.

Everyone in business wants to know the future. Financial analysts are charged with predicting future firm performance based on a variety of inputs that ideally would incorporate strategic direction derived from solid competitive analysis. It is the qualitative piece that doesn't get communicated, or doesn't get interpreted correctly into the financial picture. This is one thing that can cause an organization to miss its projections: scenario analysis was never built into the numbers. The sales forecast is off, which then throws off the rest of the forecast. If you have accountants, who mostly focus on preserving the reliability of past and present numbers, conducting financial projections, it is likely that the future forecasts become a mere extrapolation of the past. Projections are only as good as the inputs, and the inputs should not originate in the accounting department, because that is not where the numbers are made.
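As a toy illustration of the difference (all figures hypothetical), compare a naive extrapolation of past sales with a scenario-weighted projection that makes room for qualitative inputs:

# Hypothetical quarterly sales history.
history = [100.0, 104.0, 108.0, 112.0]

# Naive extrapolation: assume the recent average growth simply continues.
avg_growth = (history[-1] - history[0]) / (len(history) - 1)
naive_forecast = history[-1] + avg_growth

# Scenario-weighted forecast: qualitative judgments (a competitor launch,
# a pending contract) expressed as scenarios with assumed probabilities.
scenarios = {
    "competitor launches rival product": (0.3, 102.0),
    "status quo": (0.5, 116.0),
    "we win the pending contract": (0.2, 128.0),
}
weighted_forecast = sum(p * value for p, value in scenarios.values())

print(f"Naive extrapolation:      {naive_forecast:.1f}")
print(f"Scenario-weighted figure: {weighted_forecast:.1f}")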

Most forecasts published by companies are only good for a few things when it comes to competitive analysis. They tell you where a company thinks it is going, or at the very least, where it hopes everyone else will believe it is going. Having said this, you would think that competitive intelligence would be a sought-after partner when it comes to gathering good inputs. But overreliance on the quantitative side and the historical financials creates a false sense of comfort in numbers that are thinner than they appear.

Competitive intelligence practitioners who can take qualitative information and make it measurable and actionable will be better suited to communicate with people who are accustomed to seeing their information in the form of a dashboard. You need both, and I will be the first to admit that the qualitative side is the most difficult to read and communicate, but it is also the most critical in creating value-added analysis.
Hi Tiauna,

I cannot help but comment on this. Before I became an academic, 10 years ago, I worked as an auditor at KPMG, and before that as an accountant. It struck me then that managers often tell accountants what kind of quarterly results they want, to fit a certain story, a story that often suits the managers' careers and their incentive plans. This does not always give you a good idea of the company's real performance. E.g., you take costs this year because your profit is high, or you wait because it is low, etc. Most of these maneuvers are within the legal limit, so engineering-the-books is more relevant than cooking-the-books. Still, an analyst who does not understand how accounting figures are made will fall for the story.

That is why relying on income statements and balance sheets will not do by itself. You need to know the business, that is, you need to know the macro factors. I believe it is one of the factors that has made Warren Buffett one of the world's most successful investors. It is also an important lesson for the field of competitive intelligence.

Klaus
Klaus, did you just place understanding of strategic megatrends over a fetish for semi-fictional quantitative accounting statements? Only an accountant can get away with this - when I say the same thing, the quant jocks just assume I was too lazy to take nine years of calculus at university.

I hope you won't be insulted, but I may hug you if I see you in America. ;-)
