I recently read an article in the Wall Street Journal about a just-released book by David Freedman called "Wrong: Why Experts Keep Failing Us—And How to Know When Not to Trust Them". It reminded me of some of the issues we have in CI. Has anyone read the book? Do we as CI professionals try to make out that we are experts when we deliver our reports and/or briefings?
While I have not yet acquired the book, I do intend to visit the bookstore and at least peruse it. That said, there has been a tremendous amount of chatter of late about one of the biggest issues of our day (the housing/financial crisis) and why the experts failed to predict it, why they were all wrong, wrong, wrong.
So with that in mind: while Freedman's central thesis (at least as far as I can discern from the synopsis review) appears to focus on errors in measurement and the idiocy of crowds as they relate to being wrong, the underlying issue in the biggest item on the plate at present, the financial crisis, was not an error in measurement. There is ample evidence that it was predicted quite presciently within the very firms that created the mess... and the external FP analysts also knew, but of course could not write about it, because they are part of the machine, and you don't bite the hand that feeds you.
So anyway, as it relates to the supposed experts being WRONG, I think there is something far more insidious going on, and I wonder if Freedman touches on it. I like to call it the "let's play stupid/plausible deniability" principle. The point is, the experts quietly get put out to pasture while the C-suite folks deny they ever saw or heard the warnings. Give the BP thing time: I have no doubt some of their risk management folks will come out and say they told them so, the big mucks will deny they were ever warned, and it will be the same thing all over again. Hmm, I think there is something else entirely going on with regard to the big issues of our day and the experts being wrong all the time... don't you?
My concern is that "experts" are leading the way in every arena of endeavour: business, social, political, technological, medical (to name a few). C-suites will always look to plausible deniability when it comes to business expertise, yet en masse we still defer to experts in every endeavour of our lives, and I wonder how this impacts how we deliver CI. Are we seen by our intelligence customers as the "experts"? Or is that the role/position we seek to attain in the mind of the C-suite when we deliver our CI briefings and reports? Is this the role we play in our organisations? And when we are wrong, are we punished, or is it OK to fail?
I must admit I make it clear that while I am experienced with the CI process (I have been doing it for over 18 years), my report is not all-knowing but the best available insight at this point in time, given what we were able to uncover. All our clients know we undertake every assignment on a best-endeavour basis.
I have, however, seen people purport to be experts when we interview them, yet on numerous occasions we have found they did not know some of the information we uncovered.
So maybe the bigger question is: are we becoming more cynical and/or more stupid as a whole, relying on "experts" who really are not experts and who could be wrong? What a scary thought for the future. Maybe in today's world there is no room for experts?
Wow, you covered a lot of ground in your commentary above. So, OK, to answer your questions, keeping in mind this is just my perspective based on my experience:
1) Are we seen by our intelligence customers as "experts?"
MN: Well, yes, at least within the organizations I have worked for. The CI folks were regarded as internal experts in certain business categories, and some on certain competitors, and they were consulted because they followed things over time and were able to bring depth and perspective to the table for various cross-organizational and cross-functional audiences.
2) Is the expert role one we seek as CI practitioners when we present to the C level? Is this the role we play?
MN: Well, if it isn't, then again I would think we have a problem. Frankly, I don't think the CEO wants to talk to someone about the competition if they don't bring depth to the table that the average person doesn't possess.
3) When we are wrong are we punished? Or, is it OK to fail?
MN: Well, punishment will certainly come if a strategic analyst isn't accurate the vast majority of the time; the punishment would be that the person simply wouldn't be left in the role for long.
As for failure on a particular mission by analysts who have on the whole been successful, I think all factors have to be considered, to the degree possible, as to why the failure occurred. It's got to be OK to fail sometimes, e.g., if the resources or the money were not there, or if there was no legal way to procure certain intel.
Now, if the analyst interpreted the signals wrong, well, that's another matter. I mean, what was the reason for the failure? There does have to be a reasonable expectation, and the folks asking for certain intel have to understand the constraints at play, and that, as Arik says, it isn't always possible to have "perfect knowledge". But in my experience, most of the time, if you get the top-level stuff right, that works.
As to your last set of questions: are we becoming more stupid and relying on experts who aren't? Uh, yeah, I think so, unfortunately...
I think the root cause of the trouble is the deification of expertise by a society beset with uncertainties it cannot seem to understand. We want a priesthood vested with incantations beyond our novice understanding to read the entrails and consult their oracles to tell us what's going to happen next. Unfortunately, most such forecasts are more palliative than predictive - they are designed to make us feel better about our decisions, not to actually surface the risks which accompany them.
We have created whole professions invested in being right, where all the rewards are aligned around the elimination of risk, turning heuristics for decision-making into algorithms presumed to be more reliable at predicting outcomes. What this inevitably leads to is an abandonment of the original mysteries that drove our inquiries in the first place: we fail to reexamine the original endeavor because we believe we understand the issues implicitly.
I've been spending much of my summer vacation "in the lab" (so to speak) applying abductive logic methods to a set of hypothetical business problems as a means for intelligence tool selection. My takeaway so far is that what we do is about understanding our world not as it is nor as it should be, but as it might become.
That's the underlying failure of expertise. Experts find it impossible - indeed, are paid to be certain it's impossible - that any future they haven't already seen before can ever happen. The expert relies on what they know. We must inhabit the vast gulf between what they think they know and what they actually know if we are to disabuse them of their false assumptions before it's too late to save the institutions they govern from similar disasters.
There are, of course, multiple ways to respond. We can certainly talk about "experts" as if that were a group we did not belong to. There is a fascination with seeing ostensibly smart people proven wrong on a fairly regular basis. It kind of makes me feel better about my own mistakes.
That leads me to the other way to view the issue. That is (as you suggest), how do we hold ourselves out as experts to our clients? Is there a good and useful way to do this while protecting them from our inevitable mistakes and their likely misunderstandings?
I would contend that there is a way. My sense is there is a healthy and productive way to balance humility and expertise in service to our clients. Indeed, prompted by your question, I have written a blog entry to expand on this thought.
An excellent blog, as usual. And you are right: there needs to be a balance of humility with knowledge.
I would like to suggest that depending on the KIT or KIQ, it is our job to become the "expert" in terms of the client relationship so as to help in their deliberations. Maybe the word should not be "expert" but "guide"?
I believe what Wrong says is -- full disclosure, I haven't actually read it either -- that it's OK to be wrong; in fact, it is more than OK, it is the engine of our collective intellectual development. But we over-glorify being "right", since that's what wins grant proposals, gets headlines… and, to build on Babette's terrific analogy, wins kudos from management.
I see this as an extension of our culture of celebrity -- the celebrity of the "correct idea".
But if you study intellectual history, you'll know that "correct ideas" are themselves trend-driven and socially reinforced. In 1491, "everybody knew" that the earth was flat. See Kuhn's Structure of Scientific Revolutions (required reading for any serious study of organizational knowledge).
Most of us know from experience that one of CI's biggest problems is what I call the "Cassandra complex" -- that CI people are often pigeon-holed as nay-sayers, gloom-and-doomers, Chicken Littles, non-team players, Cassandras, Debbie Downers, and so on. And that much of CI training is devoted to how to “sell your findings” -- which often amounts to sugar coating or spinning them so they will be palatable to management.
We may in fact have a cultural bias toward the “glorification of rightness” – which ironically has the unintended consequence of making us organizationally more stupid. And this may be culture-specific.
Others have pointed out the "happy talk" effect -- that management wants to hear good news that reinforces their ideas and/or decisions that have already been made. This is certainly true of most US companies, where the unspoken rule is "no surprises, especially bad ones".
But there's evidence that this is not universal. Some organizational cultures even systematically encourage the airing of bad news, because it often indicates actual problems that actually need fixing…the sooner the better. I’m told that at Toyota, for example, one is encouraged to deliver bad news to one's superior. As a result, bad news goes up the chain quickly, and can be acted on before it spins out of control. Their recent acceleration problem is a good example of how this worked in their favor. It was undeniably bad news, but has been dealt with expeditiously by the company.
On the other hand, BP’s Deepwater Horizon reaction is an example of the opposite. As I understand it, on-site information after the explosion at the rig was not transmitted quickly, and the response by management was both late and ham-fisted. Consequently, significant damage -- possibly irreparable -- has been done to the company’s reputation and market value.
If you're a CI person in a happy-talk culture, you'll spend every day swimming upstream -- and get precious little thanks for it. You may become marginalized, or even fired (as happened at the late Lehman Brothers, where the early-warning messenger of financial doom was shot, in a professional sense).
Thanks, Babette, you’ve inspired me to read Wrong, as well as The Rational Optimist, another book referred to in the Journal piece.
Allow me to share a 'view' that reflects my understanding of your question and a potential answer.
"(...) Thus there are two kinds of experts. A soufflé chef really is an expert and can be trusted. An economist is a pseudo-expert (...) Don’t focus on probability. Focus on consequences. (...)".
I am also interested in the topic, not specifically as it applies to CI practitioners as experts, but in the challenges of IDing SMEs for HUMINT. One of the downsides of the web is that it enables anyone to position themselves to look more knowledgeable/connected than they actually are.
Several comments really resonated for me in this thread so far:
1) Guide or Sherpa as Role. Sometimes I feel like one of the greatest values I bring is not to have all the answers, at least not at first, but to help ID/raise issues & clarify them.
2) Trend-Driven Nature of Ideas.
3) Underlying Failure of Experts. The catch-22 for me here is that how invested the expert is in the rear-view mirror can limit their ability to course-correct or factor in possibilities that challenge their assumptions. I'm not taking a side on my example, but Stephen Hawking is a case in point. Not CI, science -- but the problem is similar.
There's also the whole truth/facts component. It seems like reality is becoming increasingly subjective...
From most of the comments, I think we seem to be saying that as CI professionals, all we can set out to say is that we GUIDE our clients (whether internal or external) in their thinking and decision making. We hopefully challenge their mental status quo to enable them to discover different or alternative solutions. Would you agree with this 'new' positioning? Do you think there is a difference between a guide and an expert (i.e., a CI practitioner versus a soufflé chef)?
If we agree we are in fact guides (and not experts per se), then I see we have two key responsibilities. First, we need to acknowledge that we don't know everything, but we DO know a process that will enable us to help our clients do their jobs more effectively. Second, we need to keep challenging our own status quo, whether through reading, attending seminars, or participating in forums like this one, to name a few.
If you positioned yourself as a "guide" - do you agree with these priorities? If not, what would be your priorities?