Competitive Intelligence

Tactical, Operational & Strategic Analysis of Markets, Competitors & Industries

Are We In a Rut, Or Just Reluctant to Share?

Over the last several months, on separate work projects, I have been looking closely at Goldman Sachs, McKinsey, and BCG and how they structure their work and serve their clients. All of these are stellar organizations that succeed by innovating, expanding the sophistication of their processes, and constantly keeping ahead of their clients (and competitors).

I was struck by the contrast with the CI discipline, where I fear that we are stuck in a rut. Go to any conference and you'll see presenters (myself included) dust off the decades-old intelligence cycle, Porter's Five Forces, etc. Is this the best that we can do? Where is the new thinking? Is anyone innovating out there? While I realize that good CI is a lot more than slick models and methodologies, these are indirect indicators of the level of thinking that goes on within a discipline.

In addition, most of these models are about how to "do" CI, and there seems to be little work done on how to elevate the discipline, build it into the DNA of an organization, or make it an integral part of a company's strategic thinking.

Of course, this type of work may indeed be going on, but practitioners are simply unwilling to share their secrets outside of their organizations. My guess, however, is that this represents only part of the issue, and that the problem runs deeper.

Any thoughts? Am I on-target here or off-base?

Replies to This Discussion

Bill - In a rut? Yes. All the fault of CI practitioners? No.

Why do BCG, Goldman Sachs, McKinsey, et al thrive and innovate? Because their clients pay them huge gobs of money to do so. These firms thrive on the deep relationships they build with their clients, the credibility and differentiated "product" they bring that enables them to charge huge fees, and by fostering the notion that their clients can't live without them. Do CI practitioners have similar relationships with their "clients" - the intelligence users in their companies? No.

Further integration with other corporate disciplines, new models, and ever-more-advanced analysis methodologies are only going to get us part of the way there. Until CI is viewed by executives, strategic planners, senior managers, and other users as a must-have, can't-live-without contribution to corporate management (OK, most likely without the gobs of money), whatever innovation we require of ourselves will be bottom-up efforts by the CI practitioner, consultant, and academic community. We will try all sorts of new tools and methods to get out of the rut we seem to be in, but without any certainty that these innovations will deliver something our users actually perceive as valuable.

Why don't our customers view us the same way that McKinsey's clients view McKinsey? Mainly because, as has been said before, executives who do, or should, use CI really don't know what they should be getting from us, nor what to ask for. How does that relate to the rut we're in? Well, if our users don't know what they need, it makes it awfully hard for us to innovate in a way that delivers the goods that they want.

So, the question isn't just "are we in a rut?" and "are we afraid to share?" -- it's also "what do we have to do to provide a lasting, ongoing, valuable service to our clients -- clients who, after 20+ years, still don't have a sound understanding of what CI is?"
I have a short answer for you on this. I am not going to share with you what my company is doing that is innovative in CI, because if I do, I will lose the advantage that it gives me over my competitors. Just as I am not going to share what kind of technical innovation I am doing, why would I want to share anything else that gives me a long-term, sustainable competitive advantage?
Excellent point. Any of the really cool stuff I've seen at a client, or that I've worked with them on, is not going to appear on this blog or anywhere else in print. One of my clients had to beg her management for ten years before she was even able to make a (totally redacted) presentation at SCIP.

It's sort of like playing poker and asking us all to show our hands -- it ain't gonna happen. This is the dilemma of advancing the state of the art in intelligence. What people can talk about doesn't matter and what matters, people can't talk about.
When I was at Chase we were doing some really amazing, cutting-edge CI. I shared some of what we did with the larger CI community, but I never shared the outstanding work that would have compromised our competitive position. I don't think practitioners will share what matters.

My other argument here is that most companies are still in their infancy in terms of CI sophistication; therefore, rather than being in a rut, we are still in early adolescence and there is not much worth sharing.
Hello Melanie. I totally agree with you. What is the objective of doing CI? It is to gain a competitive advantage over your competitors. So why share with your competitors how you do it?

Rut or early adolescence? It does not matter. If we believe CI is strategic to our company's future success, then sharing what and how we do it is nonsense.
I'm sure we all understand the need for discretion when it comes to employers and clients.

Still, a perennial complaint of CI practitioners is that many executives don't know the value of the discipline. This leads to all kinds of nasty things like layoffs and budget cuts.

It seems to me that the value of the intelligence professional - whatever you want to call that person - requires a real public relations effort. It won't just happen on its own. You've got to spread the gospel, or face the dread consequence of decisions made by "the gut" of whichever executive is currently in favor.

Moreover, I can't say I hear of that much more sophisticated work in other fields. I remain pro-CI, so long as it keeps evolving.

Eric Garland
Competitive Futures, Inc.
This is a valid point, Melanie. As I have been thinking about the issue of sharing I have been wondering if vendors and practitioners would look at this differently. For example, would practitioners be more likely to share methods but hold actual data or cases close to their vest, while vendors would flip that share/protect dichotomy?

Adding on to Bill's point about BCG and McKinsey, how do they manage to maintain considerable thought leadership while still protecting their competitive advantage? Any given issue of McKinsey Quarterly or Deloitte podcast is exploding with great information that could easily be considered proprietary. They put it out there, and they are recognized as thought leaders. Customers read or hear the material and respond "I have to hire these guys and pay them boatloads of money!" instead of thinking that McKinsey has just given them the keys to the kingdom. Does this have something to do with expectations of execution?

There is something I have observed that impacts practitioners and vendors alike, and that has to do with customer expectations. It seems like we spend an awful lot of energy either preparing our customers to understand what is and is not possible, or doing damage control when the deliverable doesn't meet expectations. I can't pawn all of this off on the uneducated executive who simply assumes "this is all in some database somewhere" and we can "just Google it." McKinsey and the like go into complex engagements with similar levels of uncertainty as any CI practitioner. Whether or not customers love the final product, the proof of the perceived value somewhere in the corporation is in McKinsey's revenue and billable rates. CI projects that are just as rigorously developed and just as valid are often dismissed because we have not delivered on the expectation that we have a crystal ball with 20/20 vision of the competitors, the market, and the future. What gives?
Not to be glib, but I think the answer to why the big consulting companies make money hand over fist is somewhat analogous to, "No one ever got fired for buying Microsoft."
Hi all, I'm Scott Brown, relative newcomer to the CI space, but glad to be a part of this Ning community and really grateful to be able to be a part of your discussions. My apologies, too, if what I state here is duplicative of what's been posted already.

I see two themes in Bill's initial post: "old thinking", for want of a better paraphrase, and sharing. I'll leave the "old thinking" piece out in my response.

Bill states, "practitioners are simply unwilling to share their secrets outside of their organizations." A couple of things. One, yes, most practitioners won't share their areas of focus, models, etc. outside of the organization for a variety of very good legal, ethical and competitive reasons. But, what I think Bill is driving at is not "sharing secrets" but simply sharing: sharing techniques, experience, thoughts, etc.

I think Seena gets to the root of this. Look at all of the amazing social tools available these days: blogs, wikis, social networks like Ning (!), Facebook, LinkedIn.

There are a few ways to approach these tools. As I'm sure you all know, these can be amazing intelligence tools, great for mining information, much of which simply wasn't available before (or certainly not available on a public site you could sign on to for free!). We could certainly share how we approach these tools, on a spectrum from a beginner point of view to an expert point of view.

What interests me is, frankly, this Ning community and other ways we as practitioners can connect and utilize these tools for sharing and growing what we do. If we don't use them, simply because we're not used to sharing, we're missing the boat.

As I say, I consider myself a relative newcomer, only having really even discovered CI in 2000. I'm no expert on CI, and to be completely honest, I don't consider myself highly connected to the CI community. So I may be missing something that's going on that I simply don't know about. If I am, someone, please clue me in!

I went to my first SCIP conference in 2002, and loved it, though I found it a bit hard to connect with other folks. I went to my first SLA (Special Libraries Association) conference in 2004, and was totally taken by surprise by how much everyone was willing to share - about their work, their struggles, their successes, their lives. It was all about helping each other, and about building a community. Today, I'm Chair of the five-year-old CI Division within SLA.

I think this Ning community is a great first step to bring that sort of feeling and sharing to competitive intelligence.
For the record, I think that "Intelligence Ning Community" sounds like a group of highly-trained executive NINJAS!

Let's cultivate this image for maximum effect! It's the rut-buster we've been looking for!
I wonder whether the problem is that when we define "competitive intelligence" we are using the cold war views on intelligence - and that this limits us. Another discussion looks at this - suggesting "competitive analysis" as a better term. Analysis - as a general term - forces us to look at new approaches.

John McGonagle comments that the CI cycle as it is taught by most CI practitioners is wrong. I actually think that the standard cycle is dangerous as it limits thinking. It implies that CI can only come about in a planned, defined manner - and that anything that doesn't fit the cycle should be discarded. Of course this is nonsense - often the random pieces of information from unexpected sources are absolutely key. You can't plan to collect such information because it doesn't fit into any existing "key intelligence topic" (another bête noire), and so a rigorous following of the KIT process and the intelligence cycle could mean such information is missed or discounted, with disastrous results. (Just look at many of the "intelligence failures" that are discussed - especially with respect to national defense - and you can see the consequences when information was overlooked because it didn't fit into the expected pattern.) CI should be planned - but part of the plan should allow for the unexpected.

You mention Porter's 5-forces. It's interesting how that is still pushed to the extent it is. Of course it's important, but even Porter has stated that there have been developments in the model since then (for example, in a footnote in the revised 1998 introduction mentioning the work of Brandenburger and Nalebuff). The 5-forces model reflects the age it was written in - the 1970s (Competitive Strategy was published in 1980) - in other words, an age before true globalisation and consolidation of industries. Not just pre-Internet, but the tail-end of the industrial revolution (using Alvin Toffler's view of ages, with us now being in the "informational revolution" stage). Essentially it is a model that needs updating - and indeed it has been, by Cinzia Parolini, for example, in her book "The Value Net", which takes into account how processes have changed, the rise of outsourcing, and how changing production methods affect the value chain and the supplier/buyer power expressed in the original model.

In that same introduction, Porter refers to Sharon Oster's book "Modern Competitive Analysis" as a work that takes his ideas forward, as well as other work by Nalebuff and Dixit. To many CI practitioners, these are unknown names. Maybe it's not CI that's stuck in a rut, but too many CI practitioners that have just not kept up to date with the latest thought on understanding the competitive environment in which companies operate.

I'd even go further. Many CI practitioners haven't even kept up-to-date in the latest collection techniques. I'm basing this on the classes I've given on Internet / Online research. Too many see basic Google searches as their only way of finding information on the Internet - without understanding how Google works and other tools that often give better results. As a result, you still hear CI practitioners saying that the Internet is rubbish with too much useless information and should not be used as a source for CI.
I agree that most of the available CI models talk about how to "do" it; to my knowledge, they also rarely go beyond that.

For CI to become a mainstream discipline, it is also crucial to focus on demonstrating why CI (or CTI, in my case) is important to an organization. Nobody today would question the value of having a Marketing, Sales, or Finance department, for example, but when it comes to CI it is not as obvious.

I wonder if it is because we still need to come up with clear and simple ways to establish the link between the actual CI unit deliverables, the decision at stake, and the final outcomes (increases in sales, cost avoidance, etc.) with precision and in a way that can be replicated over time.

That way, the overall understanding and awareness of CI will rise, from industry to academia to government.

This is something that has probably already been addressed at SCIP?

I know there is a workshop about "ROI on CI" in a few weeks, but I wish I could find a model that would allow me to show corporate decision-makers what CI's impacts are, how they are measured, and ultimately why they need it. Are some of you aware of such a model, or of studies in that area, and how they came up with tangible measures that speak the language of CEOs et al.?

I agree that sharing what we do is difficult given the confidential/proprietary nature of our work, but I believe this is a "shareable" topic of common interest for the profession and a way to bring the discipline to the next level.

My 2 cents...


© 2024   Created by Arik Johnson.   Powered by

Badges  |  Report an Issue  |  Terms of Service