Competitive Intelligence

Tactical, Operational & Strategic Analysis of Markets, Competitors & Industries

Staying updated about analysis methods for CI?



I would like some ideas on how to stay updated about the development of new methods for CI analysis.


I've been a CI practitioner since 1996 and have taken most of the relevant courses, at least in Sweden, where I'm based (and have given several of them as well), and I have read many of the "must-reads" in the field. Hence, I have a substantial toolbox at my disposal to use in the different cases that I encounter in my job as CI manager (and as CI consultant before that).


However, I wouldn't be doing my job if I settled for this, especially not in a field such as CI - fast evolving, pragmatic, and focused on change at its very core... Instead I'm left with the all-too-familiar unsettling feeling that I might be missing something here :-) So I would appreciate some ideas from other practitioners - how do you make sure that you stay updated about the latest developments in analysis?


Looking forward to your insights!


Kind regards,







Replies to This Discussion

Thanks for bringing up such a great discussion topic, Henrik. Apologies for the long reply, but it struck a nerve, and although I don't know if I ever answered your question, it brought up some related questions for us to consider.

Outside academia, I think it's pretty challenging actually (without an explicit "pro-am" style community like this one built around sharing such new techniques in context) to keep abreast of analytical methods as they emerge. Indeed, I think SCIP's old JCIM imprint would/could have been a material contributor in this respect. So, I'd like to noodle a little on the core problem you've identified, if you'll allow me, because I think it'll help us solve it and, hopefully, make for a better way of doing exactly what you suggest - staying abreast of new analysis methods as they're developed. Please forgive my solipsist's dialog as a literary device in getting at that result ;-)

Because of the "competitive" context of discussions conducted on sites like this one, public LinkedIn groups, and even hashtagged Twitter streams, many of us often wonder if the most useful tools are actually being kept exclusive or confidential, away from prying eyes like ours, like some sort of secret weapon? Or at least until a would-be guru can turn it into a pop business book to capitalize on their mashup of a couple of old ideas to create something new. I've had colleagues presenting at conferences even tell me as much - they'll share some of their stuff, but the best ideas they save for themselves (and, presumably, their employer).

Thankfully, this is changing because of the very nature of the social web's breakdown of barriers to knowledge transfer across organizations. As a former consultant, Henrik understands that knowledge transfer such as that described above can be done in a high-trust, confidence-secure context, and that it has always been one of the management consultant's primary value propositions for clients new and old alike. Why would you hire a new consultant without some promise of learning something new you couldn't get more easily by simply reading a book - something you can then apply to your business to gain a 10X return on that investment in knowledge capital? Where do you think all those consultants learned those techniques in the first place? Do you think that knowledge was immaculately conceived, fatherless, and reared to adulthood by the singular brilliance of the consultant themselves?

Hardly. That novel knowledge came from having observed patterns in prior experience long enough to know what was both valid and reliable enough to build a heuristic for future applications. Proprietary? Perhaps. But unique to that individual? Not very often. Having done it myself for substantially my entire adult life, I always find it hard to believe new methods (for analysis or anything else) are born of novel invention by a single individual.

Happily, I believe the cultivation of communities for sharing such knowledge will become the primary role of professional social media going forward. Not that consultants and experts will diminish in value - simply that their value will be enumerated by the sheer volume of transparent knowledge transfers which can be attributed to them... never again in terms of how safely they guard their priesthood of secrets.

All this again compels me to ponder the words we use to describe what we do - the term "analysis" for example is often as misleading as the words "competitive intelligence" - so, what do we mean, precisely, by analysis?

To answer my deliberately rhetorical question, I'll suggest most of us use analysis as shorthand to describe the applied frameworks of interpreting meaning from the corpus of information surrounding us. However, to quote Wikipedia, analysis is actually the process of breaking things down - the opposite of synthesis, which is to collect things together to form a new whole.

I apologize for my "semanticism" (is that a word?) on all of this, but I think our nomenclature grows more obsolete every year. I think, too often, we unintentionally limit our scope of study to topics like "collection" or "analysis" because some or another of our professional ancestors have dogmatically hewn to a definition that no longer applies.

Take, for example, prediction markets: PMs have emerged over the past couple of decades as valid, reliable methods for forecasting events by aggregating information held by a trading population who place a value or probability on one or another particular outcome, where the knowable facts support a hypothesis beyond simple guessing. Indeed, PMs have furnished us with an incredibly valuable managerial decision-making toolkit that nearly takes the analysis out of decisions entirely. If your crowd can tell you how many LCD TVs you're going to sell this Christmas, why would a firm like Best Buy bother doing research on it?
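For concreteness, one common PM mechanism (the thread doesn't name a specific one, so treat this purely as an illustration) is Hanson's logarithmic market scoring rule, which turns the shares traders have bought on each outcome into the market's implied probabilities:

```python
import math

def lmsr_prices(shares, b=100.0):
    """Logarithmic market scoring rule (LMSR) prices.

    shares: outstanding shares bought on each possible outcome.
    b: liquidity parameter - a larger b makes prices move more slowly.
    Returns the market's implied probability for each outcome.
    """
    exps = [math.exp(q / b) for q in shares]
    total = sum(exps)
    return [e / total for e in exps]

# With no trades the market is agnostic between two outcomes;
# buying shares on one outcome raises its implied probability.
print(lmsr_prices([0.0, 0.0]))   # -> [0.5, 0.5]
print(lmsr_prices([80.0, 0.0]))  # first outcome now priced above 0.5
```

The point of the sketch is that no single analyst sets the forecast: each trade nudges the prices, and the resulting probabilities aggregate whatever knowable information the trading population holds.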

Methods such as PMs actually call for a broader definition of our identity - far beyond "decision support" or "early warning" I think... We need to know how to ask the right questions in the first place, figure out how to engage that collective intelligence, and earn the trust of decision and policy makers as the principal skeptics in the room, designed to poke holes in their delusions of grandeur while plans are still little more than thought experiments and before precious capital is invested foolishly.

Unfortunately, that job doesn't pay very well in most of our institutions... at least, for now.

However, I think it will someday. We need to take advantage of the current crises around us to help create a new field of understanding - one where new methods are driven, not by the desire of some would-be guru to build a book-writing or keynote-speaking career, but by our society's collective need to avoid the mistakes that will eventually bring down our most important institutions.

I'm reminded of that mythic (in other words, false) quote attributed to Charles Duell, former Commissioner of the U.S. Office of Patents, which claims that he said, "Everything that can be invented has been invented." Bill Fiora had a post here some time ago about the apparent absence of innovation in the CI field that really struck a lot of people in its poignancy - indeed, perhaps looking at CI as a "field" is part of the problem. It's often been suggested that CI should instead be thought of as a collection of skills and competencies that, put together, offer a value to both the individual and organization that individual serves.

In that respect, maybe we should start identifying ourselves with the outcome value of our skills and competencies and forget about the obsolete nomenclature once and for all?
Arik, thanks for a very interesting response!

You bring up some dimensions to this issue which I had not really thought about. I agree for example that there is a tendency to keep some of the tools to oneself (even though they are not necessarily "reared to adulthood by the singular brilliance of the consultant themselves"). On the other hand, it might be possible to make a distinction between tools that are already "out there" and therefore fully possible to discuss in forums like this and those that someone still feels, with or without reason, are better kept under wraps.

One could also discuss the nomenclature, of course. You make some very valid points here, and looking at the Wikipedia definition, it is easy to feel that the term doesn't quite cover what I, and maybe most of us, refer to. Personally, I see analysis (or whatever we prefer to call it), more as a way of rearranging the information, sometimes breaking it down, sometimes making a synthesis, sometimes just - rearranging.

As for prediction markets, I totally agree that they give us a whole new way to forecast the future. However, I'm somewhat doubtful as to whether they free us from limiting assumptions and biases about the future. I don't see how even collective intelligence would achieve that. But as a valuable tool, absolutely. Do they make the distinction between information collection and analysis more blurred? Quite possibly. But I think we will still need the tools that were formerly known as methods for analysis (I'm coming to think of a famous artist here...), and they will remain useful in the future.

But, as I outlined in the beginning, we need to stay on our toes to discover new tools to put in the toolbox. PMs would be one relatively new tool. And communities such as this one will absolutely be part of the solution I'm looking for!

Thanks Henrik - agreed on all counts.

Perhaps rather than "analysis" a more accurate term would be "interpretation", because it implies contextualized meaning - as in translation from a foreign language - as well as internalization by the individual (in our case, the organization)? Just thinking out loud on the nomenclature...

Regarding PMs, I think the blowback that organizations find in their use most often applies to seeing them as a panacea for prediction - maybe again a nomenclature problem. PMs are *only* ever valid for answering questions where there is knowable information which can be aggregated. For example, the PMs a while back wondering where the next Olympics would be held were false because knowledge of that information exists *only* among the IOC mafiosi - who I suspect would find themselves very quickly outcast under their version of the omerta code of silence were it learned they were trading securities around such a high-stakes question. Likewise, forecasting guesses such as a 5-year CAGR are based, most fundamentally, on averages compiled from knowledgeable insiders - buyers and sellers of the product - and analyst firms try to predict unit/revenue volumes by "statistically" applying heuristics from prior successful (or otherwise) similar forecasts. PMs would, I think, be pretty weak in such an application.

The real key to markets for any uncertainty is to compel users to play with their own capital - vested interests alone reflect how probabilities are internalized. As a result, there is a significant question about whether play-money markets can ever really be seen as any more valid than the PE firms and hedge funds who invest with the house's money (and ultimately make terrible decisions about where to place their bets).

You might want to start a group on the topic to bring interested Ning members together around it - enthusiasm is a marvelous host ;-) Call it Analysis, Synthesis and Interpretation if you like... or whatever. Assuming there isn't one already that is - you might have a ready-made group here waiting for you to share something.

My response to your question is so vague as to be almost non-responsive. However, something that helps me continue to re-evaluate my own analytical skills and tools is to ponder, repeatedly, the biases and blind spots I and my CI clients suffer from. There are so many it's staggering!

Still, I expect you are looking for more specific 'new' analytical tools and approaches. It will be great to hear what others have to say about this.


Absolutely, staying - or getting - aware of biases and blind spots is vital to doing good CI work. And hopefully, that is why one might feel that more skills and tools are needed, and this feeling might in itself be a signal that the biases are not that big...

I am setting up a tool for tracking CI info and will post a link here in a couple of weeks. It is based on a freemium model (tracking one topic will be free). This is how it works:

1. You choose a topic
2. We look up the topic in dmoz (an open directory) and identify sub-topics (if it does not exist on dmoz we automatically search on Google and extract some of the top links and manually curate them a bit)
3. You can choose one or more sub topics to track (for example CI for healthcare)
4. We find blogs and feeds in this space (topic/subtopic) and subscribe to them
5. We take the feeds and store them in a search server (Solr)
6. We take each subtopic and create a search
7. The feeds are read at regular intervals (a couple of times a day) and the search is run daily and produces an infostream (either as email alert or RSS feed).
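In code-sketch terms, the subscribe-then-search steps above boil down to something like this (a toy stand-in with made-up feed data - the real service uses Solr rather than in-memory matching):

```python
import xml.etree.ElementTree as ET

def parse_feed(xml_text):
    """Extract (title, link) pairs from a minimal RSS 2.0 feed."""
    root = ET.fromstring(xml_text)
    return [(item.findtext("title", default=""),
             item.findtext("link", default=""))
            for item in root.iter("item")]

def build_infostream(feeds, subtopic_terms):
    """The daily 'search' step: keep items mentioning any subtopic term."""
    matches = []
    for xml_text in feeds:
        for title, link in parse_feed(xml_text):
            if any(term.lower() in title.lower() for term in subtopic_terms):
                matches.append((title, link))
    return matches

# Hypothetical feed content, standing in for the subscribed blogs.
feed = """<rss><channel>
<item><title>CI for healthcare: new report</title><link>http://example.com/1</link></item>
<item><title>Unrelated post</title><link>http://example.com/2</link></item>
</channel></rss>"""

print(build_infostream([feed], ["healthcare"]))
# -> [('CI for healthcare: new report', 'http://example.com/1')]
```

Running that match over every stored feed item a couple of times a day, per subtopic, is essentially what produces the email alert or RSS infostream.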

Why are we telling you all these geeky details? You can probably do this all yourself (some of them manual and some of them automated).

We also have a product called InfoMinder (it has been around for about 5 years) that lets you track web pages for changes. You can locate a few CI portals and track them using InfoMinder.
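A minimal version of the change-tracking idea (a hypothetical sketch, not InfoMinder's actual implementation) just fingerprints each page and compares against the last-seen fingerprint:

```python
import hashlib

class PageTracker:
    """Remember a fingerprint per URL and flag when the content changes."""

    def __init__(self):
        self._seen = {}

    def has_changed(self, url, html):
        """Return True if the page differs from the last version we saw."""
        fingerprint = hashlib.sha256(html.encode("utf-8")).hexdigest()
        previous = self._seen.get(url)
        self._seen[url] = fingerprint
        return previous is not None and previous != fingerprint

tracker = PageTracker()
tracker.has_changed("http://example.com/ci-portal", "<html>v1</html>")  # first visit: False
tracker.has_changed("http://example.com/ci-portal", "<html>v2</html>")  # changed: True
```

A production tracker would of course fetch the pages on a schedule and usually normalize out timestamps and ads before hashing, so that only meaningful changes fire an alert.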

I hope this is not inappropriate. I will be happy to help you on finding DIY (do-it-yourself) tools to stay on top of information.
Thanks, good tools for information collection are always useful, and an important part of the job. Right now, though, I was more looking into the "analysis toolbox", where you would find tools such as scenario planning, Porter's Five Forces, early warning systems, wargames, SWOT and the like.

Thanks for the reply. I thought I was answering the meta question "how do you make sure that you stay updated about the latest developments in analysis?" Apologies if that did not come through well in my reply.


On top of all the suggestions above, I believe this: if it motivates you, we'll all benefit:

If, after learning all analysis techniques, after taking all relevant CI courses, after reading all management theories known to mankind and after keeping in touch with all relevant communities sharing knowledge in the CI field, you believe there could be more elsewhere, then that little extra is what you're about to discover. In other words, if you believe there is one more notch to take CI to, I am not sure what it would be, but you could be on the brink of something new, and a deep introspection could reveal it.

I will leave it here for others to comment further.


I must admit that I am far from able to say that I have read all management theories known to mankind... However, your answer is still very motivating! We'll see if I'm on the brink of something new or just chasing ghosts.


Of course, that's a good point!


That is such a great question and Arik's response is just amazing.... Got me too thinking of other things.

Aside from information collection techniques and an understanding of ourselves vis a vis blindspots, the key to the CI process as we all know is the analysis.

The short answer to your question however is that I personally keep playing with new and different techniques I come across or have written about when an appropriate intelligence question arises.

I see analysis truly as a mixture of both art and science and tend to define analysis as a multidisciplinary combination of scientific and non-scientific processes by which an individual interprets data or information to produce insightful intelligence findings and recommendations for action.

For me, this means that analysis is constantly evolving in light of the issue at hand. We can use some tools that are well known (science) and then give them a little twist (art), or combine well-known techniques together (e.g. the Nine Forces) to assist with the examination of the information.

Surprisingly, over the past 15 years of doing CI assignments, I have found that two key learnings have stood out.

Firstly, executives in companies tend to ask the same questions over and over again. I have done over 300 CI projects and have to say I do not frequently have the opportunity to move out of the same dozen techniques. Why?

Two reasons: 1. because my CI clients' questions were essentially the same. The words may have been different but the end goal was the same - Should I enter this market? What are my competitors up to - I heard this rumour? Is there a market for this new product idea? Etc. 2. Trying to explain complicated rather than simplified analytical techniques tended to distance the client from the project and the insight.

The second learning I have gained is that evolving analysis works well when the persons applying the techniques understand the basic analytical process of a particular tool well. This allows for what I call creativity. Because you understand the limitations and/or strengths of a tool or tools, you are able to devise ways to enhance the value of the output.

When Craig and I wrote our first two books on analysis, we identified over 200 analytical techniques in management – some old, some new, some easy, some complicated. The question is which ones are right for you?

As I mentioned earlier, I think that will depend on what your intelligence questions are and how much time you have to answer them. So if you want to stay aware, and learning, of different analytical techniques, my suggestion would be to read and then practice, practice, practice with all the different intelligence questions you face in your role. Sorry there is no easier way!

As a cynic I would also suggest that many people don't actually apply analysis in their CI process but rather synthesis (see Arik's comment).

I am happy to discuss this further with you to address specific issues you may have and to share my favourite tool box. I will share if you will!!



© 2024   Created by Arik Johnson.   Powered by
