Tactical, Operational & Strategic Analysis of Markets, Competitors & Industries
I am currently investigating the topic of Strategic Early Warning System (SEWS).
While many examples exist in industry and services, surprisingly I could not find any application in agriculture, except for one research project in China.
If you know of any SEWS application in agriculture, I'd be glad if you could share those examples.
I'm not sure what specific early-warning angle you are looking at with regard to agriculture, but I'll share a few items from a report I put together a couple of years ago on SEW systems across multiple domains. It crosses over into monitoring salmonella outbreaks, species depletion, infectious diseases in animals/plants/humans, contamination of food and water resources, bioterrorism, and exposure to chemical and radio-nuclear agents, all of which could naturally impact the food supply.
Let me know if you want more.
I. A. For the purpose of anticipating disease outbreaks, bioterrorism, enabling better response
(1) Health Map
Health Map commenced in 2006 as a “side gig” project between John Brownstein of Harvard Medical School and a software developer named Clark Freifeld of the Children’s Hospital Informatics Program, which is part of the Harvard-MIT Division of Health Sciences and Technology. The goal was to expand disease surveillance and pick up on outbreaks sooner than official channels by leveraging mathematical algorithms, data visualization technology, and semantic web capabilities.
Health Map caught the attention of Google.org (Google's philanthropic arm) in late 2007, and they have since been funding the project in conjunction with the National Library of Medicine (NLM), the Canadian Institutes of Health Research (CIHR), and the CDC. Google, however, seems to be going in its own direction of late with the InSTEDD project (see Section I.A.(3)).
The Global Public Health Intelligence Network (GPHIN), ProMED-mail, the DoD Global Emerging Infections System, the Health Protection Agency, the Wildlife Conservation Society, and the CDC are collaborators/users.
So far, the program can accurately determine the validity of a report 95% of the time, often days before the WHO and the CDC. A recent example is a salmonella outbreak in the US that sickened more than 1,000 people, the cause of which remains unknown; Health Map detected this outbreak before the CDC announced anything.
The Health Map team is presently expanding operations, increasing the amount of detailed information for each specific outbreak, and incorporating more “noisy” sources of information, such as blogs and chat rooms.
Health Map also offers a global wildlife disease map. URL: http://www.healthmap.org/en
I. A. For the purpose of anticipating disease outbreaks, bioterrorism, enabling better response
(2) GPHIN I and II
The initial version of Canada’s Global Public Health Intelligence Network (GPHIN I) was created in 1998 by the Public Health Agency of Canada (PHAC) in conjunction with the World Health Organization (WHO). Notably, Ted Turner’s Nuclear Threat Initiative (NTI) provided some financial assistance. The idea was to have an early warning system for public health threats worldwide and to enable emergency preparedness and planning. A second iteration, GPHIN II, came online in 2004 and added expanded language capabilities. It was collaboratively developed with Nstein, a Montreal-based corporation that brought linguistic AI acumen to the table. Future iterations will incorporate advanced analytical capabilities.
GPHIN is a secure, internet-based early warning system that gathers preliminary reports of health significance in seven languages on a real-time, 24/7 basis by monitoring global media sources such as news wires and websites. GPHIN leverages linguistic, machine translation, and data mining IP. GPHIN tracks topics such as disease outbreaks, infectious diseases in animals/plants/humans, contamination of food and water resources, bioterrorism, and exposure to chemical and radio-nuclear agents. GPHIN also monitors issues related to the safety of products, drugs, and medical devices.
GPHIN’s budget is in the vicinity of $1.5M per year. GPHIN has a staff of 17-18, which includes 14 analysts, one IT specialist, an office clerk, and a program chief.
Specific collaborators/users include: the Canadian Security Intelligence Service (CSIS), the Canadian Food Inspection Agency (CFIA), the Royal Canadian Mounted Police, the UN Food and Agriculture Organization, the World Organization for Animal Health (OIE), the European Commission, the European Centre for Disease Prevention & Control (ECDC), and the CDC.
- Government institutions: Ministries of Health, Military, Depts. of Agriculture, Chemical Safety…
- Nations: Algeria, Australia, France, Italy, Germany, Netherlands, Switzerland, Wales, United States.
PHAC indicates that the system has been useful and cost effective. GPHIN had a 96% accuracy rate in 2006. Fees for using the system run from $30K per user for government and nonprofit agencies to $100K per user for others.
Health Map collaborates with, and feeds/pulls from, GPHIN. Earlier work in this biosurveillance/epidemic-prediction arena (i.e., Bio-ALIRT and BioSense from the Division of Emergency Response) focused more on connecting internal data between local, regional, and national healthcare providers and government actors, not necessarily leveraging the web and, seemingly, NGOs or individuals.
I. A. Predicting disease outbreaks, monitoring bioterrorism, enabling better response to pandemics
(3) Google InSTEDD
Google.org has focused heavily on global public health to spearhead strategic initiatives for its philanthropic efforts, and Larry Brilliant came up with the idea of building InSTEDD, which stands for International Networked System for Total Early Disease Detection, with a mission similar to PHAC’s GPHIN. Initially, Brilliant proposed that Google’s system would leverage PHAC’s GPHIN, but Google abandoned that idea in 2007: Brilliant thought Google could do better with its extensive range of technologies and build a “GPHIN on steroids” with 150-language capability vs. GPHIN’s seven. With that in mind, Google appears to be proceeding in this direction.
InSTEDD leverages multiple web crawlers, GeoChat SMS, infobots, historical databases, satellite photography, and a system entitled Evolve. Evolve consists of several high-level modules: 1) data aggregation and gathering; 2) automatic feature extraction, data classification, and tagging; 3) human input, hypothesis generation, and review; 4) predictions and alerts output; and 5) field confirmation and feedback.
The data aggregation and gathering module allows users to collect (or extract, transform, and load (ETL)) information from several sources: SMS messages (e.g., GeoChat), RSS feeds, email lists (e.g., ProMED, Veratect, Health Map, BioCaster, EpiSpider), OpenROSA, Map Sync, Epi Info™, documents, web pages, electronic medical records (e.g., OpenMRS), animal disease data (e.g., OIE, AVRI hotline), environmental feeds, NASA remote sensing, etc.
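As a minimal sketch of what such an aggregation layer could look like (the schema, field names, and adapter functions here are invented for illustration, not taken from InSTEDD's actual design), each feed type gets its own adapter that normalizes records into a common shape:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Report:
    """Common shape that heterogeneous feeds are normalized into (hypothetical)."""
    source: str        # e.g. "rss", "sms"
    text: str          # raw message content
    received: datetime

def from_rss(item: dict) -> Report:
    # Hypothetical RSS item: {"title": ..., "summary": ..., "published": <unix ts>}
    return Report(source="rss",
                  text=f"{item['title']}. {item['summary']}",
                  received=datetime.fromtimestamp(item["published"], tz=timezone.utc))

def from_sms(msg: dict) -> Report:
    # Hypothetical SMS record: {"body": ..., "timestamp": <unix ts>}
    return Report(source="sms",
                  text=msg["body"],
                  received=datetime.fromtimestamp(msg["timestamp"], tz=timezone.utc))

# One adapter per feed; downstream modules only ever see Report objects.
feed = [
    from_rss({"title": "Respiratory illness cluster", "summary": "reported in region X", "published": 1_200_000_000}),
    from_sms({"body": "clinic reports 12 new fever cases", "timestamp": 1_200_003_600}),
]
```

The point of the design is that the later modules (tagging, hypothesis review) never need to know which of the many source formats a record came from.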
The automatic feature extraction, data classification, and tagging module is architecturally extensible, allowing the introduction of machine learning algorithms (e.g., Bayesian, SVM). These components extract and augment the features (tags or metadata) from multiple data streams, such as source and target geo-location, time, route of transmission (e.g., person-to-person, waterborne), etc. In addition, these components help detect relationships between the extracted features within a collaborative space or across different collaborative spaces. Furthermore, with human input, these components can suggest possible events or event types (e.g., at the earliest stages of a disease outbreak: “there is an unknown respiratory event, transmitted person-to-person, detected in location X, and with a certain spatio-temporal pattern”).
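A toy illustration of the tagging idea, using simple keyword matching as a stand-in for the statistical classifiers the module description mentions (the tag vocabulary and keyword lists are invented for the example):

```python
# Minimal keyword-based tagger -- a stand-in for learned classifiers
# (e.g. naive Bayes, SVM); tags and keywords are illustrative only.
TAG_KEYWORDS = {
    "route:person-to-person": ["person-to-person", "household contact"],
    "route:waterborne": ["waterborne", "drinking water", "village well"],
    "domain:animal": ["poultry", "swine", "cattle", "wildlife"],
}

def extract_tags(text: str) -> set:
    """Attach metadata tags to a raw report based on keyword matches."""
    lowered = text.lower()
    return {tag for tag, words in TAG_KEYWORDS.items()
            if any(w in lowered for w in words)}

tags = extract_tags("Waterborne outbreak suspected near village well")
# tags == {"route:waterborne"}
```

In a real system the keyword table would be replaced by trained models, but the interface (raw text in, metadata tags out) is the same.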
The human input and review module is a set of functionalities that allows users to comment on, tag, and semantically rank the elements (positive, neutral, or negative). Additionally, users can generate and test multiple hypotheses in parallel, collect and rank sets of related items (evidence), and model against baseline information (for cyclical or known events). The system maintains a list of ongoing possible threats, allowing domain experts to focus their field information and either confirm or reject the hypotheses created; that feedback is then fed back into the system to update (increase or decrease) the reliability of the sources and the credibility of the users in light of their inferences or decisions.
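The feedback step at the end of that loop can be sketched roughly as follows; the update rule, step size, and source names are assumptions made for illustration, not InSTEDD's actual mechanics:

```python
# Sketch of the feedback step: when field experts confirm or reject a
# hypothesis, the reliability score of each contributing source is nudged
# up or down. Rule and scores are illustrative, not InSTEDD's.
def update_reliability(scores: dict, sources: list, confirmed: bool,
                       step: float = 0.05) -> None:
    """Move each contributing source's score toward 1.0 (confirmed) or 0.0 (rejected)."""
    target = 1.0 if confirmed else 0.0
    for s in sources:
        current = scores.get(s, 0.5)   # unknown sources start neutral
        scores[s] = current + step * (target - current)

scores = {"promed": 0.9, "blog-x": 0.5}
update_reliability(scores, ["promed", "blog-x"], confirmed=True)
# Both scores move slightly toward 1.0; sources that are repeatedly wrong
# drift toward 0.0 and can be down-weighted in later aggregation.
```

A scheme like this lets noisy sources (blogs, chat rooms) contribute signal without being trusted at the same level as curated feeds.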
The team is also working on other cutting-edge components to support the InSTEDD effort; see here: http://www.lunchoverip.com/2008/03/instedd-update.html
Resources/Budget: Google has thrown in at least $5M. Rockefeller Institute and NTI are also participating.
I. A. Predicting disease outbreaks & pandemics, enabling better response
(4) Veratect
Veratect, founded in 2008, is a small, privately held company headquartered in Kirkland, WA. It has around 75 employees and offers pandemic-type early warning and region-specific surveillance to the corporate, insurance, financial services, NGO, and government sectors in the form of its Foreshadow/PEER View analytical product lines. Its claim to fame is that it was recently credited with first detecting the “swine flu” outbreak, beating the WHO and the CDC.
The Google InSTEDD system pulls from Veratect. See here: http://www.veratect.com/
Veratect's systems are localized in 30 languages, and the company indicates it has 500+ customers.
I. B. For predicting ecological change
(1) University of East Anglia & Stockholm University
University of East Anglia researcher Victor Galaz, in conjunction with the Natural Resource Management group at Stockholm University, proposes in the journal Frontiers in Ecology and the Environment (March 2009) that web crawlers can be used to anticipate ecological change, referencing the successful work that has been done in this domain around disease outbreaks (see: Health Map, GPHIN). They stipulate that current official monitoring systems have significant gaps when it comes to being anticipatory and having sufficient data. For the article, see here: http://www.esajournals.org/doi/abs/10.1890/070204
First, web crawlers can collect information on the diverse drivers of ecosystem change, rather than on the resultant ecological response. For example, if rapidly emerging markets for high-value species lead to overexploitation and collapse of fisheries, web crawlers can be designed to collect information on changes in the prices of the key species, landings, or investments in particular regions.
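A crude sketch of what a detector downstream of such a crawler might do with collected price data (the threshold, window, and figures are invented for the example):

```python
# Toy detector for the fisheries example: flag weeks where the price of a
# key species exceeds its recent trailing average by a large margin.
def price_alerts(prices: list, window: int = 4, jump: float = 0.25) -> list:
    """Return indices where price exceeds the trailing-window mean by `jump` (25%)."""
    alerts = []
    for i in range(window, len(prices)):
        baseline = sum(prices[i - window:i]) / window
        if prices[i] > baseline * (1 + jump):
            alerts.append(i)
    return alerts

# Weekly prices for a hypothetical high-value species: stable, then a spike
weekly = [10.0, 10.2, 9.9, 10.1, 10.0, 13.5, 14.0]
print(price_alerts(weekly))  # indices 5 and 6 flag the sudden jump
```

The same shape of detector could watch landings or regional investment figures instead of prices.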
Second, and less certain, future early warning systems can make use of recent insights showing that ecosystems sometimes “signal” a pending collapse. The variability of fish populations, for example, has been shown to increase in response to overexploitation.
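That variability signal is easy to illustrate with a rolling-variance calculation over a synthetic population series (the numbers here are made up for the example):

```python
# Rolling-variance check for the early-warning signal described above:
# variance in a population time series rising ahead of a collapse.
def rolling_variance(series, window: int = 5):
    """Population variance over each trailing window of the series."""
    out = []
    for i in range(window, len(series) + 1):
        chunk = series[i - window:i]
        mean = sum(chunk) / window
        out.append(sum((x - mean) ** 2 for x in chunk) / window)
    return out

# Synthetic fish-stock index: steady at first, then increasingly erratic
stock = [100, 101, 99, 100, 100, 102, 95, 108, 90, 112]
var = rolling_variance(stock)
# Variance in the later windows is far higher than in the early ones --
# the kind of rise such a warning system would watch for.
```

In practice the monitoring system would compare the rolling variance against a baseline for the stock and raise an alert when it trends upward persistently.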