Doctrinally, Priority Intelligence Requirements (PIRs) represent information that the commander needs to know in order to make a decision or achieve a desired effect. Networked warfare provides the intelligence officer with access to multitudes of sensor outputs and reports, often from unfamiliar sources. Counterinsurgency requires evaluating information across all PMESII-PT categories: Political, Military, Economic, Social, Information, Infrastructure, Physical Environment, and Time. How should analysts evaluate this information? NATO's STANAG (Standardization Agreement) 2022 requires that every piece of information in intelligence reports used to answer PIRs be evaluated along two independent dimensions: the reliability of its source and the credibility of the information. Recent developments in information retrieval technologies, including social search technologies, incorporate metrics of information evaluation, reliability, and credibility, such as Google's PageRank. In this paper, we survey current approaches to automatic information evaluation and explore their applicability to the information evaluation and PIR answering tasks. (Presented at Fusion 2010)
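The two-dimensional STANAG 2022 scheme described above can be sketched as a pair of independent ratings. The letter and number scales below are the standard's own; the `rate` helper and its usage are purely illustrative.

```python
# STANAG 2022 evaluates each piece of information on two independent axes:
# source reliability (A-F) and information credibility (1-6).

RELIABILITY = {
    "A": "Completely reliable", "B": "Usually reliable",
    "C": "Fairly reliable", "D": "Not usually reliable",
    "E": "Unreliable", "F": "Reliability cannot be judged",
}

CREDIBILITY = {
    "1": "Confirmed by other sources", "2": "Probably true",
    "3": "Possibly true", "4": "Doubtful",
    "5": "Improbable", "6": "Truth cannot be judged",
}

def rate(reliability: str, credibility: str) -> str:
    """Combine the two independent ratings into the conventional letter-number code."""
    assert reliability in RELIABILITY and credibility in CREDIBILITY
    return reliability + credibility

# A report from a usually reliable source whose content is probably true:
print(rate("B", "2"))  # -> B2
```

Because the dimensions are independent, a completely reliable source can still deliver doubtful information ("A4"), and an unevaluated source can deliver confirmed information ("F1").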
Current Approaches to Automated Information Evaluation and their Applicability to Priority Intelligence Requirement Answering
1. Fusion 2010, 13th International Conference on Information Fusion, EICC, Edinburgh, UK, Thursday, 29 July 2010. Current Approaches to Automated Information Evaluation and their Applicability to Priority Intelligence Requirement Answering
6. Question-Answering Technologies by Source Data Format (VIStology | FUSION 2010 | Edinburgh | www.vistology.com)
- Tables (relational DBs, spreadsheets). Familiar application: Structured Query Language (SQL). Advanced application: Wolfram Alpha (Mathematica).
- Text. Familiar: web search engines (Google, Yahoo!, Ask). Advanced: systems from the AQUAINT (IC) competition; IBM Watson.
- Tagged text. Familiar: Google Patent Search. Advanced: Metacarta; Palantir.
- Logic statements. Familiar: Prolog. Advanced: Powerset (acquired by MS Bing); Cyc.
- Trusted teammates. Familiar: personal communication. Advanced: Yahoo! Answers; Vark (acquired by Google); US Army Intelligence Knowledge Network Shoutbox.
7. Structured Data Q-A: Wolfram Alpha. The query "Where was Elvis born?" is automatically translated into the Mathematica query: Elvis Presley, place of birth. Wolfram Alpha identifies Tupelo as where Elvis was born (Elvis disambiguated as Elvis Presley) and provides a map overlay plus additional information, such as the current city population. Reference sources are listed by title on another screen, with no access to the source data.
8. Text: Google. Google disambiguates the query (Elvis = Elvis Presley) via PageRank. The top-ranked snippets can easily be scanned for a consensus answer from independent sources: Tupelo, MS. PageRank is less useful in the military intelligence context because intelligence reports are not hyperlinked.
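The eigenvector-centrality idea behind PageRank can be sketched in a few lines. This is a minimal illustration, not Google's production algorithm; the link graph and damping factor are assumptions for the example.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iterative PageRank over a link graph.

    links: dict mapping each page to the list of pages it links to.
    Returns a dict of ranks summing to 1; a page linked to by many
    highly ranked pages is itself ranked highly (eigenvector centrality).
    """
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page in pages:
            targets = links.get(page, [])
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for t in pages:
                    new_rank[t] += damping * rank[page] / n
        rank = new_rank
    return rank

# Toy graph: both "b" and "c" link to "a", so "a" ends up ranked highest.
ranks = pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]})
```

The deck's point follows directly: the computation depends entirely on the hyperlink structure, which intelligence reports lack.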
9. Text-Based Q-A: IBM Watson. The query is posed in "Jeopardy!" format (including the category, "Musical Pastiche"); in "Jeopardy!", the answer is phrased as a question. IBM's text-based algorithms identified these phrases as the top potential "Jeopardy!" answers, with confidence scores displayed.
10. Tagged Text: Metacarta. Query: "Where was Elvis born?" The query identifies documents that contain "elvis", "born", and a location. The answers are literally all over the map: no consensus answer is obvious from the location clusters. The documents are recent news articles.
11. Logic-Based Q-A: Powerset. The answers involve multiple "Elvises". The source data is Wikipedia only.
12. Social Question-Answering: Vark. The query was routed to an unknown user in my "network" computed as likely to provide an answer; the answer was returned in less than a minute. Vark is optimized for the mobile environment and supports feedback on answers. Vark queries must exceed a certain length, hence this phrasing.
13. Comparison by Technology. Rows are STANAG requirements; columns are the technology families (Tables: Wolfram Alpha; Text: Google, IBM Watson; Tagged Text: Metacarta, Palantir; Logic Statements: Powerset; Teammates: Vark, Y! Answers).

Source:
- Wolfram Alpha: reference document title (no URL).
- Google / Watson / Metacarta, Palantir / Powerset: URL of the document in which the information appears (usually; not Watson). No further attempt to match the information to a source within the document (i.e., not "1,000 demonstrators, according to police").
- Vark, Y! Answers: the teammate is known, but may not say where the information originates.

Source Reliability:
- Wolfram Alpha: curated data (reference works, government data).
- Google / Watson / Metacarta, Palantir: centrality measures: Google PageRank (eigenvector centrality); Technorati Authority (inlink centrality); VIStology blogger authority (centrality + engagement).
- Powerset: curated data (Wikipedia; Wikipedia itself has a PageRank of 9 out of 10, i.e., reliable).
- Vark, Y! Answers: track record and reputation; votes on answers; longevity; number of answers.

Source Independence:
- Wolfram Alpha: no; one unified datastore.
- Google / Watson / Metacarta, Palantir: duplicate document detection; explicit source tracking (href, bit.ly); Leskovec meme tracking; SNA metrics of independence.
- Powerset: no; single data source.
- Vark, Y! Answers: user authentication.

Information Credibility:
- Wolfram Alpha: partial; integrity constraints; can't easily verify information.
- Google / Watson / Metacarta, Palantir: consensus answers (the same answer identified in multiple distinct sources).
- Powerset: could check integrity constraints, but URI co-reference is a problem; contradictions halt inference.
- Vark, Y! Answers: demonstrated area of expertise.
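The "consensus answers" criterion above, where the same answer is identified in multiple distinct sources, can be sketched as a vote over deduplicated (source, answer) pairs. The report identifiers and answers below are made up for illustration.

```python
from collections import Counter

def consensus_answer(extractions):
    """Pick the answer supported by the most *distinct* sources.

    extractions: list of (source_id, answer) pairs extracted from
    candidate documents. Each (source, answer) pair is counted once,
    so a single verbose source cannot vote twice; this approximates
    STANAG credibility rating 1, "confirmed by other sources".
    """
    votes = Counter(answer for source, answer in set(extractions))
    answer, support = votes.most_common(1)[0]
    return answer, support

reports = [
    ("report-17", "Tupelo, MS"),
    ("report-17", "Tupelo, MS"),   # repeat from the same source: one vote
    ("report-29", "Tupelo, MS"),
    ("report-35", "Memphis, TN"),
]
print(consensus_answer(reports))   # -> ('Tupelo, MS', 2)
```

Note the sketch treats source IDs as independent; as the Source Independence row indicates, real systems also need duplicate-document detection so that one origin republished twice does not masquerade as two sources.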
Editor's Notes
Wolfram Alpha identifies Tupelo as the place of Elvis' birth (Elvis disambiguated as Elvis Presley) and provides additional information on the city. Reference sources are listed by title only and are not easily checked.
Elvis is disambiguated as "Elvis Presley" by PageRank. The consensus answer is apparent by inspection, although the highest-ranking document doesn't contain the answer in its snippet.
A (partial) geographic overlay tied to a trailing-month archive of news articles. Matches are documents containing a location plus "Elvis" and "born". At least one document contains the correct answer, but there are many false hits, including an article about an Elvis-loving Episcopal priest in Alaska, and hits in France, Spain, Haiti, etc. Hits are somewhat denser around Tupelo, but not enough to indicate the answer clearly.
Elvis is disambiguated to several Elvises, with the birthplace highlighted in each by Powerset. Uses Wikipedia data only.
The question was routed to a self-identified Elvis expert (Elvis = Elvis Presley assumed) and the correct answer was returned in less than a minute. Feedback can be provided ("Was Gregory's answer helpful?": Yes / Kind of, but not for me / No). The question was phrased this way because questions have to be over a certain length.
In the comparison table, green means an automated solution exists; yellow means the solution is partial or not wholly automated (it requires human judgment); red means no automated solution exists.