Total Information Awareness
Total Information Awareness (TIA) was a mass detection program by the United States Information Awareness Office. It operated under this title from February to May 2003 before being renamed Terrorism Information Awareness.
Based on the concept of predictive policing, TIA was meant to correlate detailed information about people in order to anticipate and prevent terrorist incidents before they could be carried out. The program modeled specific information sets in the hunt for terrorists around the globe. Admiral John Poindexter called it a "Manhattan Project for counter-terrorism". According to Senator Ron Wyden, TIA was the "biggest surveillance program in the history of the United States".
Congress defunded the Information Awareness Office in late 2003 after media reports criticized the government for attempting to establish "Total Information Awareness" over all citizens.
Although the program was formally suspended, other government agencies later adopted some of its software with only superficial changes. TIA's core architecture continued development under the code name "Basketball". According to a 2012 New York Times article, TIA's legacy was "quietly thriving" at the National Security Agency.
Program synopsis
TIA was intended to be a five-year research project by the Defense Advanced Research Projects Agency (DARPA). The goal was to integrate components from previous and new government intelligence and surveillance programs, including Genoa, Genoa II, Genisys, SSNA, EELD, WAE, TIDES, Communicator, HumanID, and Bio-Surveillance, with data mining knowledge gleaned from the private sector, to create a resource for the intelligence, counterintelligence, and law enforcement communities. These components consisted of information analysis, collaboration, decision-support tools, language translation, data-searching, pattern recognition, and privacy-protection technologies.
TIA research included or planned to include the participation of nine government entities: INSCOM, NSA, DIA, CIA, CIFA, STRATCOM, SOCOM, JFCOM, and JWAC. They were to be able to access TIA's programs through a series of dedicated nodes. INSCOM was to house TIA's hardware in Fort Belvoir, Virginia.
Companies contracted to work on TIA included the Science Applications International Corporation, Booz Allen Hamilton, Lockheed Martin Corporation, Schafer Corporation, SRS Technologies, Adroit Systems, CACI Dynamic Systems, ASI Systems International, and Syntek Technologies.
Universities enlisted to assist with research and development included Berkeley, Colorado State, Carnegie Mellon, Columbia, Cornell, Texas at Dallas, Georgia Tech, Maryland, MIT, and Southampton.
Mission
TIA's goal was to revolutionize the United States' ability to detect, classify, and identify foreign terrorists and decipher their plans, thereby enabling the U.S. to take timely action to preempt and disrupt terrorist activity.
To that end, TIA was to create a counter-terrorism information system that:
- Increased information coverage by an order of magnitude and afforded easy scaling
- Provided focused warnings within an hour after a triggering event occurred or an evidence threshold was passed
- Automatically cued analysts based on partial pattern matches and had patterns that covered 90% of all previously known foreign terrorist attacks (a toy sketch of partial matching follows this list)
- Supported collaboration, analytical reasoning, and information sharing so that analysts could hypothesize, test, and propose theories and mitigation strategies, and so that decision-makers could effectively evaluate the impact of policies and courses of action
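The "partial pattern match" requirement can be illustrated with a toy example. The sketch below assumes invented indicator sets and an arbitrary 60% coverage threshold; it is not based on TIA's actual patterns, which were never made public.

```python
# Toy sketch of analyst cueing via partial pattern matching. The patterns and
# the 0.6 coverage threshold are invented assumptions, not TIA specifics.

PATTERNS = {
    "pattern_a": {"large_cash_withdrawal", "one_way_ticket", "visa_overstay"},
    "pattern_b": {"chemical_purchase", "truck_rental", "storage_unit_lease"},
}

def cue_analyst(observed: set[str], threshold: float = 0.6) -> list[str]:
    """Return names of patterns whose indicators are sufficiently covered."""
    cued = []
    for name, indicators in PATTERNS.items():
        coverage = len(observed & indicators) / len(indicators)
        if coverage >= threshold:  # a partial match is enough to cue review
            cued.append(name)
    return cued

print(cue_analyst({"truck_rental", "chemical_purchase", "parking_ticket"}))
# ['pattern_b'] -- two of three indicators covered (~0.67)
```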
Components
Genoa
Unlike the other program components, Genoa predated TIA and provided a basis for it. Genoa's primary function was intelligence analysis to assist human analysts. It was designed to support both top-down and bottom-up approaches: a policymaker could hypothesize an attack and use Genoa to look for supporting evidence of it, or analysts could compile pieces of intelligence into a diagram from which the system suggested possible outcomes. Human analysts could then modify the diagram to test various cases.
Genoa was independently commissioned in 1996 and completed in 2002 as scheduled.
Genoa II
While Genoa primarily focused on intelligence analysis, Genoa II aimed to provide means by which computers, software agents, policymakers, and field operatives could collaborate.
Genisys
Genisys aimed to develop technologies that would enable "ultra-large, all-source information repositories". Vast amounts of information were to be collected and analyzed, and the database technology available at the time was insufficient for storing and organizing such enormous quantities of data, so Genisys developed techniques for virtual data aggregation to support effective analysis across heterogeneous databases, as well as unstructured public data sources such as the World Wide Web. "Effective analysis across heterogeneous databases" means the ability to draw together information from databases designed to store different types of data, such as a database containing criminal records, a phone-call database, and a foreign-intelligence database. The Web is considered an "unstructured public data source" because it is publicly accessible and contains many different types of data, such as blogs, emails, and records of visits to websites, all of which need to be analyzed and stored efficiently.
Another goal was to develop "a large, distributed system architecture for managing the huge volume of raw data input, analysis results, and feedback, that will result in a simpler, more flexible data store that performs well and allows us to retain important data indefinitely".
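A minimal sketch of the virtual data aggregation idea, using three invented in-memory "databases" with different schemas. A real federated system would wrap live databases in query adapters, but the principle of answering one query across heterogeneous sources without physically merging them is the same.

```python
# Minimal sketch of "virtual data aggregation": each differently-structured
# source answers a query in a shared vocabulary instead of being copied into
# one warehouse. The sources, field names, and records are invented.

criminal_records = [{"name": "J. Doe", "offense": "fraud", "year": 1999}]
call_logs = [{"caller": "J. Doe", "callee": "A. Smith", "minutes": 12}]
intel_reports = [{"subject": "A. Smith", "summary": "seen at border crossing"}]

def query_person(name: str) -> list[dict]:
    """Query all sources for one person without physically merging them."""
    hits = []
    hits += [{"source": "criminal", **r} for r in criminal_records
             if r["name"] == name]
    hits += [{"source": "calls", **r} for r in call_logs
             if name in (r["caller"], r["callee"])]
    hits += [{"source": "intel", **r} for r in intel_reports
             if r["subject"] == name]
    return hits

for hit in query_person("A. Smith"):
    print(hit)
```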
Scalable social network analysis
Scalable social network analysis aimed to develop techniques based on social network analysis to model the key characteristics of terrorist groups and discriminate them from other societal groups.
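A toy illustration of the underlying idea: simple structural features, such as edge density, can distinguish a tightly knit social group from a sparse, compartmented network. The edge lists and the interpretation below are invented for illustration, not drawn from SSNA.

```python
# Sketch of the SSNA idea: compare a simple structural feature (edge density)
# across two hypothetical communication graphs. The toy premise is that a
# covert cell tends to be sparse and chain-like, unlike an ordinary group.

def density(edges: set[frozenset], n: int) -> float:
    """Fraction of possible undirected edges that are present."""
    return len(edges) / (n * (n - 1) / 2)

club = {frozenset(p) for p in [(1, 2), (1, 3), (2, 3), (2, 4), (3, 4), (1, 4)]}
cell = {frozenset(p) for p in [(1, 2), (2, 3), (3, 4)]}  # chain-like structure

print(f"club density: {density(club, 4):.2f}")  # 1.00 -- everyone knows everyone
print(f"cell density: {density(cell, 4):.2f}")  # 0.50 -- sparse, compartmented
```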
Evidence extraction and link discovery
Evidence extraction and link discovery developed technologies and tools for the automated discovery, extraction, and linking of sparse evidence contained in large amounts of classified and unclassified data sources. EELD was meant to build systems able to extract data from multiple sources and to detect patterns comprising multiple types of links between data items or communications. It was designed to link items relating to potential "terrorist" groups and scenarios, and to learn the patterns of different groups or scenarios in order to identify new organizations and emerging threats.
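The linking step can be sketched as follows, under invented records and link types: typed links are extracted from records originating in different sources, and entity pairs connected by more than one kind of link are flagged for review.

```python
# Toy sketch of link discovery: extract typed links from records drawn from
# different sources, then flag pairs of entities connected by more than one
# kind of link. Sources, link types, and records are invented assumptions.

from collections import defaultdict

records = [
    ("alice", "bob", "phone_call"),     # from a call database
    ("alice", "bob", "wire_transfer"),  # from a financial database
    ("bob", "carol", "phone_call"),
]

link_types = defaultdict(set)
for a, b, kind in records:
    link_types[frozenset((a, b))].add(kind)

# Pairs linked through multiple evidence types are candidates for review.
for pair, kinds in link_types.items():
    if len(kinds) > 1:
        print(sorted(pair), sorted(kinds))
        # ['alice', 'bob'] ['phone_call', 'wire_transfer']
```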
Wargaming the asymmetric environment
Wargaming the asymmetric environment focused on developing automated technology that could identify predictive indicators of terrorist activity or impending attacks by examining individual and group behavior in broad environmental context and the motivation of specific terrorists.
Translingual information detection, extraction and summarization
Translingual information detection, extraction and summarization developed advanced language processing technology to enable English speakers to find and interpret critical information in multiple languages without requiring knowledge of those languages.
Outside groups were invited to participate in the annual information retrieval, topic detection and tracking, automatic content extraction, and machine translation evaluations run by NIST. Cornell University, Columbia University, and the University of California, Berkeley were given grants to work on TIDES.
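At its simplest, translingual detection can be approximated by expanding an English query through a bilingual lexicon, as in the sketch below. The two-entry lexicon and sample sentence are invented, and TIDES itself relied on far more sophisticated statistical methods.

```python
# Drastically simplified sketch of translingual detection: an English query
# term is expanded through a bilingual lexicon so that foreign-language text
# can be flagged without the analyst knowing the language. The lexicon and
# the sample sentence are invented for illustration.

lexicon = {"weapon": ["arma", "waffe"], "border": ["frontera", "grenze"]}

def flag_document(text: str, query: str) -> bool:
    """Return True if any translation of the English query appears in text."""
    tokens = text.lower().split()
    return any(term in tokens for term in lexicon.get(query, []))

print(flag_document("cruzaron la frontera anoche", "border"))  # True
```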
Communicator
Communicator was to develop "dialogue interaction" technology to enable warfighters to talk to computers, such that information would be accessible on the battlefield or in command centers without a keyboard-based interface. Communicator was to be wireless, mobile, and able to function in a networked environment.
The dialogue interaction software was to interpret the context of a dialogue to improve performance, and to adapt automatically to new topics so that conversation could be natural and efficient. Communicator emphasized task knowledge to compensate for natural-language effects and noisy environments. Unlike automated translation of natural-language speech, which is far more complex because of an essentially unlimited vocabulary and grammar, Communicator took on task-specific problems with constrained vocabularies. Research was also started on foreign-language computer interaction for use in coalition operations.
Live exercises were conducted involving small unit logistics operations with the United States Marines to test the technology in extreme environments.
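The value of a constrained, task-specific vocabulary can be shown with a toy parser: with a small fixed grammar, noisy or partial input can still resolve to a command. The grammar below is an invented logistics example, not the actual Communicator grammar.

```python
# Sketch of constrained-vocabulary dialogue: a small fixed task grammar lets
# noisy or partial input still resolve to a command. The grammar and keyword
# set are invented examples, not Communicator's actual design.

GRAMMAR = {
    ("request", "resupply"): "RESUPPLY_REQUEST",
    ("report", "position"): "POSITION_REPORT",
}
KEYWORDS = {"request", "resupply", "report", "position"}

def parse_utterance(words: list[str]) -> str | None:
    """Keep only in-grammar keywords, then match against the task grammar."""
    keywords = tuple(w for w in words if w in KEYWORDS)
    return GRAMMAR.get(keywords)

print(parse_utterance("uh request immediate resupply over".split()))
# RESUPPLY_REQUEST -- filler words are ignored by the keyword filter
```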
Human identification at a distance
The human identification at a distance (HumanID) project developed automated biometric identification technologies to detect, recognize and identify humans at great distances for "force protection", crime prevention, and "homeland security/defense" purposes.
The goals of HumanID were to:
- Develop algorithms to find and acquire subjects out to 150 meters in range.
- Fuse face and gait recognition into a 24/7 human identification system (a score-fusion sketch follows this list).
- Develop and demonstrate a human identification system that operates out to 150 meters using visible imagery.
- Develop a low-power millimeter wave radar system for wide field of view detection and narrow field of view gait classification.
- Characterize gait performance from video for human identification at a distance.
- Develop a multi-spectral infrared and visible face recognition system.
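The fusion goal can be sketched as score-level fusion: each modality produces a match score, and a weighted combination yields one decision even when a modality drops out at long range. The weights and values below are illustrative assumptions, not HumanID parameters.

```python
# Minimal sketch of score-level fusion of face and gait recognition. The
# weights, scores, and threshold behavior are invented assumptions.

def fused_score(face: float | None, gait: float | None) -> float:
    """Weighted score-level fusion; a missing modality simply drops out."""
    scores = [(0.7, face), (0.3, gait)]  # face weighted higher when available
    available = [(w, s) for w, s in scores if s is not None]
    total_weight = sum(w for w, _ in available)
    return sum(w * s for w, s in available) / total_weight

# At 150 m the face may be unresolvable, so identification rests on gait alone.
print(fused_score(face=0.9, gait=0.6))   # 0.81
print(fused_score(face=None, gait=0.6))  # 0.6
```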
Carnegie Mellon University's Robotics Institute worked on dynamic face recognition. The research focused primarily on the extraction of body biometric features from video and identifying subjects from those features. To conduct its studies, the university created databases of synchronized multi-camera video sequences of body motion, human faces under a wide range of imaging conditions, action-unit (AU) coded expression videos, and hyperspectral and polarimetric images of faces. The video sequences of body motion data consisted of six separate viewpoints of 25 subjects walking on a treadmill. Four separate 11-second gaits were tested for each: slow walk, fast walk, inclined, and carrying a ball.
The University of Maryland's Institute for Advanced Computer Studies' research focused on recognizing people at a distance by gait and face. Infrared and five-degree-of-freedom cameras were also to be used. Tests included filming 38 male and 6 female subjects of different ethnicities and physical features walking along a T-shaped path from various angles.
The University of Southampton's Department of Electronics and Computer Science was developing an "automatic gait recognition" system and was in charge of compiling a database to test it. The University of Texas at Dallas was compiling a database to test facial recognition systems. The data included a set of nine static pictures taken from different viewpoints, a video of each subject looking around a room, a video of the subject speaking, and one or more videos of the subject showing facial expressions. Colorado State University developed multiple systems for identification via facial recognition. Columbia University participated in implementing HumanID in poor weather.