IARPA has released the Crowdsourcing, Evidence, Argumentation, Thinking, and Evaluation (CREATE) BAA, IARPA-BAA-15-11. The CREATE program is seeking proposals to “develop, and experimentally test, systems that use crowdsourcing and structured analytic techniques (STs) to improve analytic reasoning.”
STs “help people better understand the evidence and assumptions that support—or conflict with—conclusions. Secondarily, they will also help users better communicate their reasoning and conclusions. STs hold promise for increasing the logical rigor and transparency of analysis. They can help reveal underlying logic and identify unstated assumptions. Yet they are not widely used in the Intelligence Community or elsewhere—possibly because current versions are cumbersome or require too much time. Crowdsourcing has the potential to solve these problems by dividing the labor, allowing dispersed groups of analysts to contribute information and ideas where they have comparative advantages. Crowdsourcing can help analysts identify and understand alternative hypotheses, arguments, and points of view. Crowdsourcing of structured techniques may facilitate rational deliberation by integrating different perspectives, so that analysis can effectively benefit from ‘crowd wisdom’.”
The program will have three phases:
- Phase 1 will develop and experimentally test one or more STs that are amenable to crowdsourcing on constrained problems. They may be, but need not be, crowdsourced during Phase 1.
- Phase 2 will develop and experimentally test crowdsourced versions of one or more Phase 1 STs on constrained problems of increasing complexity.
- Phase 3 will develop and experimentally test crowdsourced STs on constrained and unconstrained problems similar in complexity to those encountered by IC analysts.
The Program Manager is Steve Rieber. Proposals are due April 16, 2016. Multiple awards are anticipated.