Crowdsourcing. It’s been used for everything from handicapping the next election to making a kick-ass encyclopedia to building better movie recommendation tools for Netflix. America’s spy agencies are hoping it can help ‘em predict what’s going to happen on the world stage next. In other words, Washington’s next sharp-eyed intelligence analyst could be you. An early test of the system starts this week.
On Friday, Applied Research Associates, Inc. will launch the Aggregative Contingent Estimation System (ACES), a website that lets members of the public test out methods to crowdsource intelligence predictions. Funded by Iarpa, the intelligence community’s advanced research shop, ACES invites users to try their hand at making predictions and sharpening up their forecasting skills. The resulting data, ARA and Iarpa hope, will let spooks find out if the crowd can build a better crystal ball for the intel world.
“We’re trying to make good use of everybody’s individual opinions and trying to determine what aspects of them might be important and would lead to a good forecast,” says Dr. Dirk Warnaar, the principal investigator for the ACES project at Applied Research Associates.
The idea behind tapping into collective intelligence is simple: There are bits of useful information distributed among the members of diverse crowds, so aggregating their judgments should yield a better answer — better even than experts’ — to a particular question.
But ARA is looking to do something just a little different from other crowdsourcing efforts. While many similar tools assign equal weight to participants’ inputs, ACES will be looking for the most accurate predictors over time and weighting their judgments more heavily than other users.
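To make that weighting idea concrete, here’s a minimal sketch of one way a system could up-weight historically accurate forecasters. The scoring rule (an inverted Brier score) and all the names here are illustrative assumptions, not ARA’s actual algorithm, which hasn’t been published.

```python
# Hypothetical sketch: aggregate probability forecasts, weighting each
# forecaster by past accuracy. Accuracy is measured with the Brier score
# (mean squared error between stated probabilities and 0/1 outcomes);
# lower scores earn higher weights. This is NOT ARA's actual method.

def brier_score(probs, outcomes):
    """Mean squared error between forecast probabilities and outcomes."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def weighted_forecast(current_probs, past_records):
    """Combine current forecasts, weighting by historical skill.

    current_probs: {user: probability assigned to the open question}
    past_records:  {user: (past probabilities, matching 0/1 outcomes)}
    """
    weights = {}
    for user, (probs, outcomes) in past_records.items():
        # Invert the Brier score so more accurate users count for more.
        weights[user] = 1.0 / (brier_score(probs, outcomes) + 1e-6)
    total = sum(weights[u] for u in current_probs)
    return sum(current_probs[u] * weights[u] / total for u in current_probs)

history = {
    "alice": ([0.9, 0.8, 0.7], [1, 1, 1]),  # well-calibrated track record
    "bob":   ([0.9, 0.8, 0.7], [0, 0, 1]),  # chronically overconfident
}
# Alice says 70 percent, Bob says 30 percent; the aggregate leans
# toward Alice because her track record is stronger.
print(weighted_forecast({"alice": 0.7, "bob": 0.3}, history))
```

A plain average of the two forecasts would give 50 percent; the skill-weighted version lands well above that, pulled toward the historically better forecaster.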
Tools that tap into the smarts of the crowd have shown promise on a host of challenges over the years. Researchers have used Twitter buzz about upcoming movies to make pretty accurate predictions about box office sales on opening weekends. Tools like Ushahidi, nonprofit software that lets users map incidents, resources or people in the midst of crises, have helped rescuers find and save victims of disasters like Haiti’s earthquake.
ACES is a crowdsourcing tool somewhat similar to an online poll. People who sign up for the website will be asked whether an event in the fields of politics, economics, science, society or security will take place and what probability they assign to it. Their answers are then aggregated to see if the group produces an accurate prediction. It’s not a prediction market like the famous InTrade, which lets users bet on just about anything you can think of. No money changes hands on ACES — only opinions.
For ACES to be successful, it needs to attract a diverse pool of users and keep them engaged. But the online world is home to a lot of crowdsourcing tools and prediction markets, posing stiff competition for the marginal predictor’s attention. In addition to sites like InTrade, there’s a slew of other options available. The University of Iowa’s Iowa Electronic Markets has been around since 1988, letting users bet on presidential elections or Federal Reserve policy. Cinephiles use the Hollywood Stock Exchange website to bet on opening weekend box office hauls and other movie-related events.
Warnaar says he’s hoping some of ACES’s features designed to research analytical skills for the intelligence community will also prove interesting to the average user. “We’re thinking that people will be interested in competing with others and maybe learning how to become better forecasters.”
To accomplish that, ARA is working on a tool intended to help forecasters better calibrate their estimates of a prediction’s probability. For example, let’s say ACES finds out that when you say there’s a 60 percent chance an event will happen, your “60 percent” guesses actually come true closer to 40 percent of the time. The site will then clue you in to your serial over- or under-confidence, giving you feedback on your forecasting ability and allowing you to adjust accordingly.
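That kind of calibration feedback is straightforward to compute: bin a forecaster’s predictions by stated probability, then compare each bin to how often those events actually happened. The function below is a hypothetical sketch of the idea, not ACES’s actual feedback tool.

```python
# Hypothetical sketch of calibration feedback: group (stated probability,
# outcome) pairs into bins and report the observed hit rate per bin.
from collections import defaultdict

def calibration_report(predictions):
    """Map each stated probability (rounded to the nearest 10 percent)
    to the fraction of those predictions that actually came true."""
    bins = defaultdict(list)
    for prob, outcome in predictions:
        bins[round(prob, 1)].append(outcome)
    return {stated: sum(outs) / len(outs) for stated, outs in sorted(bins.items())}

# A forecaster who says "60 percent" but is right only 2 times in 5:
history = [(0.6, 1), (0.6, 0), (0.6, 0), (0.6, 1), (0.6, 0)]
print(calibration_report(history))  # {0.6: 0.4}
```

A gap like this one — saying 60 percent while hitting 40 — is exactly the over-confidence signal the article describes the site feeding back to users.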
Other features planned for the site would add a social dimension to the prediction process. There’s some evidence that information sharing among participants may undermine the accuracy of the crowd, turning a wise crowd into a dull mob. Warnaar says the jury is still out on the effect of social influence. “It’s not clear, for example, whether or not collaboration between participants helps or hurts.” Either way, he plans to find out, as ACES will experiment with features that let users work together on prediction challenges.
Just like with math problems, ACES wants you to show your work after making a prediction. ARA has yet to determine exactly how participants will elaborate and collaborate on predictions, but some approaches could involve having users write out the reasons behind a forecast, rank them, and let others vote on them.
“We think there’s a way of doing it this way that would basically give us the arguments for [a particular event] and split the problem apart in such a way that people start thinking about this problem a little deeper than they otherwise would,” says Warnaar.
Turning to the crowd has proven pretty popular with security types. In-Q-Tel, the CIA’s venture capital arm, recently began work on a prediction market aimed at forecasting computer security events. Darpa, the nerd cousin of Iarpa over at the Defense Department, tucked away some cash in its budget last year to farm intelligence, surveillance and reconnaissance data out to the crowd in search of better analysis. The Navy has even turned to crowdsourcing via online multiplayer games in order to hunt for better ideas against piracy.
So how would ACES be applied in the intelligence community? ARA’s press literature mentions National Intelligence Estimates (NIEs) as an area of potential application. NIEs represent the top-level assessment of America’s various spook agencies and senior-most analysts on a particular topic. Right now, ACES is still in the research phase, and any firm ideas on how it would be used would just be speculative, but it’s not hard to imagine potential uses. A validated ACES tool that polls intelligence analysts could complement an NIE, letting intelligence consumers gain a more complete picture of the intelligence community’s views on an issue.
But that’s only possible if ACES turns out to be accurate, something that’s yet to be proven. It’ll be a long while before answers are forthcoming — the project won’t be completed for another four years.