Use of SAT can aid foreign affairs analysis in dealing with two major sources of analytic error: cognitive biases and random noise.
Let’s start with the issue of information versus noise.[v]
Data is toxic to analytic judgments even in moderate quantity. It has been shown empirically that more data does not result in better estimates; it only raises an analyst's confidence in the accuracy of their own conclusions. (For details, look up Paul Slovic's study of racetrack handicappers.)
Searching for more information tends to contaminate analysis with more noise. Continuously updating a hypothesis burns up a tremendous amount of time while bringing little, if any, increase in accuracy, and it seriously risks letting analytic reasoning be hijacked by confirmation bias.
Most of the data signals surrounding an analyst and continuously assaulting the mind are not information but precisely noise – data that is random, untrue, fake, irrelevant, incomplete or meaningless.
Since we consider our own judgments prized possessions, they turn sticky. More random noise will only make one more confident in an initial hypothesis formed on the basis of vague and incomplete information.
A typical collection error is the ambition to monitor data sources continuously. The more frequently one looks at the data, the lower the signal-to-noise ratio tends to become. With time, noise prevails over evidence and skews analytic reasoning; analysis degenerates into fantasy.
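The claim that more frequent observation lowers the signal-to-noise ratio has a simple statistical basis: over a short interval, a true trend (the signal) scales with the length of the interval, while random variation (the noise) scales only with its square root. A minimal simulation sketches the effect – all parameters here are hypothetical, chosen purely for illustration:

```python
import random

random.seed(42)

def sign_match_fraction(mu, sigma, dt, n_obs):
    """Fraction of sampled increments whose sign agrees with the true drift.

    Each increment over an interval dt has mean mu * dt (the signal) and
    standard deviation sigma * sqrt(dt) (the noise), so the signal-to-noise
    ratio shrinks as dt shrinks: frequent sampling sees mostly noise.
    """
    hits = 0
    for _ in range(n_obs):
        increment = random.gauss(mu * dt, sigma * dt ** 0.5)
        hits += increment > 0
    return hits / n_obs

mu, sigma = 0.05, 0.2          # hypothetical annual drift and volatility
daily = sign_match_fraction(mu, sigma, 1 / 252, 10_000)
yearly = sign_match_fraction(mu, sigma, 1.0, 10_000)
print(f"daily observations agreeing with the drift:  {daily:.2f}")
print(f"yearly observations agreeing with the drift: {yearly:.2f}")
```

Under these illustrative numbers, daily observations agree with the true trend only slightly better than a coin flip, while yearly observations reflect it far more reliably: an analyst refreshing the feed every day is mostly watching noise.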
It turns out that standard collection practices can encourage and even REQUIRE analysts to AIM AT CAPTURING NOISE. The result is predictable: junk in – junk out.
To arrive at a particular judgment, analysts do not need more data. What they need is to focus and to filter out the noise that is hijacking their reasoning.
If my objective is to cross the street, I shall focus solely on the traffic lights and the very real possibility of some madman jumping them. I should, at least for the duration of that endeavour, forget about the shapely figures of fellow pedestrians, an untied shoelace, the need to balance a melting scoop of ice cream in its cone, or a friend madly signalling from the other side of the street.
Those who learned to do that with confidence survived the crossing and passed on their genes. With a bit of training and a bit of practice, this inherited life skill can be carried over into the analytic domain.
As a species, humans tend to be demonstrably arrogant about what they think they know: they believe they know a bit more than they actually do. While overestimating what they know, people simultaneously underestimate uncertainty. Overconfidence can have a toxic effect on analysts, motivating them to adopt far riskier positions than the available information warrants. Analyst overconfidence results in blindness to errors and is a primary cause of the most epic intelligence failures.
Overconfidence is often born of disregard for the uncertainty inherent in most analytic problems. Probability is a non-intuitive concept – it cannot easily be derived from everyday experience. We will study it in some depth, but here is a quick primer on the subject.
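As one taste of how non-intuitive probability can be, consider the classic base-rate puzzle, here in a short sketch. The scenario, its numbers, and the `posterior` helper are hypothetical illustrations, not part of the primer itself:

```python
# A source is right 80% of the time, but the event it reports on occurs
# in only 5% of comparable situations. Given the report, how likely is
# the event? Intuition says "about 80%"; Bayes' theorem says otherwise.

def posterior(prior, true_positive_rate, false_positive_rate):
    """P(event | report) via Bayes' theorem."""
    p_report = (true_positive_rate * prior
                + false_positive_rate * (1 - prior))
    return true_positive_rate * prior / p_report

p = posterior(prior=0.05, true_positive_rate=0.8, false_positive_rate=0.2)
print(f"P(event | report) = {p:.2f}")   # roughly 0.17, not 0.80
```

The low base rate drags the answer down to about 17 percent: most positive reports come from the far larger pool of situations where nothing is happening. Everyday experience offers no such correction, which is why probability has to be studied rather than intuited.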