Study gauges overconfidence in national security officials

National security officials, despite their expertise and access to sensitive information, exhibit significant overconfidence in their judgments about global affairs, a large-scale study has revealed. The research, involving nearly 2,000 professionals, found that this cognitive bias is pervasive, consistent across demographics, and can lead to substantial errors in assessing uncertainty, with potentially high-stakes consequences for international policy and military decisions.

The study, led by Dr. Jeffrey Friedman, an associate professor at Dartmouth College, collected tens of thousands of judgments from officials on military, political, and economic questions. The findings suggest that this overconfidence is not an unchangeable trait but a learned habit that can be corrected. Even minimal training and structured feedback were shown to improve the accuracy of officials’ judgments, highlighting a clear path toward better-calibrated decision-making within the national security community.

Research Methodology and Scale

The study was notable for its extensive scale and access to a typically hard-to-reach group of professionals. Friedman’s research was conducted in partnership with military education institutions, which integrated online surveys into their core curricula, leading to participation rates exceeding 90% in most cases. The project gathered more than 60,000 assessments from nearly 2,000 national security officials from the United States and other NATO allies and partners.

Participants were presented with surveys containing a rotating set of more than 250 unique questions. These questions covered a wide range of topics in international affairs. Some required assessments of current facts that could be verified, while others involved forecasts of future events that would be evaluated months or even years later. Officials were asked to state their confidence in their answers using either precise percentages or qualitative descriptions like “likely” or “almost certain,” allowing researchers to measure the gap between their perceived certainty and their actual accuracy.

A Widespread Cognitive Bias

A primary finding of the research is that overconfidence is a remarkably consistent bias among national security professionals, transcending institutional and cultural boundaries. The tendency appeared at similar levels among military officers and civilian officials, among men and women, and among American and non-American participants alike, indicating that the cognitive flaw is not specific to any one group or nation.

This research contributes to the ongoing debate about whether elite decision-makers are more rational than the general public. While some theories suggest that the high-stakes environment of national security should foster more accurate assessments, this study’s data indicates otherwise. National security officials exhibited the same overconfidence bias seen in the general population. In fact, they were found to be even more overconfident than the participants in the Good Judgment Project, a large-scale forecasting tournament that recruited ordinary citizens.

The Data on Miscalibration

The study’s quantitative results reveal a significant disconnect between confidence and accuracy. When national security officials were completely certain they had the correct answer—assigning a 100% or 0% probability to a statement—they were wrong 25% of the time. This demonstrates a substantial gap between their feeling of certainty and the reality of the situation. The problem persisted with qualitative expressions of confidence as well; when officials described an outcome as “almost certain,” their judgments were incorrect 32% of the time.

The miscalibration was also evident at other levels of confidence. For example, when officials believed a statement had a 90% chance of being true, it was actually true only 57% of the time. Conversely, when they assigned a low probability, such as a 10% chance of being true, the statement turned out to be correct 32% of the time. This indicates a general tendency to overestimate their knowledge and assign too much certainty to their beliefs. The research also uncovered that officials were more prone to false positives, meaning they were more likely to believe something was true when it was not.
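Miscalibration of this kind is straightforward to measure: group judgments by stated confidence and compare each group’s observed hit rate with that confidence. The minimal Python sketch below is illustrative only; the sample data merely echoes the headline figures reported above and is not the study’s dataset or code.

```python
from collections import defaultdict

def calibration_table(judgments):
    """judgments: (stated_probability, outcome) pairs, where outcome is
    1 if the statement turned out true and 0 otherwise. Returns the
    observed hit rate at each stated confidence level."""
    bins = defaultdict(list)
    for p, outcome in judgments:
        bins[p].append(outcome)
    return {p: sum(ys) / len(ys) for p, ys in bins.items()}

# Illustrative data echoing the figures above: statements rated 90% likely
# were true 57% of the time; those rated 10% likely were true 32% of the time.
data = ([(0.9, 1)] * 57 + [(0.9, 0)] * 43 +
        [(0.1, 1)] * 32 + [(0.1, 0)] * 68)
table = calibration_table(data)
# A perfectly calibrated judge would have table[p] equal to p at every level.
```

For a well-calibrated judge, the gap between each stated probability and the corresponding hit rate would be near zero; the gaps here (0.90 vs. 0.57, 0.10 vs. 0.32) are the signature of overconfidence in both directions.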

Psychological Drivers of Overconfidence

The study suggests that this overconfidence stems from common cognitive biases rather than a lack of expertise. While the officials demonstrated genuine knowledge—when they believed one outcome was more likely than another, they were generally correct in a relative sense—they consistently overestimated the precision of that knowledge. According to the data, 96% of the participants would have achieved better accuracy if they had simply made every judgment with less certainty.
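The 96% figure can be illustrated with a proper scoring rule such as the Brier score (the mean squared error of a probability forecast, where lower is better). The sketch below uses the study’s reported 90%-confidence hit rate of 57%; the hedged value of 0.70 is an arbitrary illustration of “less certainty,” not a figure from the study.

```python
def brier(p, outcomes):
    """Mean squared error of forecasting probability p for each outcome
    (1 = statement true, 0 = false). Lower scores are better."""
    return sum((p - y) ** 2 for y in outcomes) / len(outcomes)

# Statements rated 90% likely were true only 57% of the time (study figure).
outcomes = [1] * 57 + [0] * 43

stated = brier(0.90, outcomes)   # score at the officials' stated confidence
hedged = brier(0.70, outcomes)   # same belief direction, less certainty
# hedged < stated: pulling the forecast toward 50% improves the score
# whenever stated confidence overshoots the true hit rate this badly.
```

This is why hedging helps an overconfident forecaster: the direction of the belief is preserved, but the penalty for the frequent misses shrinks faster than the reward for the hits grows.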

A key reason for this persistent bias is the lack of structured feedback in the national security field. It is difficult to assess the accuracy of any single judgment about an uncertain event. Furthermore, there is a natural human tendency to attribute successes to skill while blaming failures on external factors, such as poorly worded questions or unpredictable events. This makes it difficult for even successful professionals to recognize the gaps in their judgment without direct, clear feedback that demonstrates their biases.

Improving Judgment and Decision-Making

Despite the sobering findings, the research offers a positive outlook. The study found that even a brief, two-minute training session could significantly improve the accuracy of officials’ judgments. This suggests that overconfidence is a malleable habit that can be addressed through targeted interventions. By providing officials with structured feedback and tools to better calibrate their assessments of uncertainty, organizations can foster more reliable decision-making.

The ultimate goal is not to discourage officials from making assessments but to help them align their confidence levels with their actual knowledge. The expertise of national security professionals is valuable and necessary, but its effectiveness is diminished by overconfidence. By implementing systems of feedback and training, the national security community can help its members provide more accurate and nuanced judgments, leading to more prudent and effective policy choices in a complex and uncertain world.
