Lead Cognitive Scientist
Raytheon, Intelligence and Information Systems
Donald Kretz has more than 30 years of experience in information technology research and development, military applications, and intelligence analysis. He has served as a Principal Investigator in the IIS Analytics Technology Center, leading research in areas such as collective identity resolution; passenger risk assessment for airline security; cognitive "debiasing" techniques for intelligence analysis; natural language and semantics; automated tools for human profiling, matching, and exploitation; and enhanced pattern classification for recognizing and labeling patterns of human activity. He is currently a Principal Research Associate at Raytheon, supporting the User-Centric Analytics Grand Challenge. Don holds a B.A. in Psychology from the University of Texas at San Antonio and an M.S. in Applied Cognition and Neuroscience from the University of Texas at Dallas, where he is currently pursuing his Ph.D. in Cognition and Neuroscience.
Q: How does your background support Raytheon's research into the analyst tradecraft?
A: Prior to joining Raytheon as lead cognitive scientist in the IIS Analytics Technology Center, I served as a military and intelligence analyst – analyzing communication networks, predicting and warning of regional tension and conflict, and providing threat assessments for the White House and the U.S. Congress. After working as a lead engineer for DARPA and U.S. military customers, I wanted to find another way to help the intelligence community. I joined Raytheon to contribute to the analyst tradecraft and further empower analysts. Raytheon's role in advancing intelligence capabilities is important to helping our leaders make critical decisions.
Q: What is the User-Centric Analytics Grand Challenge and what's your role?
A: The User-Centric Analytics Grand Challenge is Raytheon's contribution to improving analytic efficiency and quality. One important component of it is a series of cognitive analytic gaming exercises. We are using scenario-based simulations to assess the analytic process and recommend alternate approaches to ultimately help analysts improve the quality of their intelligence products.
After Sept. 11, 2001, a body of studies and reports addressed the intelligence failures. Congress passed the Intelligence Reform and Terrorism Prevention Act of 2004 to reform the intelligence community and intelligence-related activities, in particular mandating that the community adopt the use of "alternative analyses." Not a lot of progress has been made since then. There have been some relatively small, disconnected projects, but not much has made it into the hands of working analysts. In my role on the Grand Challenge, I am designing exercises to better understand how analysts think, then developing ways to overcome some of the human limitations imposed by our brains. I'm not looking for ways to constrain their thinking, but rather to open up their thought processes in ways that lead them to better judgments.
Until now, there's been very little research into how analysts solve problems and make decisions, and none of it has achieved any measurable success in overcoming cognitive bias. What are we doing to address the human process? This is what inspired me to pursue my Ph.D. in Cognition and Neuroscience.
Q: Why do you think this research is important?
A: It's important because, in the end, it will benefit not just Raytheon, through improved capability development and closer customer relationships, but the intelligence community as a whole. This is a national security issue that requires immediate attention.
Another piece to this is studying cognitive bias. All of us humans, including analysts, subconsciously apply decision heuristics, or mental shortcuts, in order to reduce complex problems to simpler ones. While these heuristics ease our "cognitive load" and often produce accurate judgments, they sometimes result in judgment biases, or errors caused by incomplete mental processing. The good news is that some of the negative effects of judgment biases can be reduced through focused corrective measures that we call analytic multipliers. These are sort of like "cognitive interventions," designed to improve the quality and accuracy of our decisions by encouraging us to think more deeply or evaluate alternative explanations. The objective of understanding analysts' biases through analytic decision games is to discover ways to prevent intelligence analysis errors.
Cognitive biases are impairing the judgment of analysts, and they don't even know it. Furthermore, just telling them about the problem isn't enough to fix it. That's why we're trying to address these biases. But before we offer solutions, we have to understand the analysts, and we can't get the information we need just by talking to them or even watching them. We need data that tells us how they think, and then we need to involve them in the design and evaluation of solutions. If we can do that, we'll really make a difference.