Even as science and technology carve paths to understanding our world at an unprecedented pace, more and more people are eager to reject the evidence placed before them.
In the age of ‘fake news’ and pay-to-publish research journals, it’s understandable why so many are cynical. No matter where your beliefs lie, knowing who and what to trust is not (for better or worse) as easy as it once was.
This corruption of credibility, and the lack of clarity that results, are problems not just for the general public but for the scientific community as well. That’s why DARPA – the US Department of Defense’s emerging technology arm – has put out a call for ideas on how to develop “(semi)automated capabilities to assign ‘Confidence Levels’ to specific studies, claims, hypotheses, conclusions, models, and/or theories found in social and behavioural science research”.
Simply put, they want to develop a bullshit detector, one that will put a definitive stamp of proof on social science research, or throw it in the proverbial bin.
Like all DARPA projects, this one is tailored to military purposes. Understanding the psychology of the battlefield, how extremism is fostered, or how to design an army in which human soldiers and robots coordinate effectively: these are the agency’s primary concerns. That said, findings on such topics have far-reaching applications.
Take, for instance, propaganda. We know it works, but ask a dozen social scientists why it works and you’ll receive a dozen different papers based on a dozen different theories. Where do you start? Where do you end? What do you choose to believe, and why?
DARPA’s system would, on paper, be able to apply a range of factors to each theory in order to determine which is most scientifically sound. The result would clarify what makes for the most effective propaganda, not just in wartime but in everything from advertising to political speeches.
If these results are presented in a way that non-experts can understand, people will find it easier to recognise when others are using this science to manipulate them.
It sounds like an ideal way to educate society while simultaneously restoring trust in science, but DARPA faces some serious challenges.
The first is developing a benchmark for credible science.
Brian Nosek is the head of the Center for Open Science at the University of Virginia. In an interview with Wired, he explains that one route may be a competition in which people who have developed credibility-assessing models test them against a shared set of chosen studies. “The only way to develop confidence in the evidence is to look at the problem in lots of different ways and see where you start to get convergence,” Nosek says.
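The convergence idea can be made concrete with a small sketch. Everything below is hypothetical – the study names, the scores, and the 0.15 agreement threshold are invented for illustration – but it shows the basic logic: trust a credibility rating only when several independent models land in roughly the same place.

```python
from statistics import mean, stdev

# Hypothetical credibility scores (0-1) that three independent
# models might assign to the same three studies.
model_scores = {
    "study_a": [0.82, 0.79, 0.85],  # models agree: high credibility
    "study_b": [0.90, 0.35, 0.60],  # models diverge: no convergence
    "study_c": [0.20, 0.25, 0.18],  # models agree: low credibility
}

def convergence_report(scores, max_spread=0.15):
    """Label a study 'converged' only when independent models agree,
    i.e. when the spread (sample standard deviation) of their scores
    stays below a chosen threshold."""
    report = {}
    for study, vals in scores.items():
        report[study] = {
            "mean_score": round(mean(vals), 2),
            "converged": stdev(vals) <= max_spread,
        }
    return report

print(convergence_report(model_scores))
```

Note that convergence and credibility are separate questions here: the models agree that `study_c` is weak just as firmly as they agree that `study_a` is strong, while `study_b` simply cannot be rated with confidence either way.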
This would go some way in eliminating the second concern: subjectivity.
In recent years, distrust has been growing within the scientific community itself. Concerns about ‘p-hacking’ – manipulating experiments or analyses until they appear to support a specific theory – have been compounded by the replication crisis, sparked by revelations that more than two-thirds of published studies can’t be replicated by the authors’ peers. Deciding who to trust to decide what should be trusted may prove as challenging as designing the system itself.
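Why p-hacking works is easy to demonstrate with a simulation (a toy illustration, not an account of any particular study). Suppose a researcher studies pure noise – fair coin flips – but quietly tests twenty different outcomes and reports whichever one crosses the significance line. Getting 60 or more heads in 100 fair flips has a probability of roughly 3% under the null, so any single test rarely “succeeds”; across twenty tests, a spurious finding becomes common.

```python
import random

random.seed(42)  # fixed seed so the simulation is reproducible

def one_null_experiment(n_flips=100):
    """A single 'study' on pure noise: count heads in fair coin flips.
    Seeing >= 60 heads in 100 flips occurs with probability ~3%
    under the null, i.e. it would look like a p < 0.05 result."""
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    return heads >= 60

def p_hacked(n_outcomes=20):
    """A 'researcher' who tests 20 outcomes and reports any hit."""
    return any(one_null_experiment() for _ in range(n_outcomes))

trials = 1000
false_positive_rate = sum(p_hacked() for _ in range(trials)) / trials
print(f"Researchers finding a 'significant' effect in pure noise: "
      f"{false_positive_rate:.0%}")
```

With a ~3% chance per test, the expected rate of at least one hit in twenty tests is about 1 − 0.97²⁰ ≈ 44% – nearly half of these imaginary researchers would have something “significant” to publish from data containing no effect at all.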
Finally, the most obvious issue DARPA faces is working out how the system will actually work.
On what basis does it rank validity in relation to what publication the research first appeared in? How would it know if a paper received funding from a source that may lend bias to the result? And how much control will the human operators of the system have?
Like so much of the future tech being discussed today, the truth is nobody knows. Still, if any organisation can bring such a system to fruition, it’s DARPA. As former Director Dr. Regina Dugan put it, “first impossible, then improbable, eventually inevitable”.
Such a system may still be years away, but DARPA’s call-out should be seen as a beacon of hope: hope that a time in which it’s increasingly difficult to tell fact from fabrication will soon come to an end, and that truth will prevail and make us a better society.