Automatic Claim Review for Climate Science via Explanation Generation

There is consensus in the scientific community about human-induced climate change. Despite this, the web is awash with claims rooted in climate change scepticism, driving the need to fact check them while also providing an explanation and justification for each verdict. Scientists and experts have been addressing this by writing manual feedback for such claims. In this paper, we aim to assist them by automatically generating an explanation for a claim's predicted veracity label, adapting the Fusion-in-Decoder approach from open-domain question answering, in which the generator is augmented with supporting passages retrieved from an external knowledge source. We experiment with different knowledge sources, retrievers, and retrieval depths, and demonstrate that even a small number of high-quality manually written explanations can help us generate good explanations.
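
The sketch below is not the authors' code; it is a minimal illustration of the Fusion-in-Decoder pattern the abstract refers to, assuming a T5 backbone from the Hugging Face transformers library. The claim, the retrieved passages, and the model name are all hypothetical placeholders: each (claim, passage) pair is encoded independently, the encoder outputs are concatenated, and a single decoder attends over all of them to generate the explanation.

import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration
from transformers.modeling_outputs import BaseModelOutput

tokenizer = AutoTokenizer.from_pretrained("t5-small")   # illustrative backbone
model = T5ForConditionalGeneration.from_pretrained("t5-small")

claim = "Global warming stopped in 1998."               # hypothetical input claim
passages = [                                            # hypothetical retrieved evidence
    "Surface temperature records show continued warming after 1998 ...",
    "1998 was an unusually strong El Nino year, which temporarily ...",
]

# Encode each (claim, passage) pair independently.
encoder = model.get_encoder()
encoder_states, attention_masks = [], []
for passage in passages:
    inputs = tokenizer(f"claim: {claim} passage: {passage}",
                       return_tensors="pt", truncation=True, max_length=256)
    out = encoder(**inputs)
    encoder_states.append(out.last_hidden_state)
    attention_masks.append(inputs["attention_mask"])

# "Fusion in decoder": concatenate the encoder outputs along the sequence axis
# so the decoder's cross-attention sees all passages at once.
fused_states = torch.cat(encoder_states, dim=1)
fused_mask = torch.cat(attention_masks, dim=1)

explanation_ids = model.generate(
    encoder_outputs=BaseModelOutput(last_hidden_state=fused_states),
    attention_mask=fused_mask,
    max_length=128,
)
print(tokenizer.decode(explanation_ids[0], skip_special_tokens=True))

In the paper's setting, the passages would come from the retriever over the chosen knowledge source, and the generator would be fine-tuned on the manually written explanations rather than used off the shelf.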
