
Publication

Argument Mining in Tweets: Comparing Crowd and Expert Annotations for Automated Claim and Evidence Detection

Neslihan Iskender; Robin Schaefer; Tim Polzehl; Sebastian Möller
In: Natural Language Processing and Information Systems. International Conference on Applications of Natural Language to Information Systems (NLDB 2021), June 23-25, Saarbrücken/Virtual, Germany, pp. 275-288, Springer International Publishing, ISBN 978-3-030-80599-9, May 2021.

Abstract

One of the main challenges in developing argument mining tools is the availability of annotated data of adequate size and quality. However, generating data sets with experts is expensive both organizationally and financially, which also holds for tools that identify argumentative content in informal social media texts such as tweets. As a solution, we propose crowdsourcing as a fast, scalable, and cost-effective alternative to linguistic experts. To investigate crowd workers' performance, we compare crowd and expert annotations of argumentative content, divided into claim and evidence, for 300 German tweet pairs from the domain of climate change. As the first work comparing crowd and expert annotations for argument mining in tweets, we show that crowd workers can achieve results similar to experts when annotating claims; identifying evidence, however, is a more challenging task for both naive crowds and experts. Further, we train supervised classification and sequence labeling models for claim and evidence detection, showing that crowdsourced data delivers promising results when compared to expert-annotated data.
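
To make the annotation comparison concrete, the sketch below computes chance-corrected agreement between crowd and expert claim labels and trains a simple claim detector on the crowdsourced labels. This is a minimal illustration under stated assumptions, not the authors' pipeline: the tweets, the binary labels, and the TF-IDF + logistic regression baseline are hypothetical stand-ins chosen for demonstration.

    # Minimal sketch (not the paper's pipeline): compare crowd vs. expert
    # claim labels, then fit a baseline claim classifier on crowd labels.
    # All tweets and labels below are hypothetical stand-ins.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import cohen_kappa_score
    from sklearn.pipeline import make_pipeline

    # Hypothetical per-tweet claim labels (1 = contains a claim, 0 = does not).
    expert_labels = [1, 0, 1, 1, 0, 1, 0, 0]
    crowd_labels  = [1, 0, 1, 0, 0, 1, 0, 1]  # e.g. majority vote over workers

    # Chance-corrected agreement between the two annotation sources.
    kappa = cohen_kappa_score(expert_labels, crowd_labels)
    print(f"Cohen's kappa (crowd vs. expert): {kappa:.2f}")

    # Baseline claim detector trained on the crowdsourced labels.
    tweets = [  # hypothetical German example tweets
        "Klimawandel ist real und menschengemacht.",
        "Schönes Wetter heute in Saarbrücken!",
        "CO2-Emissionen müssen drastisch sinken.",
        "Die Daten zeigen einen klaren Erwärmungstrend.",
        "Wer kommt morgen mit ins Kino?",
        "Ohne Kohleausstieg verfehlen wir die Klimaziele.",
        "Guten Morgen zusammen!",
        "Der Bericht erschien letzte Woche.",
    ]
    claim_clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                              LogisticRegression())
    claim_clf.fit(tweets, crowd_labels)
    print(claim_clf.predict(["Die Erderwärmung bedroht unsere Ernten."]))

In practice one would aggregate multiple crowd judgments per tweet before training; the sketch uses a single aggregated label vector to keep the comparison with the expert labels direct.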