Bachelor of Science
Dr. Joonsuk Park
Argument mining is the automatic identification and extraction of structure from argumentative language. Previous work has constrained argument structures to strict trees in order to exploit efficient tree-specific techniques; however, arguments in the wild are unlikely to exhibit this constrained structure. Given the recent trend of fine-tuning large pre-trained models to reach state-of-the-art performance on a variety of Natural Language Processing (NLP) tasks, we leverage the power of deep contextualized word embeddings for non-tree argument mining. In this paper, we introduce a new pipeline that uses pre-trained BERT-based models together with proposition-level bi-affine attention and a weighted cross-entropy loss to predict arguments whose structure forms a directed acyclic graph. Our experiments demonstrate the efficacy of deep contextualized word embeddings from BERT-based models and suggest future directions involving recurrence for modeling hierarchical relationships.
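The proposition-level bi-affine attention mentioned in the abstract can be sketched as follows. This is a minimal illustration of the standard bi-affine scoring form (a bilinear term plus linear terms over each ordered pair of proposition embeddings), not the thesis's actual implementation; all function names, shapes, and the use of random vectors in place of pooled BERT outputs are assumptions for the sketch.

```python
import numpy as np

def biaffine_scores(H, U, w_head, w_dep, b):
    """Score every directed link between propositions with bi-affine attention.

    H      : (n, d) proposition embeddings (e.g. pooled BERT vectors)
    U      : (d, d) bilinear interaction matrix
    w_head : (d,)   linear weight for the source proposition
    w_dep  : (d,)   linear weight for the target proposition
    b      : scalar bias
    Returns an (n, n) matrix whose (i, j) entry scores a link i -> j.
    """
    bilinear = H @ U @ H.T                                 # h_i^T U h_j for all pairs
    linear = (H @ w_head)[:, None] + (H @ w_dep)[None, :]  # per-source + per-target terms
    return bilinear + linear + b

# Random embeddings stand in for BERT outputs in this sketch.
rng = np.random.default_rng(0)
n, d = 4, 8
scores = biaffine_scores(rng.normal(size=(n, d)),
                         rng.normal(size=(d, d)),
                         rng.normal(size=d),
                         rng.normal(size=d),
                         0.0)
print(scores.shape)  # (4, 4)
```

Because the full (n, n) score matrix is computed at once, no tree constraint is imposed: thresholding or decoding over these scores can yield a directed acyclic graph rather than a single-parent tree.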
Chen, Ting, "BERT Argues: How Attention Informs Argument Mining" (2021). Honors Theses. 1589.