Date of Award
4-2021
Document Type
Thesis
Degree Name
Bachelor of Science
Department
Mathematics
First Advisor
Dr. Joonsuk Park
Abstract
Argument mining is the automatic identification and extraction of structure from argumentative language. Previous work has constrained argument structures to strict trees in order to exploit efficient tree-specific techniques; however, arguments in the wild are unlikely to exhibit this constrained structure. Given the recent trend of fine-tuning large pre-trained models to reach state-of-the-art performance on a variety of Natural Language Processing (NLP) tasks, we leverage the deep contextualized word embeddings these models provide for the task of non-tree argument mining. In this paper, we introduce a new pipeline that combines pre-trained BERT-based models with proposition-level bi-affine attention and a weighted cross-entropy loss to predict argument structures that form directed acyclic graphs. Our experiments demonstrate the efficacy of deep contextualized word embeddings from BERT-based models and suggest future directions involving recurrence for modelling hierarchical relationships.
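To make the two components named in the abstract concrete, the following is a minimal PyTorch sketch of proposition-level bi-affine attention paired with a weighted cross-entropy loss. It is an illustrative reading of the abstract under stated assumptions, not the thesis's actual implementation: the class name BiaffineScorer, the label set (no-link, support, attack), the pooling of BERT output into one vector per proposition, and all dimensions and weights are hypothetical.

import torch
import torch.nn as nn

class BiaffineScorer(nn.Module):
    """Scores directed links between proposition embeddings.

    Given n proposition vectors, produces an (n, n, num_labels) tensor of
    scores. Because no tree constraint is imposed at decoding time, the
    predicted links may form a directed acyclic graph.
    """
    def __init__(self, hidden_dim: int, num_labels: int):
        super().__init__()
        # Separate projections for the "head" and "dependent" roles.
        self.head_proj = nn.Linear(hidden_dim, hidden_dim)
        self.dep_proj = nn.Linear(hidden_dim, hidden_dim)
        # Bi-affine weight: one (d+1) x (d+1) matrix per label; the extra
        # constant feature appended below supplies the bias terms.
        self.bilinear = nn.Parameter(
            torch.randn(num_labels, hidden_dim + 1, hidden_dim + 1) * 0.01
        )

    def forward(self, props: torch.Tensor) -> torch.Tensor:
        # props: (n, hidden_dim) pooled BERT embeddings, one per proposition.
        heads = self.head_proj(props)  # (n, d)
        deps = self.dep_proj(props)    # (n, d)
        ones = props.new_ones(props.size(0), 1)
        heads = torch.cat([heads, ones], dim=-1)  # (n, d+1)
        deps = torch.cat([deps, ones], dim=-1)    # (n, d+1)
        # scores[i, j, l]: score of label l on the edge from dep i to head j.
        return torch.einsum("id,ldk,jk->ijl", deps, self.bilinear, heads)

# Weighted cross-entropy: up-weight the rare link labels relative to the
# dominant no-link class (the weight values here are placeholders).
class_weights = torch.tensor([0.1, 1.0, 1.0])  # [no-link, support, attack]
loss_fn = nn.CrossEntropyLoss(weight=class_weights)

# Example: score links among 4 propositions embedded in 768 dimensions.
scorer = BiaffineScorer(hidden_dim=768, num_labels=3)
props = torch.randn(4, 768)
scores = scorer(props)                     # (4, 4, 3)
targets = torch.randint(0, 3, (16,))       # gold label per ordered pair
loss = loss_fn(scores.reshape(-1, 3), targets)

The down-weighted no-link class reflects the usual imbalance in pairwise link prediction, where most proposition pairs are unrelated; thresholding or argmax-decoding the scores independently per pair is what permits non-tree (DAG) structures.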
Recommended Citation
Chen, Ting, "BERT Argues: How Attention Informs Argument Mining" (2021). Honors Theses. 1589.
https://scholarship.richmond.edu/honors-theses/1589