In Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy
will defend her proposal
Content and Stylistic Models for Authorship, Stance and Hyperpartisan Detection
Abstract

We explore user-generated text data to learn how individuals write in general, express their opinions on an issue, and report the news while affiliated with a political party. First, we study deception detection, where sockpuppets try to deceive other people by posting fake reviews for or against a service or product under multiple user IDs. We extract the grammatical features of sentences to model users' writing styles and propose the spy-induction method to learn from unlabeled test data during training. We then study the authorship verification problem on other types of datasets, including Amazon reviews, English essays, novels, and scientific articles, using deep neural networks.

Nonpartisan, accurate information covering both sides of contemporary issues is known to be an 'antidote to confirmation bias'. While such information helps educators foster vital skills, including critical thinking and open-mindedness, it is relatively rare and hard to find online. Given the well-researched, unbiased arguments on controversial issues shared by Procon.org, detecting the stance of arguments is a crucial step toward automatically organizing such resources. We use a universal pretrained language model with a weight-dropped LSTM neural network to leverage the context of an argument for stance detection on the proposed dataset.

Lastly, we will study hyperpartisan news detection and infer authors' bias when they report an event. The importance of detecting hyperpartisanship in news has increased dramatically since the 2016 presidential election. Although some resources have manually identified the bias of a large number of news articles, the pace of news publication calls for an efficient automatic solution that can detect whether a news article is hyperpartisan.
Date: Tuesday, April 2, 2019
Time: 1:30 - 3:00 PM
Place: PGH 550
Advisor: Dr. Arjun Mukherjee
Faculty, students, and the general public are invited.