"Incorporating discourse knowledge into the pre-trained transformer-based models for NLP tasks"

by Dr Elizaveta Goncharova, National Research University, Artificial Intelligence Research Institute (AIRI)

Update: the event has now finished (Apr 22nd 2022).

Abstract

In this talk I will cover incorporating discourse structure into pre-trained transformer-based models. It is well known that pre-trained transformers achieve state-of-the-art results on a wide range of NLP benchmarks. However, their purely data-driven paradigm limits their performance on complex tasks such as machine reading comprehension (MRC), summarization, and argumentation mining (AM). Incorporating discourse knowledge about a text into pre-trained models can be quite useful for such tasks. We will discuss several techniques that make the BERT model more discourse-aware, allowing it to outperform vanilla BERT on these complex NLP tasks.
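To make the idea concrete, below is a minimal, hypothetical sketch of one common way to inject discourse knowledge into BERT: concatenating a learned embedding of a coarse discourse relation label (e.g., an RST relation produced by an external discourse parser) with BERT's pooled output before classification. This is an illustration only, not the speaker's method; the label set, dimensions, and the `DiscourseAwareBert` name are assumptions.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class DiscourseAwareBert(nn.Module):
    """Hypothetical sketch: BERT pooled output + discourse relation embedding."""

    def __init__(self, num_labels, num_discourse_relations=18, rel_dim=32):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        # One coarse discourse relation label per input, assumed to come
        # from an external discourse parser (label inventory is assumed).
        self.rel_embedding = nn.Embedding(num_discourse_relations, rel_dim)
        hidden = self.bert.config.hidden_size
        self.classifier = nn.Linear(hidden + rel_dim, num_labels)

    def forward(self, input_ids, attention_mask, discourse_rel_ids):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        pooled = outputs.pooler_output                 # [batch, hidden]
        rel = self.rel_embedding(discourse_rel_ids)    # [batch, rel_dim]
        return self.classifier(torch.cat([pooled, rel], dim=-1))

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = DiscourseAwareBert(num_labels=2)
enc = tokenizer("The plan failed because funding was cut.", return_tensors="pt")
# Relation id 3 might stand for "Cause" in some parser's label set (assumed).
logits = model(enc["input_ids"], enc["attention_mask"], torch.tensor([3]))
```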

Speaker’s bio

Elizaveta Goncharova is a research fellow at the Artificial Intelligence Research Institute (AIRI). Her research areas are computer science, natural language processing, and machine learning. Within NLP she is specifically interested in techniques for combining linguistic information with modern pre-trained models. Her recent interests also include multi-task learning: while most models can successfully perform only the single task they have been fine-tuned on, building a model that can be easily adapted to many downstream tasks simultaneously, without requiring huge computational resources, remains a challenging problem.
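As a rough illustration of that multi-task setting (an assumed setup, not a description of the speaker's work), one parameter-efficient pattern is a single frozen shared encoder with one lightweight head per task, so each new task adds only a small classifier rather than a full fine-tuned copy of the model:

```python
import torch.nn as nn
from transformers import BertModel

class MultiTaskBert(nn.Module):
    """Assumed sketch: shared frozen BERT encoder, one small head per task."""

    def __init__(self, task_num_labels):
        super().__init__()
        self.encoder = BertModel.from_pretrained("bert-base-uncased")
        for p in self.encoder.parameters():  # freeze the shared encoder
            p.requires_grad = False
        hidden = self.encoder.config.hidden_size
        self.heads = nn.ModuleDict(
            {task: nn.Linear(hidden, n) for task, n in task_num_labels.items()}
        )

    def forward(self, task, input_ids, attention_mask):
        pooled = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).pooler_output
        return self.heads[task](pooled)

# Two hypothetical tasks sharing one encoder; only the heads are trained.
model = MultiTaskBert({"sentiment": 2, "topic": 5})
```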

CONTACT DETAILS

RGCL
University of Wolverhampton
Wulfruna Street
Wolverhampton, WV1 1LY
United Kingdom