Incorporating external information in neural networks for NLP tasks

In NLP, neural networks have become the state-of-the-art methods for most tasks. The strength of these models is that they efficiently learn features through supervised training. However, these approaches ignore a great amount of knowledge collected in semantic lexicons, knowledge bases, annotated datasets for related tasks, and datasets in other languages, all of which are readily available and often offer a complementary view.

This thesis will contribute by comparing ways of incorporating external information into neural networks. Possible avenues to explore are 1) methods that augment the feature space of a neural network, 2) methods that use the external information to regularize the network, and 3) multi-task learning approaches (see the sketch below). The precise task can be adapted to the student's interests, but could include sentiment or emotion analysis.
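To make avenues 1) and 3) more concrete, the following is a minimal PyTorch sketch, not part of the project description: a shared encoder with task-specific heads (multi-task learning) whose input is augmented with lexicon-derived features. All names (SharedEncoder, MultiTaskModel, lexicon_feats) and sizes are hypothetical illustrations, not a prescribed architecture.

# Minimal sketch (assumed PyTorch) of two of the avenues above:
# 1) augmenting the feature space with lexicon-derived features, and
# 3) multi-task learning with a shared encoder and task-specific heads.
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    """Bi-LSTM sentence encoder shared across all tasks."""
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                            bidirectional=True)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)           # (batch, seq, emb)
        _, (h_n, _) = self.lstm(embedded)              # final hidden states
        return torch.cat([h_n[0], h_n[1]], dim=-1)     # (batch, 2 * hidden)

class MultiTaskModel(nn.Module):
    """Shared encoder plus one classification head per task; external
    lexicon features are concatenated to the sentence representation."""
    def __init__(self, vocab_size, lexicon_dim, task_classes, hidden_dim=128):
        super().__init__()
        self.encoder = SharedEncoder(vocab_size, hidden_dim=hidden_dim)
        in_dim = 2 * hidden_dim + lexicon_dim
        self.heads = nn.ModuleDict({
            task: nn.Linear(in_dim, n_classes)
            for task, n_classes in task_classes.items()
        })

    def forward(self, token_ids, lexicon_feats, task):
        sentence_repr = self.encoder(token_ids)
        features = torch.cat([sentence_repr, lexicon_feats], dim=-1)
        return self.heads[task](features)

# Hypothetical usage: sentiment (3 classes) and emotion (6 classes) tasks.
model = MultiTaskModel(vocab_size=10000, lexicon_dim=8,
                       task_classes={"sentiment": 3, "emotion": 6})
token_ids = torch.randint(0, 10000, (4, 20))   # batch of 4 sentences, length 20
lexicon_feats = torch.rand(4, 8)               # e.g. sentiment-lexicon counts
logits = model(token_ids, lexicon_feats, task="sentiment")

Avenue 2), regularization, could instead use the external information as an auxiliary loss term rather than as input features; the design choice between these options is part of the comparison the thesis would carry out.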

The project presupposes a good level of technical expertise. Good programming skills, experience with machine learning and a solid background in NLP are relevant qualifications. Please contact the supervisors to discuss further details.

Keywords: neural networks, multi-task learning
Published 7 Oct. 2019 14:33 - Last modified 7 Oct. 2019 14:33

Scope (credits)

60