Posted on 2022-10-05, 04:35. Authored by VU THI THUY TRANG.
Deep neural networks have revolutionised natural language processing and enabled many exciting applications. However, they typically require tremendous human supervision, in the form of thousands or millions of labelled examples, to achieve good performance. This thesis aims to improve task performance under conditions of limited human supervision by exploiting prior knowledge from raw text and from related tasks. We propose several effective methods for collecting additional relevant examples and for transferring knowledge from related source tasks to a target task of interest.