<div class="csl-bib-body">
<div class="csl-entry">Yordanov, Y., Kocijan, V., Lukasiewicz, T., & Camburu, O.-M. (2022). Few-Shot Out-of-Domain Transfer Learning of Natural Language Explanations in a Label-Abundant Setup. In Y. Goldberg, Z. Kozareva, & Y. Zhang (Eds.), <i>Findings of the Association for Computational Linguistics: EMNLP 2022</i> (pp. 3486–3501). Association for Computational Linguistics. https://doi.org/10.18653/v1/2022.findings-emnlp.255</div>
</div>
-
dc.identifier.uri
http://hdl.handle.net/20.500.12708/192516
-
dc.description.abstract
Training a model to provide natural language explanations (NLEs) for its predictions usually requires the acquisition of task-specific NLEs, which is time- and resource-consuming. A potential solution is the few-shot out-of-domain transfer of NLEs from a parent task with many NLEs to a child task. In this work, we examine the setup in which the child task has few NLEs but abundant labels. We establish four few-shot transfer learning methods that cover the possible fine-tuning combinations of the labels and NLEs for the parent and child tasks. We transfer explainability from a large natural language inference dataset (e-SNLI) separately to two child tasks: (1) hard cases of pronoun resolution, where we introduce the small-e-WinoGrande dataset of NLEs on top of the WinoGrande dataset, and (2) commonsense validation (ComVE). Our results demonstrate that the parent task helps with NLE generation, and we establish the best methods for this setup.
en
dc.language.iso
en
-
dc.subject
natural language explanations
en
dc.subject
few-shot out-of-domain transfer
en
dc.title
Few-Shot Out-of-Domain Transfer Learning of Natural Language Explanations in a Label-Abundant Setup
en
dc.type
Inproceedings
en
dc.type
Konferenzbeitrag
de
dc.contributor.affiliation
University of Oxford, United Kingdom of Great Britain and Northern Ireland
-
dc.contributor.affiliation
Kumo.ai
-
dc.contributor.affiliation
University College London, United Kingdom of Great Britain and Northern Ireland
-
dc.description.startpage
3486
-
dc.description.endpage
3501
-
dc.type.category
Full-Paper Contribution
-
tuw.booktitle
Findings of the Association for Computational Linguistics: EMNLP 2022