Conference Proceedings

Semi-supervised Stochastic Multi-Domain Learning using Variational Inference

Yitong Li, Timothy Baldwin, Trevor Cohn

CoRR | Association for Computational Linguistics | Published: 2019

Abstract

Supervised models of NLP rely on large collections of text which closely resemble the intended testing setting. Unfortunately, matching text is often not available in sufficient quantity, and moreover, within any domain of text, data is often highly heterogeneous. In this paper we propose a method to distill the important domain signal as part of a multi-domain learning system, using a latent variable model in which parts of a neural model are stochastically gated based on the inferred domain. We compare the use of discrete versus continuous latent variables, operating in a domain-supervised or a domain semi-supervised setting, where the domain is known only for a subset of training inputs. We…
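To make the core idea concrete, below is a minimal sketch (not the authors' implementation) of stochastic domain gating as the abstract describes it: a latent domain variable is inferred from the input encoding, and a gate derived from that variable multiplicatively masks part of the hidden representation. The sketch assumes a discrete latent domain relaxed with Gumbel-softmax so sampling stays differentiable; a continuous variant would instead use a Gaussian reparameterization. All names (`StochasticDomainGate`, `hidden_dim`, `num_domains`) are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class StochasticDomainGate(nn.Module):
    """Minimal sketch of stochastic domain gating (NOT the paper's code).

    A latent domain z is inferred from the encoding h via q(z|x); a gate
    computed from the sampled z elementwise masks h. Discrete z is relaxed
    with Gumbel-softmax to keep the sampling step differentiable.
    """

    def __init__(self, hidden_dim: int, num_domains: int):
        super().__init__()
        self.domain_logits = nn.Linear(hidden_dim, num_domains)  # q(z|x)
        self.gate = nn.Linear(num_domains, hidden_dim)           # z -> gate

    def forward(self, h: torch.Tensor, tau: float = 1.0):
        logits = self.domain_logits(h)
        # Relaxed one-hot domain sample (reparameterized, differentiable).
        z = F.gumbel_softmax(logits, tau=tau, hard=False)
        g = torch.sigmoid(self.gate(z))      # elementwise gate in (0, 1)
        return h * g, logits                 # gated features + domain posterior


# Hypothetical usage: gate a batch of sentence encodings. When domain labels
# are available, `domain_logits` can carry a supervised loss; when they are
# not, the domain remains latent -- the semi-supervised setting above.
enc = torch.randn(8, 256)
gate = StochasticDomainGate(hidden_dim=256, num_domains=4)
gated, domain_logits = gate(enc)
```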


Funding Acknowledgements

This work was supported by an Amazon Research Award. We thank the anonymous reviewers for their helpful feedback and suggestions.