Conference Proceedings

Massively Multilingual Transfer for NER

Afshin Rahimi, Yuan Li, Trevor Cohn

Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics | Association for Computational Linguistics | Published: 2019

Abstract

In cross-lingual transfer, NLP models over one or more source languages are applied to a low-resource target language. While most prior work has used a single source model or a few carefully selected models, here we consider a “massive” setting with many such models. This setting raises the problem of poor transfer, particularly from distant languages. We propose two techniques for modulating the transfer, suitable for zero-shot or few-shot learning, respectively. Evaluating on named entity recognition, we show that our techniques are much more effective than strong baselines, including standard ensembling, and our unsupervised method rivals oracle selection of the single best individual model.
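The standard ensembling baseline mentioned in the abstract combines the per-token predictions of the many source-language models. Below is a minimal sketch of such a baseline, assuming each source model emits one BIO label per token; the function name, model outputs, and labels are hypothetical, not the paper's implementation:

```python
from collections import Counter

def ensemble_ner(predictions):
    """Combine per-token NER label sequences from many source-language
    models by majority vote (a standard ensembling baseline).

    predictions: list of label sequences, one per source model,
                 e.g. [["B-PER", "O"], ["B-PER", "B-LOC"], ...]
    """
    ensembled = []
    for token_labels in zip(*predictions):  # labels for one token across all models
        ensembled.append(Counter(token_labels).most_common(1)[0][0])
    return ensembled

# Example: three hypothetical source models tagging a two-token sentence.
models_output = [
    ["B-PER", "O"],
    ["B-PER", "O"],
    ["O",     "B-LOC"],
]
print(ensemble_ner(models_output))  # ['B-PER', 'O']
```

The paper's contribution is to weight or select among the source models rather than treating them uniformly as this baseline does.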


Grants

Awarded by Defense Advanced Research Projects Agency Information Innovation Office (I2O), under the Low Resource Languages for Emergent Incidents (LORELEI) program


Funding Acknowledgements

This work was supported by a Facebook Research Award and the Defense Advanced Research Projects Agency Information Innovation Office (I2O), under the Low Resource Languages for Emergent Incidents (LORELEI) program issued by DARPA/I2O under Contract No. HR0011-15-C-0114. The views expressed are those of the authors and do not reflect the official policy or position of the Department of Defense or the U.S. Government.