Conference Proceedings

Neural Speech Translation using Lattice Transformations and Graph Networks

Daniel Beck, Trevor Cohn, Gholamreza Haffari

Proceedings of the Thirteenth Workshop on Graph-Based Methods for Natural Language Processing (TextGraphs-13) | Association for Computational Linguistics | Published: 2019

Abstract

Speech translation systems usually follow a pipeline approach, using word lattices as an intermediate representation. However, previous work assumes access to the original transcriptions used to train the ASR system, which can limit applicability in real-world scenarios. In this work, we propose an approach for speech translation through lattice transformations and neural models based on graph networks. Experimental results show that our approach reaches competitive performance without relying on transcriptions, while also being orders of magnitude faster than previous work.
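
To make the lattice-to-graph idea concrete, the sketch below shows one generic way a word lattice could be treated as a directed graph and encoded with a single round of graph-network-style message passing. It is an illustrative assumption only: the toy lattice, embedding size, weight matrix, and mean-aggregation update are invented for this example and are not the model described in the paper.

```python
# Illustrative sketch, not the paper's model: encode a toy word lattice
# (nodes = word hypotheses, edges = allowed transitions) with one
# message-passing step that averages transformed predecessor states.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy lattice with two competing paths ("the" vs. "a").
nodes = ["<s>", "the", "a", "cat", "sat", "</s>"]
edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4), (4, 5)]  # (from, to)

d = 8                                      # embedding size (assumed)
h = rng.normal(size=(len(nodes), d))       # initial node embeddings
W = rng.normal(size=(d, d)) / np.sqrt(d)   # message transform (assumed)

def gnn_layer(h, edges, W):
    """One message-passing step: each node averages transformed
    predecessor states, then applies a residual connection and ReLU."""
    msg = np.zeros_like(h)
    count = np.zeros((h.shape[0], 1))
    for src, dst in edges:
        msg[dst] += h[src] @ W
        count[dst] += 1
    msg = msg / np.maximum(count, 1)        # mean over incoming edges
    return np.maximum(h + msg, 0)

h = gnn_layer(h, edges, W)
print(h.shape)  # (6, 8): one contextualised vector per lattice node
```

In a full system, several such layers would produce lattice-node representations that a downstream translation decoder could attend over; the specific transformation and aggregation functions used by the authors are described in the paper itself.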