Multilingual Denoising Pre-training for Neural Machine Translation

Transactions of the Association for Computational Linguistics, Volume 8, Pages 726–742
Published: Dec 1, 2020
Abstract
This paper demonstrates that multilingual denoising pre-training produces significant performance gains across a wide variety of machine translation (MT) tasks. We present mBART, a sequence-to-sequence denoising auto-encoder pre-trained on large-scale monolingual corpora in many languages using the BART objective (Lewis et al., 2019). mBART is the first method for pre-training a complete sequence-to-sequence model by denoising full texts in multiple languages, while previous approaches have focused only on the encoder, decoder, or reconstructing parts of the text.
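
The "BART objective" mentioned in the abstract trains the model to reconstruct the original text from a deliberately corrupted version. As a rough illustration, here is a minimal Python sketch of the two noising operations the paper describes: permuting sentence order within a document, and replacing token spans (with span lengths drawn from a Poisson distribution) by a single mask token. The function name, the <mask> symbol, the whitespace tokenization, and the exact control flow are simplifying assumptions; only the 35% masking ratio and λ = 3.5 follow the paper's description. Treat this as a sketch, not the reference implementation.

```python
import random
import numpy as np

MASK = "<mask>"

def noise(sentences, mask_ratio=0.35, poisson_lambda=3.5, seed=0):
    """Return a noised document; the pre-training target is the original."""
    rng = random.Random(seed)
    np_rng = np.random.default_rng(seed)

    # 1) Sentence permutation: shuffle the order of sentences in the document.
    sentences = list(sentences)
    rng.shuffle(sentences)
    tokens = [tok for sent in sentences for tok in sent.split()]

    # 2) Span masking: corrupt roughly mask_ratio of the tokens, replacing
    #    each sampled span with a single <mask> token.
    budget = int(round(mask_ratio * len(tokens)))
    out, i = [], 0
    while i < len(tokens):
        if budget > 0 and rng.random() < mask_ratio:
            span = max(1, int(np_rng.poisson(poisson_lambda)))
            span = min(span, budget, len(tokens) - i)
            out.append(MASK)  # the whole span collapses to one mask token
            budget -= span
            i += span
        else:
            out.append(tokens[i])
            i += 1
    return " ".join(out)

doc = ["the cat sat on the mat .", "it was a sunny day ."]
print(noise(doc))  # e.g. "it was <mask> day . the cat <mask> the mat ."
```

The denoising auto-encoder is then trained to map the noised output back to the original document, which is what lets the full encoder-decoder stack be fine-tuned directly for translation.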