BART
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension (Paper Review) 2022. 10. 23. 19:24
https://arxiv.org/abs/1910.13461

Abstract: We present BART, a denoising autoencoder for pretraining sequence-to-sequence models. BART is trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text. It uses a standard Transformer-based neural machine translation architecture which, despite its simplicity, can be seen as generalizing BERT (due to the bidirectional encoder), GPT (with the left-to-right decoder), and many other more recent pretraining schemes.
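To make the two-step recipe in the abstract concrete, here is a minimal sketch of the denoising objective. It is not the paper's actual implementation (the official noising code lives in fairseq): the `mask_ratio` value, the 0.15 span-start probability, and the whitespace tokenization are illustrative assumptions, while the Poisson(λ=3) span lengths and the replacement of a whole span with a single `<mask>` token follow the paper's description of text infilling. The Hugging Face `facebook/bart-base` checkpoint is used only to show the reconstruction loss.

```python
import random
import numpy as np
from transformers import BartTokenizer, BartForConditionalGeneration

def text_infill(tokens, mask_token="<mask>", mask_ratio=0.3, poisson_lambda=3.0):
    """Toy BART-style text infilling: sample span lengths from Poisson(3),
    as in the paper, and replace each span with a single mask token.
    A 0-length span inserts a mask without removing any token."""
    num_to_mask = int(len(tokens) * mask_ratio)
    out, i, masked = [], 0, 0
    while i < len(tokens):
        if masked < num_to_mask and random.random() < 0.15:  # illustrative start prob.
            span = min(int(np.random.poisson(poisson_lambda)), len(tokens) - i)
            out.append(mask_token)   # the whole span collapses into one <mask>
            i += span                # span == 0 just inserts a mask here
            masked += max(span, 1)   # always consume budget so the loop terminates
        else:
            out.append(tokens[i])
            i += 1
    return out

# Step (2): the decoder is trained to regenerate the original text from the
# corrupted input with ordinary sequence-to-sequence cross-entropy.
tok = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

original = "the quick brown fox jumps over the lazy dog"
corrupted = " ".join(text_infill(original.split(), mask_token=tok.mask_token))

inputs = tok(corrupted, return_tensors="pt")
labels = tok(original, return_tensors="pt").input_ids
loss = model(**inputs, labels=labels).loss  # denoising reconstruction loss
print(corrupted, float(loss))
```

Because the noising function only changes the encoder input while the target stays the original text, any corruption scheme (token masking, deletion, sentence permutation, etc.) can be swapped into `text_infill` without touching the loss computation.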