NJUNLP’s Submission for CCMT 2022 Quality Estimation Task

Bibliographic Details
Published in: CCMT (18th : 2022 : Lhasa) Machine Translation
Main Author: Zhang, Yu (Author)
Other Authors: Geng, Xiang (Author), Huang, Shujian (Author), Chen, Jiajun (Author)
Format: UnknownFormat
Language: English
Published: 2022
Subjects:
Similar Items (Title, Year, Author)
Hot-Start Transfer Learning Combined with Approximate Distillation for Mongolian-Chinese Neural Machine Translation 2022 Wang, Pengcong
Multi-strategy Enhanced Neural Machine Translation for Chinese Minority Languages 2022 Wu, Zhanglin
Life Is Short, Train It Less: Neural Machine Tibetan-Chinese Translation Based on mRASP and Dataset Enhancement 2022 Wang, Hao
Dynamic Mask Curriculum Learning for Non-Autoregressive Neural Machine Translation 2022 Wang, Yisong
Dynamic Fusion Nearest Neighbor Machine Translation via Dempster-Shafer Theory 2022 Yang, Zongheng
Optimizing Deep Transformers for Chinese-Thai Low-Resource Translation 2022 Hao, Wenjie
ISTIC’s Thai-to-Chinese Neural Machine Translation System for CCMT 2022 2022 Guo, Shuao
NJUNLP’s Submission for CCMT 2022 Quality Estimation Task 2022 Zhang, Yu
Review-Based Curriculum Learning for Neural Machine Translation 2022 Hui, Ziyang
PEACook: Post-editing Advancement Cookbook 2022 Tao, Shimin
Improving the Robustness of Low-Resource Neural Machine Translation with Adversarial Examples 2022 Sun, Shuo
A Multi-tasking and Multi-stage Chinese Minority Pre-trained Language Model 2022 Li, Bin
Target-Side Language Model for Reference-Free Machine Translation Evaluation 2022 Zhang, Min
An Improved Multi-task Approach to Pre-trained Model Based MT Quality Estimation 2022 Yuan, Binhuan
CCMT 2022 Translation Quality Estimation Task 2022 Su, Chang
Effective Data Augmentation Methods for CCMT 2022 2022 Wang, Jing