Meng Zhang
Huawei Noah's Ark Lab
Verified email at huawei.com - Homepage
Title · Cited by · Year
Word-level Textual Adversarial Attacking as Combinatorial Optimization
Y Zang, F Qi, C Yang, Z Liu, M Zhang, Q Liu, M Sun
Proceedings of the 58th Annual Meeting of the Association for Computational …, 2020
Cited by 422 · 2020
Adversarial training for unsupervised bilingual lexicon induction
M Zhang, Y Liu, H Luan, M Sun
Proceedings of the 55th Annual Meeting of the Association for Computational …, 2017
Cited by 315 · 2017
Earth mover’s distance minimization for unsupervised bilingual lexicon induction
M Zhang, Y Liu, H Luan, M Sun
Proceedings of the 2017 Conference on Empirical Methods in Natural Language …, 2017
Cited by 170 · 2017
Building earth mover's distance on bilingual word embeddings for machine translation
M Zhang, Y Liu, H Luan, M Sun, T Izuha, J Hao
Proceedings of the AAAI Conference on Artificial Intelligence 30 (1), 2016
Cited by 40 · 2016
Mlslt: Towards multilingual sign language translation
A Yin, Z Zhao, W Jin, M Zhang, X Zeng, X He
Proceedings of the IEEE/CVF conference on computer vision and pattern …, 2022
Cited by 36 · 2022
Bilingual lexicon induction from non-parallel data with minimal supervision
M Zhang, H Peng, Y Liu, H Luan, M Sun
Proceedings of the AAAI Conference on Artificial Intelligence 31 (1), 2017
Cited by 32 · 2017
Simulslt: End-to-end simultaneous sign language translation
A Yin, Z Zhao, J Liu, W Jin, M Zhang, X Zeng, X He
Proceedings of the 29th ACM International Conference on Multimedia, 4118-4127, 2021
Cited by 29 · 2021
Universal conditional masked language pre-training for neural machine translation
P Li, L Li, M Zhang, M Wu, Q Liu
arXiv preprint arXiv:2203.09210, 2022
Cited by 28 · 2022
Learning to generate explainable plots for neural story generation
G Chen, Y Liu, H Luan, M Zhang, Q Liu, M Sun
IEEE/ACM Transactions on Audio, Speech, and Language Processing 29, 585-593, 2020
Cited by 24* · 2020
Prior knowledge and memory enriched transformer for sign language translation
T Jin, Z Zhao, M Zhang, X Zeng
Findings of the Association for Computational Linguistics: ACL 2022, 3766-3775, 2022
Cited by 18 · 2022
Simullr: Simultaneous lip reading transducer with attention-guided adaptive memory
Z Lin, Z Zhao, H Li, J Liu, M Zhang, X Zeng, X He
Proceedings of the 29th ACM International Conference on Multimedia, 1359-1367, 2021
Cited by 16 · 2021
Multi-head highly parallelized LSTM decoder for neural machine translation
H Xu, Q Liu, J van Genabith, D Xiong, M Zhang
Proceedings of the 59th Annual Meeting of the Association for Computational …, 2021
Cited by 15 · 2021
Self-supervised quality estimation for machine translation
Y Zheng, Z Tan, M Zhang, M Maimaiti, H Luan, M Sun, Q Liu, Y Liu
Proceedings of the 2021 Conference on Empirical Methods in Natural Language …, 2021
Cited by 13 · 2021
Uncertainty-aware balancing for multilingual and multi-domain neural machine translation training
M Wu, Y Li, M Zhang, L Li, G Haffari, Q Liu
arXiv preprint arXiv:2109.02284, 2021
Cited by 12 · 2021
Inducing bilingual lexica from non-parallel data with earth mover’s distance regularization
M Zhang, Y Liu, H Luan, Y Liu, M Sun
Proceedings of COLING 2016, the 26th International Conference on …, 2016
Cited by 11 · 2016
Listwise ranking functions for statistical machine translation
M Zhang, Y Liu, H Luan, M Sun
IEEE/ACM Transactions on Audio, Speech, and Language Processing 24 (8), 1464 …, 2016
Cited by 11 · 2016
Mc-slt: Towards low-resource signer-adaptive sign language translation
T Jin, Z Zhao, M Zhang, X Zeng
Proceedings of the 30th ACM International Conference on Multimedia, 4939-4947, 2022
Cited by 9 · 2022
Dynamic multi-branch layers for on-device neural machine translation
Z Tan, Z Yang, M Zhang, Q Liu, M Sun, Y Liu
IEEE/ACM Transactions on Audio, Speech, and Language Processing 30, 958-967, 2022
Cited by 7 · 2022
Two parents, one child: Dual transfer for low-resource neural machine translation
M Zhang, L Li, Q Liu
Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 …, 2021
Cited by 7 · 2021
Triangular transfer: Freezing the pivot for triangular machine translation
M Zhang, L Li, Q Liu
arXiv preprint arXiv:2203.09027, 2022
Cited by 6 · 2022
Articles 1–20