Ziqing Yang
iFLYTEK Research
Verified email at iflytek.com

Title · Cited by · Year
Pre-training with whole word masking for Chinese BERT
Y Cui, W Che, T Liu, B Qin, Z Yang
IEEE/ACM Transactions on Audio, Speech, and Language Processing 29, 3504-3514, 2021
Cited by: 1200
Efficient and effective text encoding for Chinese LLaMA and Alpaca
Y Cui, Z Yang, X Yao
arXiv preprint arXiv:2304.08177, 2023
Cited by: 110
PERT: Pre-training BERT with permuted language model
Y Cui, Z Yang, T Liu
arXiv preprint arXiv:2203.06906, 2022
Cited by: 41
TextBrewer: An open-source knowledge distillation toolkit for natural language processing
Z Yang, Y Cui, Z Chen, W Che, T Liu, S Wang, G Hu
arXiv preprint arXiv:2002.12620, 2020
Cited by: 39
Benchmarking robustness of machine reading comprehension models
C Si, Z Yang, Y Cui, W Ma, T Liu, S Wang
arXiv preprint arXiv:2004.14004, 2020
Cited by: 30
CINO: A Chinese minority pre-trained language model
Z Yang, Z Xu, Y Cui, B Wang, M Lin, D Wu, Z Chen
arXiv preprint arXiv:2202.13558, 2022
Cited by: 24
On the evaporation of solar dark matter: spin-independent effective operators
ZL Liang, YL Wu, ZQ Yang, YF Zhou
Journal of Cosmology and Astroparticle Physics 2016 (09), 018, 2016
Cited by: 21
Pre-training with whole word masking for Chinese BERT. arXiv 2019
Y Cui, W Che, T Liu, B Qin, Z Yang, S Wang, G Hu
arXiv preprint arXiv:1906.08101, 2019
Cited by: 21
Improving machine reading comprehension via adversarial training
Z Yang, Y Cui, W Che, T Liu, S Wang, G Hu
arXiv preprint arXiv:1911.03614, 2019
Cited by: 20
A sentence cloze dataset for Chinese machine reading comprehension
Y Cui, T Liu, Z Yang, Z Chen, W Ma, W Che, S Wang, G Hu
arXiv preprint arXiv:2004.03116, 2020
Cited by: 15
The leptophilic dark matter in the Sun: the minimum testable mass
ZL Liang, YL Tang, ZQ Yang
Journal of Cosmology and Astroparticle Physics 2018 (10), 035, 2018
Cited by: 10
Critical behaviors and universality classes of percolation phase transitions on two-dimensional square lattice
Y Zhu, ZQ Yang, X Zhang, XS Chen
Communications in Theoretical Physics 64 (2), 231, 2015
Cited by: 10
TextPruner: A model pruning toolkit for pre-trained language models
Z Yang, Y Cui, Z Chen
arXiv preprint arXiv:2203.15996, 2022
Cited by: 6
HFL at SemEval-2022 task 8: A linguistics-inspired regression model with data augmentation for multilingual news similarity
Z Xu, Z Yang, Y Cui, Z Chen
arXiv preprint arXiv:2204.04844, 2022
Cited by: 5
Adversarial training for machine reading comprehension with virtual embeddings
Z Yang, Y Cui, C Si, W Che, T Liu, S Wang, G Hu
arXiv preprint arXiv:2106.04437, 2021
Cited by: 5
Pre-Training with Whole Word Masking for Chinese BERT. arXiv e-prints, art
Y Cui, W Che, T Liu, B Qin, Z Yang, S Wang, G Hu
arXiv preprint arXiv:1906.08101, 2019
Cited by: 5
Gradient-based intra-attention pruning on pre-trained language models
Z Yang, Y Cui, X Yao, S Wang
arXiv preprint arXiv:2212.07634, 2022
Cited by: 4
Interactive gated decoder for machine reading comprehension
Y Cui, W Che, Z Yang, T Liu, B Qin, S Wang, G Hu
Transactions on Asian and Low-resource Language Information Processing 21 (4 …, 2022
Cited by: 4
IDOL: Indicator-oriented logic pre-training for logical reasoning
Z Xu, Z Yang, Y Cui, S Wang
arXiv preprint arXiv:2306.15273, 2023
Cited by: 3
Bilingual alignment pre-training for zero-shot cross-lingual transfer
Z Yang, W Ma, Y Cui, J Ye, W Che, S Wang
arXiv preprint arXiv:2106.01732, 2021
Cited by: 3
Articles 1–20