Shun Kiyono
LY Corp.
Verified email at lycorp.co.jp - Homepage
Title · Cited by · Year
An empirical study of incorporating pseudo data into grammatical error correction
S Kiyono, J Suzuki, M Mita, T Mizumoto, K Inui
arXiv preprint arXiv:1909.00502, 2019
163 · 2019
ESPnet-ST: All-in-one speech translation toolkit
H Inaguma, S Kiyono, K Duh, S Karita, NEY Soplin, T Hayashi, ...
arXiv preprint arXiv:2004.10234, 2020
156 · 2020
Encoder-decoder models can benefit from pre-trained masked language models in grammatical error correction
M Kaneko, M Mita, S Kiyono, J Suzuki, K Inui
arXiv preprint arXiv:2005.00987, 2020
144 · 2020
Lessons on parameter sharing across layers in transformers
S Takase, S Kiyono
arXiv preprint arXiv:2104.06022, 2021
59 · 2021
Rethinking perturbations in encoder-decoders for fast training
S Takase, S Kiyono
arXiv preprint arXiv:2104.01853, 2021
42 · 2021
Effective adversarial regularization for neural machine translation
M Sato, J Suzuki, S Kiyono
Proceedings of the 57th Annual Meeting of the Association for Computational …, 2019
35 · 2019
SHAPE: Shifted absolute position embedding for transformers
S Kiyono, S Kobayashi, J Suzuki, K Inui
arXiv preprint arXiv:2109.05644, 2021
29 · 2021
Massive exploration of pseudo data for grammatical error correction
S Kiyono, J Suzuki, T Mizumoto, K Inui
IEEE/ACM transactions on audio, speech, and language processing 28, 2134-2145, 2020
18 · 2020
Tohoku-AIP-NTT at WMT 2020 news translation task
S Kiyono, T Ito, R Konno, M Morishita, J Suzuki
Proceedings of the Fifth Conference on Machine Translation, 145-155, 2020
15 · 2020
A self-refinement strategy for noise reduction in grammatical error correction
M Mita, S Kiyono, M Kaneko, J Suzuki, K Inui
arXiv preprint arXiv:2010.03155, 2020
14 · 2020
Pseudo zero pronoun resolution improves zero anaphora resolution
R Konno, S Kiyono, Y Matsubayashi, H Ouchi, K Inui
arXiv preprint arXiv:2104.07425, 2021
12 · 2021
Source-side prediction for neural headline generation
S Kiyono, S Takase, J Suzuki, N Okazaki, K Inui, M Nagata
arXiv preprint arXiv:1712.08302, 2017
11 · 2017
On layer normalizations and residual connections in transformers
S Takase, S Kiyono, S Kobayashi, J Suzuki
arXiv preprint arXiv:2206.00330, 2022
10 · 2022
Mixture of expert/imitator networks: Scalable semi-supervised learning framework
S Kiyono, J Suzuki, K Inui
Proceedings of the AAAI Conference on Artificial Intelligence 33 (01), 4073-4081, 2019
9 · 2019
Diverse lottery tickets boost ensemble from a single pretrained model
S Kobayashi, S Kiyono, J Suzuki, K Inui
arXiv preprint arXiv:2205.11833, 2022
8 · 2022
An empirical study of contextual data augmentation for Japanese zero anaphora resolution
R Konno, Y Matsubayashi, S Kiyono, H Ouchi, R Takahashi, K Inui
arXiv preprint arXiv:2011.00948, 2020
8 · 2020
Unsupervised token-wise alignment to improve interpretation of encoder-decoder models
S Kiyono, S Takase, J Suzuki, N Okazaki, K Inui, M Nagata
Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and …, 2018
8 · 2018
B2T connection: Serving stability and performance in deep transformers
S Takase, S Kiyono, S Kobayashi, J Suzuki
arXiv preprint arXiv:2206.00330, 2022
7 · 2022
Lessons on parameter sharing across layers in transformers
S Takase, S Kiyono
arXiv preprint arXiv:2104.06022, 2021
5 · 2021
Reducing odd generation from neural headline generation
S Kiyono, S Takase, J Suzuki, N Okazaki, K Inui, M Nagata
Proceedings of the 32nd Pacific Asia Conference on Language, Information and …, 2018
4 · 2018
Articles 1–20