Mikel Artetxe
Reka AI
Verified email at reka.ai - Homepage
Title · Cited by · Year
OPT: Open pre-trained transformer language models
S Zhang, S Roller, N Goyal, M Artetxe, M Chen, S Chen, C Dewan, ...
arXiv preprint arXiv:2205.01068, 2022
2168* · 2022
Unsupervised neural machine translation
M Artetxe, G Labaka, E Agirre, K Cho
Proceedings of the Sixth International Conference on Learning Representations, 2018
955 · 2018
Massively multilingual sentence embeddings for zero-shot cross-lingual transfer and beyond
M Artetxe, H Schwenk
Transactions of the Association for Computational Linguistics 7, 597-610, 2019
949 · 2019
Rethinking the role of demonstrations: What makes in-context learning work?
S Min, X Lyu, A Holtzman, M Artetxe, M Lewis, H Hajishirzi, L Zettlemoyer
arXiv preprint arXiv:2202.12837, 2022
759 · 2022
A robust self-learning method for fully unsupervised cross-lingual mappings of word embeddings
M Artetxe, G Labaka, E Agirre
Proceedings of the 56th Annual Meeting of the Association for Computational …, 2018
660 · 2018
On the cross-lingual transferability of monolingual representations
M Artetxe, S Ruder, D Yogatama
arXiv preprint arXiv:1910.11856, 2019
633 · 2019
Learning bilingual word embeddings with (almost) no bilingual data
M Artetxe, G Labaka, E Agirre
Proceedings of the 55th Annual Meeting of the Association for Computational …, 2017
593 · 2017
Learning principled bilingual mappings of word embeddings while preserving monolingual invariance
M Artetxe, G Labaka, E Agirre
Proceedings of the 2016 Conference on Empirical Methods in Natural Language …, 2016
441 · 2016
Generalizing and Improving Bilingual Word Embedding Mappings with a Multi-Step Framework of Linear Transformations
M Artetxe, G Labaka, E Agirre
Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence …, 2018
256 · 2018
Unsupervised statistical machine translation
M Artetxe, G Labaka, E Agirre
arXiv preprint arXiv:1809.01272, 2018
246 · 2018
Margin-based parallel corpus mining with multilingual sentence embeddings
M Artetxe, H Schwenk
Proceedings of the 57th Annual Meeting of the Association for Computational …, 2019
200 · 2019
An effective approach to unsupervised machine translation
M Artetxe, G Labaka, E Agirre
Proceedings of the 57th Annual Meeting of the Association for Computational …, 2019
175 · 2019
Multilingual autoregressive entity linking
N De Cao, L Wu, K Popat, M Artetxe, N Goyal, M Plekhanov, ...
Transactions of the Association for Computational Linguistics 10, 274-290, 2022
107 · 2022
Translation artifacts in cross-lingual transfer learning
M Artetxe, G Labaka, E Agirre
arXiv preprint arXiv:2004.04721, 2020
91 · 2020
Lifting the curse of multilinguality by pre-training modular transformers
J Pfeiffer, N Goyal, XV Lin, X Li, J Cross, S Riedel, M Artetxe
arXiv preprint arXiv:2205.06266, 2022
80 · 2022
Analyzing the Limitations of Cross-lingual Word Embedding Mappings
A Ormazabal, M Artetxe, G Labaka, A Soroa, E Agirre
Proceedings of the 57th Annual Meeting of the Association for Computational …, 2019
76 · 2019
Efficient large scale language modeling with mixtures of experts
M Artetxe, S Bhosale, N Goyal, T Mihaylov, M Ott, S Shleifer, XV Lin, J Du, ...
arXiv preprint arXiv:2112.10684, 2021
73 · 2021
Few-shot learning with multilingual language models
XV Lin, T Mihaylov, M Artetxe, T Wang, S Chen, D Simig, M Ott, N Goyal, ...
arXiv preprint arXiv:2112.10668, 35-40, 2021
71 · 2021
Opt: Open pre-trained transformer language models, 2022
S Zhang, S Roller, N Goyal, M Artetxe, M Chen, S Chen, C Dewan, ...
URL https://arxiv.org/abs/2205.01068, 2023
70 · 2023
A call for more rigor in unsupervised cross-lingual learning
M Artetxe, S Ruder, D Yogatama, G Labaka, E Agirre
arXiv preprint arXiv:2004.14958, 2020
65 · 2020
Articles 1–20