Tyler A. Chang
Verified email at ucsd.edu - Homepage
Title · Cited by · Year
Co-Scale Conv-Attentional Image Transformers
W Xu, Y Xu, TA Chang, Z Tu
International Conference on Computer Vision, 2021
Cited by 324 · 2021
Do Large Language Models Know What Humans Know?
S Trott, C Jones, T Chang, J Michaelov, B Bergen
Cognitive Science, 2023
Cited by 47 · 2023
Word Acquisition in Neural Language Models
TA Chang, BK Bergen
Transactions of the Association for Computational Linguistics 10, 1-16, 2022
Cited by 35 · 2022
Language Model Behavior: A Comprehensive Survey
TA Chang, BK Bergen
Computational Linguistics, 2024
Cited by 32 · 2024
The Geometry of Multilingual Language Model Representations
TA Chang, Z Tu, BK Bergen
Conference on Empirical Methods in Natural Language Processing, 2022
Cited by 24 · 2022
Distributional Semantics Still Can’t Account for Affordances
CR Jones, TA Chang, S Coulson, JA Michaelov, S Trott, BK Bergen
Annual Meeting of the Cognitive Science Society 44 (44), 2022
Cited by 17 · 2022
Convolutions and Self-Attention: Re-interpreting Relative Positions in Pre-trained Language Models
TA Chang, Y Xu, W Xu, Z Tu
Annual Meeting of the Association for Computational Linguistics and the …, 2021
Cited by 14 · 2021
When Is Multilinguality a Curse? Language Modeling for 250 High- and Low-Resource Languages
TA Chang, C Arnett, Z Tu, BK Bergen
arXiv preprint arXiv:2311.09205, 2023
Cited by 4 · 2023
Encodings of Source Syntax: Similarities in NMT Representations Across Target Languages
TA Chang, AN Rafferty
5th Workshop on Representation Learning for NLP at ACL, 2020
Cited by 3 · 2020
Does Contextual Diversity Hinder Early Word Acquisition?
TA Chang, BK Bergen
Annual Meeting of the Cognitive Science Society 44 (44), 2022
Cited by 2 · 2022
Structural Priming Demonstrates Abstract Grammatical Representations in Multilingual Language Models
JA Michaelov, C Arnett, TA Chang, BK Bergen
Conference on Empirical Methods in Natural Language Processing, 2023
Cited by 1 · 2023
Crosslingual Structural Priming and the Pre-Training Dynamics of Bilingual Language Models
C Arnett, TA Chang, JA Michaelov, BK Bergen
3rd Multilingual Representation Learning Workshop at EMNLP, 2023
Cited by 1 · 2023
Characterizing Learning Curves During Language Model Pre-Training: Learning, Forgetting, and Stability
TA Chang, Z Tu, BK Bergen
arXiv preprint arXiv:2308.15419, 2023
Cited by 1 · 2023
Different Tokenization Schemes Lead to Comparable Performance in Spanish Number Agreement
C Arnett, PD Rivière, TA Chang, S Trott
arXiv preprint arXiv:2403.13754, 2024
2024
Detecting Hallucination and Coverage Errors in Retrieval Augmented Generation for Controversial Topics
TA Chang, K Tomanek, J Hoffmann, N Thain, E van Liemt, ...
Joint International Conference on Computational Linguistics, Language …, 2024
2024
A Bit of a Problem: Measurement Disparities in Dataset Sizes Across Languages
C Arnett, TA Chang, BK Bergen
arXiv preprint arXiv:2403.00686, 2024
2024
When Is a Word in Good Company for Learning?
L Unger, TA Chang, B Bergen, O Savic, V Sloutsky
Developmental Science, 2024
2024
Characterizing and Measuring Linguistic Dataset Drift
TA Chang, K Halder, NA John, Y Vyas, Y Benajiba, M Ballesteros, D Roth
Annual Meeting of the Association for Computational Linguistics, 2023
2023
Topology of Second Order Tensor Fields
TA Chang
Carleton Digital Commons, 2020
2020
Emergence of Hierarchical Syntax in Neural Machine Translation
TA Chang
Carleton Digital Commons, 2020
2020
Articles 1–20