Ethan Gotlieb Wilcox
PhD Candidate, Harvard University
Verified email at g.harvard.edu - Homepage
Title
Cited by
Year
What do RNN Language Models Learn about Filler-Gap Dependencies?
EG Wilcox, R Levy, T Morita, R Futrell
arXiv preprint arXiv:1809.00042, 2018
108 · 2018
Neural language models as psycholinguistic subjects: Representations of syntactic state
R Futrell, E Wilcox, T Morita, P Qian, M Ballesteros, R Levy
arXiv preprint arXiv:1903.03260, 2019
107 · 2019
A systematic assessment of syntactic generalization in neural language models
J Hu, J Gauthier, P Qian, E Wilcox, RP Levy
arXiv preprint arXiv:2005.03692, 2020
88 · 2020
On the predictive power of neural language models for human real-time comprehension behavior
EG Wilcox, J Gauthier, J Hu, P Qian, R Levy
arXiv preprint arXiv:2006.01912, 2020
49 · 2020
RNNs as psycholinguistic subjects: Syntactic state and grammatical dependency
R Futrell, E Wilcox, T Morita, R Levy
arXiv preprint arXiv:1809.01329, 2018
43 · 2018
Structural supervision improves learning of non-local grammatical dependencies
E Wilcox, P Qian, R Futrell, M Ballesteros, R Levy
arXiv preprint arXiv:1903.00943, 2019
41 · 2019
SyntaxGym: An online platform for targeted evaluation of language models
J Gauthier, J Hu, E Wilcox, P Qian, R Levy
Association for Computational Linguistics (ACL), 2020
28 · 2020
Hierarchical representation in neural language models: Suppression and recovery of expectations
E Wilcox, R Levy, R Futrell
arXiv preprint arXiv:1906.04068, 2019
27 · 2019
What syntactic structures block dependencies in RNN language models?
E Wilcox, R Levy, R Futrell
arXiv preprint arXiv:1905.10431, 2019
15 · 2019
Representation of constituents in neural language models: Coordination phrase as a case study
A An, P Qian, E Wilcox, R Levy
arXiv preprint arXiv:1909.04625, 2019
9 · 2019
Investigating novel verb learning in BERT: Selectional preference classes and alternation-based syntactic generalization
T Thrush, E Wilcox, R Levy
arXiv preprint arXiv:2011.02417, 2020
5 · 2020
Structural supervision improves few-shot learning and syntactic generalization in neural language models
E Wilcox, P Qian, R Futrell, R Kohita, R Levy, M Ballesteros
arXiv preprint arXiv:2010.05725, 2020
5 · 2020
A Targeted Assessment of Incremental Processing in Neural Language Models and Humans
E Gotlieb Wilcox, P Vani, RP Levy
arXiv e-prints, arXiv: 2106.03232, 2021
4* · 2021
The Role of Prior Beliefs in The Rational Speech Act Model of Pragmatics: Exhaustivity as a Case Study
E Wilcox, B Spector
4 · 2019
Using Computational Models to Test Syntactic Learnability
E Wilcox, R Futrell, R Levy
Lingbuzz Preprint: lingbuzz/006327, 2021
3 · 2021
Using the interpolated maze task to assess incremental processing in English relative clauses
P Vani, EG Wilcox, R Levy
Proceedings of the Annual Meeting of the Cognitive Science Society 43 (43), 2021
1 · 2021
Informative Presupposition & Accommodation
EG Wilcox
2022
Exhaustivity and anti-exhaustivity in the RSA framework: Testing the effect of prior beliefs
A Cremers, EG Wilcox, B Spector
arXiv preprint arXiv:2202.07023, 2022
2022
Evidence for Availability Effects on Speaker Choice in the Russian Comparative Alternation
T Clark, EG Wilcox, E Gibson, R Levy
Proceedings of the Annual Meeting of the Cognitive Science Society 44 (44), 2022
2022
Which presuppositions are subject to contextual felicity constraints?
EG Wilcox
Semantics and Linguistic Theory 31, 345-364, 2021
2021