Erin Grant
Senior Research Fellow, University College London
Verified email at ucl.ac.uk - Homepage
Title
Cited by
Year
Recasting gradient-based meta-learning as hierarchical Bayes
E Grant, C Finn, S Levine, T Darrell, TL Griffiths
International Conference on Learning Representations (ICLR), 2018
638 · 2018
Are convolutional neural networks or transformers more like human vision?
S Tuli, I Dasgupta, E Grant, TL Griffiths
Annual Meeting of the Cognitive Science Society (CogSci), 2021
214 · 2021
Reconciling meta-learning and continual learning with online mixtures of tasks
G Jerfel*, E Grant*, TL Griffiths, K Heller
Advances in Neural Information Processing Systems (NeurIPS), 2019
148* · 2019
Doing more with less: Meta-reasoning and meta-learning in humans and machines
TL Griffiths, F Callaway, MB Chang, E Grant, PM Krueger, F Lieder
Current Opinion in Behavioral Sciences 29, 24-30, 2019
139 · 2019
Evaluating theory of mind in question answering
A Nematzadeh, K Burns, E Grant, A Gopnik, TL Griffiths
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2018
90 · 2018
Getting aligned on representational alignment
I Sucholutsky, L Muttenthaler, A Weller, A Peng, A Bobu, B Kim, BC Love, ...
arXiv preprint arXiv:2310.13018, 2023
38 · 2023
Universal linguistic inductive biases via meta-learning
RT McCoy, E Grant, P Smolensky, TL Griffiths, T Linzen
Annual Meeting of the Cognitive Science Society (CogSci), 2020
30 · 2020
How can memory-augmented neural networks pass a false-belief task?
E Grant, A Nematzadeh, TL Griffiths
Annual Meeting of the Cognitive Science Society (CogSci), 2017
22 · 2017
Passive attention in artificial neural networks predicts human visual selectivity
TA Langlois, HC Zhao, E Grant, I Dasgupta, TL Griffiths, N Jacoby
Advances in Neural Information Processing Systems (NeurIPS), 2021
17 · 2021
The transient nature of emergent in-context learning in transformers
AK Singh, SCY Chan, T Moskovitz, E Grant, AM Saxe, F Hill
Advances in Neural Information Processing Systems (NeurIPS), 2023
16 · 2023
Distinguishing rule- and exemplar-based generalization in learning systems
I Dasgupta*, E Grant*, TL Griffiths
International Conference on Machine Learning (ICML), 2022
8 · 2022
Exploiting attention to reveal shortcomings in memory models
K Burns, A Nematzadeh, E Grant, A Gopnik, TL Griffiths
EMNLP Workshop on BlackboxNLP: Analyzing and Interpreting Neural Networks …, 2018
8 · 2018
Gaussian process surrogate models for neural networks
MY Li, E Grant, TL Griffiths
Conference on Uncertainty in Artificial Intelligence (UAI), 2023
6* · 2023
Learning deep taxonomic priors for concept learning from few positive examples
E Grant, JC Peterson, TL Griffiths
Annual Meeting of the Cognitive Science Society (CogSci), 2019
6 · 2019
A computational cognitive model of novel word generalization
A Nematzadeh, E Grant, S Stevenson
Conference on Empirical Methods in Natural Language Processing (EMNLP), 1795 …, 2015
6 · 2015
The emergence of gender associations in child language development
B Prystawski, E Grant, A Nematzadeh, SWS Lee, S Stevenson, Y Xu
Cognitive Science, 2022
5 · 2022
Bayes in the age of intelligent machines
TL Griffiths, JQ Zhu, E Grant, RT McCoy
Current Directions in Psychological Science, 09637214241262329, 2024
4 · 2024
Predicting generalization with degrees of freedom in neural networks
E Grant, Y Wu
ICML 2022 2nd AI for Science Workshop, 2022
4 · 2022
Tracing the emergence of gendered language in childhood
B Prystawski, E Grant, A Nematzadeh, SWS Lee, S Stevenson, Y Xu
Annual Meeting of the Cognitive Science Society (CogSci), 2020
2 · 2020
The interaction of memory and attention in novel word generalization: A computational investigation
E Grant, A Nematzadeh, S Stevenson
Annual Meeting of the Cognitive Science Society (CogSci), 2016
2 · 2016
Articles 1–20