Hassan Sajjad
Faculty of Computer Science, Dalhousie University
Verified email at dal.ca - Homepage
Title · Cited by · Year
What do neural machine translation models learn about morphology?
Y Belinkov, N Durrani, F Dalvi, H Sajjad, J Glass
Proceedings of the 55th Annual Meeting of the Association for Computational …, 2017
424 · 2017
Robust classification of crisis-related data on social networks using convolutional neural networks
D Nguyen, KA Al Mannai, S Joty, H Sajjad, M Imran, P Mitra
Proceedings of the international AAAI conference on web and social media 11 …, 2017
244 · 2017
Identifying and Controlling Important Neurons in Neural Machine Translation
A Bau, Y Belinkov, H Sajjad, N Durrani, F Dalvi, J Glass
ICLR, 2019
178 · 2019
Compressing large-scale transformer-based models: A case study on BERT
P Ganesh, Y Chen, X Lou, MA Khan, Y Yang, H Sajjad, P Nakov, D Chen, ...
Transactions of the Association for Computational Linguistics 9, 1061-1080, 2021
174 · 2021
What is one grain of sand in the desert? Analyzing individual neurons in deep NLP models
F Dalvi, N Durrani, H Sajjad, Y Belinkov, A Bau, J Glass
Proceedings of the AAAI Conference on Artificial Intelligence 33 (01), 6309-6317, 2019
161 · 2019
Fighting the COVID-19 infodemic: Modeling the perspective of journalists, fact-checkers, social media platforms, policy makers, and the society
F Alam, S Shaar, F Dalvi, H Sajjad, A Nikolov, H Mubarak, GDS Martino, ...
arXiv preprint arXiv:2005.00033, 2020
149 · 2020
The AMARA Corpus: Building Parallel Language Resources for the Educational Domain.
A Abdelali, F Guzman, H Sajjad, S Vogel
LREC 14, 1044-1054, 2014
133 · 2014
Evaluating layers of representation in neural machine translation on part-of-speech and semantic tagging tasks
Y Belinkov, L Màrquez, H Sajjad, N Durrani, F Dalvi, J Glass
arXiv preprint arXiv:1801.07772, 2018
117 · 2018
Applications of online deep learning for crisis response using social media information
DT Nguyen, S Joty, M Imran, H Sajjad, P Mitra
arXiv preprint arXiv:1610.01030, 2016
116 · 2016
Fighting the COVID-19 infodemic in social media: A holistic perspective and a call to arms
F Alam, F Dalvi, S Shaar, N Durrani, H Mubarak, A Nikolov, ...
Proceedings of the International AAAI Conference on Web and Social Media 15 …, 2021
115 · 2021
Integrating an Unsupervised Transliteration Model into Statistical Machine Translation
N Durrani, H Sajjad, H Hoang, P Koehn
EACL 2014, 148, 2014
110 · 2014
Poor man’s BERT: Smaller and faster transformer models
H Sajjad, F Dalvi, N Durrani, P Nakov
arXiv preprint arXiv:2004.03844 2 (2), 2020
103 · 2020
Incremental decoding and training methods for simultaneous translation in neural machine translation
F Dalvi, N Durrani, H Sajjad, S Vogel
arXiv preprint arXiv:1806.03661, 2018
100 · 2018
Analyzing individual neurons in pre-trained language models
N Durrani, H Sajjad, F Dalvi, Y Belinkov
arXiv preprint arXiv:2010.02695, 2020
84 · 2020
Verifiably effective Arabic dialect identification
K Darwish, H Sajjad, H Mubarak
Proceedings of the 2014 Conference on Empirical Methods in Natural Language …, 2014
78 · 2014
Analyzing redundancy in pretrained transformer models
F Dalvi, H Sajjad, N Durrani, Y Belinkov
arXiv preprint arXiv:2004.04010, 2020
74 · 2020
Hindi-to-Urdu machine translation through transliteration
N Durrani, H Sajjad, A Fraser, H Schmid
Proceedings of the 48th Annual meeting of the Association for Computational …, 2010
74 · 2010
On the effect of dropping layers of pre-trained transformer models
H Sajjad, F Dalvi, N Durrani, P Nakov
Computer Speech & Language 77, 101429, 2023
73 · 2023
On the linguistic representational power of neural machine translation models
Y Belinkov, N Durrani, F Dalvi, H Sajjad, J Glass
Computational Linguistics 46 (1), 1-52, 2020
73 · 2020
Understanding and improving morphological learning in the neural machine translation decoder
F Dalvi, N Durrani, H Sajjad, Y Belinkov, S Vogel
Proceedings of the Eighth International Joint Conference on Natural Language …, 2017
70 · 2017
Articles 1–20