Iman Mirzadeh
Other names: Seyed Iman Mirzadeh
Verified email at apple.com · Homepage
Title
Cited by
Year
Improved knowledge distillation via teacher assistant: Bridging the gap between student and teacher
SI Mirzadeh, M Farajtabar, A Li, H Ghasemzadeh
AAAI Conference on Artificial Intelligence (AAAI), 2019
Cited by: 1281* · Year: 2019
Understanding the role of training regimes in continual learning
SI Mirzadeh, M Farajtabar, R Pascanu, H Ghasemzadeh
Advances in Neural Information Processing Systems (NeurIPS), 2020
Cited by: 222 · Year: 2020
Linear mode connectivity in multitask and continual learning
SI Mirzadeh, M Farajtabar, D Gorur, R Pascanu, H Ghasemzadeh
International Conference on Learning Representations (ICLR), 2021
Cited by: 128 · Year: 2021
LLM in a flash: Efficient large language model inference with limited memory
K Alizadeh, I Mirzadeh, D Belenko, K Khatamifard, M Cho, CC Del Mundo, ...
arXiv preprint arXiv:2312.11514, 2023
Cited by: 71 · Year: 2023
Architecture matters in continual learning
SI Mirzadeh, A Chaudhry, D Yin, T Nguyen, R Pascanu, D Gorur, ...
arXiv preprint arXiv:2202.00275, 2022
Cited by: 71 · Year: 2022
Wide neural networks forget less catastrophically
SI Mirzadeh, A Chaudhry, D Yin, H Hu, R Pascanu, D Gorur, M Farajtabar
International Conference on Machine Learning (ICML), 15699-15717, 2022
Cited by: 65 · Year: 2022
Dropout as an Implicit Gating Mechanism For Continual Learning
SI Mirzadeh, M Farajtabar, H Ghasemzadeh
The IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR …, 2020
Cited by: 57 · Year: 2020
ReLU strikes back: Exploiting activation sparsity in large language models
I Mirzadeh, K Alizadeh, S Mehta, CC Del Mundo, O Tuzel, G Samei, ...
arXiv preprint arXiv:2310.04564, 2023
Cited by: 54 · Year: 2023
OpenELM: An efficient language model family with open training and inference framework
S Mehta, MH Sekhavat, Q Cao, M Horton, Y Jin, C Sun, SI Mirzadeh, ...
Workshop on Efficient Systems for Foundation Models II @ ICML 2024, 2024
Cited by: 46* · Year: 2024
GSM-Symbolic: Understanding the limitations of mathematical reasoning in large language models
I Mirzadeh, K Alizadeh, H Shahrokhi, O Tuzel, S Bengio, M Farajtabar
arXiv preprint arXiv:2410.05229, 2024
Cited by: 31 · Year: 2024
ActiLabel: A Combinatorial Transfer Learning Framework for Activity Recognition
P Alinia, SI Mirzadeh, H Ghasemzadeh
arXiv preprint arXiv:2003.07415, 2020
Cited by: 11 · Year: 2020
Optimal Policy for Deployment of Machine Learning Models on Energy-Bounded Systems
SI Mirzadeh, H Ghasemzadeh
International Joint Conference on Artificial Intelligence (IJCAI), 2020
Cited by: 10 · Year: 2020
TransNet: minimally supervised deep transfer learning for dynamic adaptation of wearable systems
SA Rokni, M Nourollahi, P Alinia, I Mirzadeh, M Pedram, H Ghasemzadeh
ACM Transactions on Design Automation of Electronic Systems (TODAES) 26 (1 …, 2020
Cited by: 9 · Year: 2020
Continual learning beyond a single model
T Doan, SI Mirzadeh, M Farajtabar
Conference on Lifelong Learning Agents, 961-991, 2023
Cited by: 8 · Year: 2023
CL-Gym: Full-featured PyTorch library for continual learning
SI Mirzadeh, H Ghasemzadeh
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2021
Cited by: 8 · Year: 2021
Designing deep neural networks robust to sensor failure in mobile health environments
A Mamun, SI Mirzadeh, H Ghasemzadeh
2022 44th Annual International Conference of the IEEE Engineering in …, 2022
Cited by: 7 · Year: 2022
Use of machine learning to predict medication adherence in individuals at risk for atherosclerotic cardiovascular disease
SI Mirzadeh, A Arefeen, J Ardo, R Fallahzadeh, B Minor, JA Lee, ...
Smart Health 26, 100328, 2022
Cited by: 6 · Year: 2022
Efficient continual learning ensembles in neural network subspaces
T Doan, SI Mirzadeh, J Pineau, M Farajtabar
arXiv preprint arXiv:2202.09826, 2022
Cited by: 6 · Year: 2022
LIDS: mobile system to monitor type and volume of liquid intake
M Pedram, SI Mirzadeh, SA Rokni, R Fallahzadeh, DMK Woodbridge, ...
IEEE Sensors Journal 21 (18), 20750-20763, 2021
Cited by: 6 · Year: 2021
Inter-beat interval estimation with tiramisu model: a novel approach with reduced error
A Arefeen, A Akbari, SI Mirzadeh, R Jafari, BA Shirazi, H Ghasemzadeh
ACM Transactions on Computing for Healthcare 5 (1), 1-19, 2024
Cited by: 4 · Year: 2024
Articles 1–20