Chuanguang Yang
Title · Cited by · Year
Cross-Image Relational Knowledge Distillation for Semantic Segmentation
C Yang, H Zhou, Z An, X Jiang, Y Xu, Q Zhang
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2022
Cited by 116 · 2022
Mutual contrastive learning for visual representation learning
C Yang, Z An, L Cai, Y Xu
Proceedings of the AAAI Conference on Artificial Intelligence 36 (3), 3045-3053, 2022
Cited by 53 · 2022
Hierarchical Self-supervised Augmented Knowledge Distillation
C Yang, Z An, L Cai, Y Xu
International Joint Conference on Artificial Intelligence (IJCAI-21), 1217-1223, 2021
Cited by 52 · 2021
EENA: Efficient evolution of neural architecture
H Zhu, Z An, C Yang, K Xu, E Zhao, Y Xu
Proceedings of the IEEE/CVF International Conference on Computer Vision …, 2019
Cited by 48 · 2019
Gated Convolutional Networks with Hybrid Connectivity for Image Classification
C Yang, Z An, H Zhu, X Hu, K Zhang, K Xu, C Li, Y Xu
Proceedings of the AAAI Conference on Artificial Intelligence 34 (04), 12581 …, 2020
Cited by 38 · 2020
Multi-objective Pruning for CNNs using Genetic Algorithm
C Yang, Z An, C Li, B Diao, Y Xu
28th International Conference on Artificial Neural Networks (ICANN), 299-305, 2019
Cited by 33 · 2019
MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition
C Yang, Z An, H Zhou, L Cai, X Zhi, J Wu, Y Xu, Q Zhang
European Conference on Computer Vision, 534-551, 2022
Cited by 28 · 2022
Prior Gradient Mask Guided Pruning-aware Fine-tuning
L Cai, Z An, C Yang, Y Yan, Y Xu
Proceedings of the AAAI Conference on Artificial Intelligence, 2022
Cited by 22 · 2022
Softer Pruning, Incremental Regularization
L Cai, Z An, C Yang, Y Xu
International Conference on Pattern Recognition (ICPR), 2020
Cited by 22 · 2020
Multi-view contrastive learning for online knowledge distillation
C Yang, Z An, Y Xu
ICASSP 2021-2021 IEEE International Conference on Acoustics, Speech and …, 2021
Cited by 21 · 2021
Online knowledge distillation via mutual contrastive learning for visual recognition
C Yang, Z An, H Zhou, F Zhuang, Y Xu, Q Zhang
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023
Cited by 15 · 2023
Knowledge distillation using hierarchical self-supervision augmented distribution
C Yang, Z An, L Cai, Y Xu
IEEE Transactions on Neural Networks and Learning Systems, 2022
Cited by 13 · 2022
DNANet: De-normalized attention based multi-resolution network for human pose estimation
K Zhang, P He, P Yao, G Chen, C Yang, H Li, L Fu, T Zheng
CoRR, 2019
Cited by 13 · 2019
Efficient Search for the Number of Channels for Convolutional Neural Networks
H Zhu, Z An, C Yang, X Hu, K Xu, Y Xu
International Joint Conference on Neural Networks (IJCNN), 2020
Cited by 10* · 2020
Soft and hard filter pruning via dimension reduction
L Cai, Z An, C Yang, Y Xu
2021 International Joint Conference on Neural Networks (IJCNN), 1-8, 2021
Cited by 7 · 2021
CLIP-KD: An Empirical Study of Distilling CLIP Models
C Yang, Z An, L Huang, J Bi, X Yu, H Yang, Y Xu
arXiv preprint arXiv:2307.12732, 2023
Cited by 4 · 2023
Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation
C Yang, X Yu, Z An, Y Xu
Advancements in Knowledge Distillation: Towards New Horizons of Intelligent …, 2023
Cited by 4 · 2023
Learning positional priors for pretraining 2D pose estimators
K Zhang, P Yao, R Wu, C Yang, D Li, M Du, K Deng, R Liu, T Zheng
Proceedings of the 2nd International Workshop on Human-centric Multimedia …, 2021
Cited by 3* · 2021
DRNet: Dissect and Reconstruct the Convolutional Neural Network via Interpretable Manners
X Hu, Z An, C Yang, H Zhu, K Xu, Y Xu
European Conference on Artificial Intelligence, 2712-2719, 2019
Cited by 3 · 2019
eTag: Class-Incremental Learning via Embedding Distillation and Task-Oriented Generation
L Huang, Y Zeng, C Yang, Z An, B Diao, Y Xu
Proceedings of the AAAI Conference on Artificial Intelligence 38 (11), 12591 …, 2024
Cited by 2 · 2024
Articles 1–20