Kuluhan Binici
Title | Cited by | Year
Robust and Resource-Efficient Data-Free Knowledge Distillation by Generative Pseudo Replay
K Binici, S Aggarwal, NT Pham, K Leman, T Mitra
Proceedings of the AAAI Conference on Artificial Intelligence 36 (6), 6089-6096, 2022
Cited by: 35 | Year: 2022
Preventing catastrophic forgetting and distribution mismatch in knowledge distillation via synthetic data
K Binici, NT Pham, T Mitra, K Leman
Proceedings of the IEEE/CVF Winter Conference on Applications of Computer …, 2022
Cited by: 34 | Year: 2022
Chameleon: Dual memory replay for online continual learning on edge devices
S Aggarwal, K Binici, T Mitra
IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 2023
Cited by: 2 | Year: 2023
Visual-Policy Learning Through Multi-Camera View to Single-Camera View Knowledge Distillation for Robot Manipulation Tasks
C Acar, K Binici, A Tekirdağ, Y Wu
IEEE Robotics and Automation Letters 9 (1), 691-698, 2023
Cited by: 1 | Year: 2023
CRISP: Hybrid Structured Sparsity for Class-aware Model Pruning
S Aggarwal, K Binici, T Mitra
arXiv preprint arXiv:2311.14272, 2023
Cited by: — | Year: 2023