Jangho Kim
Title · Cited by · Year
Paraphrasing complex network: Network compression via factor transfer
J Kim, SU Park, N Kwak
Advances in Neural Information Processing Systems (NeurIPS) 31, 2018
Cited by 662 · 2018
Feature-map-level online adversarial knowledge distillation
I Chung, SU Park, J Kim, N Kwak
International Conference on Machine Learning (ICML), 2006-2015, 2020
Cited by 173 · 2020
Feature fusion for online mutual knowledge distillation
J Kim, M Hyun, I Chung, N Kwak
2020 25th International Conference on Pattern Recognition (ICPR), 4619-4625, 2021
Cited by 121 · 2021
QTI submission to DCASE 2021: Residual normalization for device imbalanced acoustic scene classification with efficient design
B Kim, S Yang, J Kim, S Chang
DCASE2021 Challenge, Tech. Rep, 2021
Cited by 88 · 2021
QKD: Quantization-aware knowledge distillation
J Kim, Y Bhalgat, J Lee, C Patel, N Kwak
arXiv preprint arXiv:1911.12491, 2019
Cited by 84 · 2019
PQK: model compression via pruning, quantization, and knowledge distillation
J Kim, S Chang, N Kwak
INTERSPEECH, 2021
Cited by 53 · 2021
Detection of adversarial examples in text classification: Benchmark and baseline via robust density estimation
KY Yoo, J Kim, J Jang, N Kwak
Findings of the Association for Computational Linguistics: ACL 2022, 3656-3672, 2022
Cited by 51* · 2022
Domain generalization with relaxed instance frequency-wise normalization for multi-device acoustic scene classification
B Kim, S Yang, J Kim, H Park, J Lee, S Chang
INTERSPEECH, 2022
Cited by 45 · 2022
Position-based scaled gradient for model quantization and pruning
J Kim, KY Yoo, N Kwak
Advances in Neural Information Processing Systems (NeurIPS) 33, 20415-20426, 2020
Cited by 42 · 2020
Domain Generalization on Efficient Acoustic Scene Classification using Residual Normalization
B Kim, S Yang, J Kim, S Chang
DCASE 2021 Workshop, 2021
Cited by 23 · 2021
Self-Distilled Self-Supervised Representation Learning
J Jang, S Kim, K Yoo, C Kong, J Kim, N Kwak
Proceedings of the IEEE/CVF Winter Conference on Applications of Computer …, 2023
Cited by 20 · 2023
StackNet: Stacking feature maps for Continual learning
J Kim, J Kim, N Kwak
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2020
Cited by 16* · 2020
Quantization Robust Pruning With Knowledge Distillation
J Kim
IEEE Access 11, 26419-26426, 2023
Cited by 10 · 2023
Finding Efficient Pruned Network via Refined Gradients for Pruned Weights
J Kim, J Yoo, Y Song, KY Yoo, N Kwak
Proceedings of the 31st ACM International Conference on Multimedia (ACM MM …, 2023
Cited by 8* · 2023
Prototype-based Personalized Pruning
J Kim, S Chang, S Yun, N Kwak
2021 IEEE International Conference on Acoustics, Speech and Signal …, 2021
Cited by 5 · 2021
Magnitude Attention-based Dynamic Pruning
J Back, N Ahn, J Kim
arXiv preprint arXiv:2306.05056, 2023
Cited by 2 · 2023
Variational On-the-Fly Personalization
J Kim, JT Lee, S Chang, N Kwak
International Conference on Machine Learning (ICML), 11134-11147, 2022
Cited by 2 · 2022
Personalized neural network pruning
S Chang, KIM Jangho, P Hyunsin, LEE Juntae, J Choi, KW Hwang
US Patent App. 17/506,646, 2022
Cited by 2 · 2022
PQK: Model compression via pruning, quantization, and knowledge distillation
J Kim, S Chang, N Kwak
INTERSPEECH, 2021
Cited by 2 · 2021
Detecting Korean characters in natural scenes by alphabet detection and agglomerative character construction
J Kim, YJ Kim, Y Kim, D Kim
2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC …, 2016
Cited by 2 · 2016