Sushrut Karmalkar
University of Wisconsin-Madison
Verified email at cs.utexas.edu
Title · Cited by · Year
List-decodable linear regression
S Karmalkar, A Klivans, P Kothari
Advances in neural information processing systems 32, 2019
79 · 2019
Superpolynomial lower bounds for learning one-layer neural networks using gradient descent
S Goel, A Gollakota, Z Jin, S Karmalkar, A Klivans
International Conference on Machine Learning, 3587-3596, 2020
71 · 2020
Time/accuracy tradeoffs for learning a ReLU with respect to Gaussian marginals
S Goel, S Karmalkar, A Klivans
Advances in neural information processing systems 32, 2019
57 · 2019
Approximation schemes for ReLU regression
I Diakonikolas, S Goel, S Karmalkar, AR Klivans, M Soltanolkotabi
Conference on learning theory, 1452-1485, 2020
52 · 2020
Robustly learning any clusterable mixture of Gaussians
I Diakonikolas, SB Hopkins, D Kane, S Karmalkar
arXiv preprint arXiv:2005.06417, 2020
49 · 2020
Instance-optimal compressed sensing via posterior sampling
A Jalal, S Karmalkar, AG Dimakis, E Price
arXiv preprint arXiv:2106.11438, 2021
38 · 2021
Outlier-robust high-dimensional sparse estimation via iterative filtering
I Diakonikolas, D Kane, S Karmalkar, E Price, A Stewart
Advances in Neural Information Processing Systems 32, 2019
36 · 2019
Compressed sensing with adversarial sparse noise via l1 regression
S Karmalkar, E Price
arXiv preprint arXiv:1809.08055, 2018
36 · 2018
Fairness for image generation with uncertain sensitive attributes
A Jalal, S Karmalkar, J Hoffmann, A Dimakis, E Price
International Conference on Machine Learning, 4721-4732, 2021
34 · 2021
Outlier-robust clustering of Gaussians and other non-spherical mixtures
A Bakshi, I Diakonikolas, SB Hopkins, D Kane, S Karmalkar, PK Kothari
2020 IEEE 61st Annual Symposium on Foundations of Computer Science (FOCS …, 2020
29 · 2020
Robust polynomial regression up to the information theoretic limit
D Kane, S Karmalkar, E Price
2017 IEEE 58th Annual Symposium on Foundations of Computer Science (FOCS …, 2017
17 · 2017
On the power of compressed sensing with generative models
A Kamath, E Price, S Karmalkar
International Conference on Machine Learning, 5101-5109, 2020
16 · 2020
Robust sparse mean estimation via sum of squares
I Diakonikolas, DM Kane, S Karmalkar, A Pensia, T Pittas
Conference on Learning Theory, 4703-4763, 2022
15 · 2022
Lower bounds for compressed sensing with generative models
A Kamath, S Karmalkar, E Price
arXiv preprint arXiv:1912.02938, 2019
14 · 2019
List-decodable sparse mean estimation via difference-of-pairs filtering
I Diakonikolas, D Kane, S Karmalkar, A Pensia, T Pittas
Advances in Neural Information Processing Systems 35, 13947-13960, 2022
9 · 2022
Fourier entropy-influence conjecture for random linear threshold functions
S Chakraborty, S Karmalkar, S Kundu, SV Lokam, N Saurabh
LATIN 2018: Theoretical Informatics: 13th Latin American Symposium, Buenos …, 2018
5 · 2018
Compressed sensing with approximate priors via conditional resampling
A Jalal, S Karmalkar, A Dimakis, E Price
NeurIPS 2020 Workshop on Deep Learning and Inverse Problems, 2020
4 · 2020
The polynomial method is universal for distribution-free correlational SQ learning
A Gollakota, S Karmalkar, A Klivans
arXiv preprint arXiv:2010.11925, 2020
1 · 2020
Depth separation and weight-width trade-offs for sigmoidal neural networks
A Deshpande, N Goyal, S Karmalkar
1 · 2018
Multi-Model 3D Registration: Finding Multiple Moving Objects in Cluttered Point Clouds
D Jin, S Karmalkar, H Zhang, L Carlone
arXiv preprint arXiv:2402.10865, 2024
2024