Shuxin Zheng
Principal Researcher, Microsoft Research
Verified email at microsoft.com
Title · Cited by · Year
Do Transformers Really Perform Badly for Graph Representation?
C Ying, T Cai, S Luo, S Zheng, G Ke, D He, Y Shen, TY Liu
Thirty-Fifth Conference on Neural Information Processing Systems (NeurIPS), 2021
Cited by 1189 · 2021
On layer normalization in the transformer architecture
R Xiong, Y Yang, D He, K Zheng, S Zheng, C Xing, H Zhang, Y Lan, ...
Proceedings of the 37th International Conference on Machine Learning, 2020
Cited by 941 · 2020
Asynchronous stochastic gradient descent with delay compensation
S Zheng, Q Meng, T Wang, W Chen, N Yu, ZM Ma, TY Liu
Proceedings of the 34th International Conference on Machine Learning, PMLR …, 2017
Cited by 361* · 2017
Invertible Image Rescaling
M Xiao, S Zheng, C Liu, Y Wang, D He, G Ke, J Bian, Z Lin, TY Liu
European Conference on Computer Vision (ECCV), 126-144, 2020
Cited by 251 · 2020
Cross-Iteration Batch Normalization
Z Yao, Y Cao, S Zheng, G Huang, S Lin
IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2021
Cited by 133 · 2020
Deep learning for prediction of the air quality response to emission changes
J Xing, S Zheng, D Ding, JT Kelly, S Wang, S Li, T Qin, M Ma, Z Dong, ...
Environmental Science & Technology 54 (14), 8589-8600, 2020
Cited by 83 · 2020
One transformer can understand both 2D & 3D molecular data
S Luo, T Chen, Y Xu, S Zheng, TY Liu, L Wang, D He
The Eleventh International Conference on Learning Representations, 2022
Cited by 77 · 2022
Benchmarking Graphormer on large-scale molecular modeling datasets
Y Shi, S Zheng, G Ke, Y Shen, J You, J He, S Luo, C Liu, D He, TY Liu
arXiv preprint arXiv:2203.04810, 2022
Cited by 60 · 2022
How could Neural Networks understand Programs?
D Peng, S Zheng, Y Li, G Ke, D He, TY Liu
Proceedings of the International Conference on Machine Learning (ICML) …, 2021
Cited by 60 · 2021
Predicting equilibrium distributions for molecular systems with deep learning
S Zheng, J He, C Liu, Y Shi, Z Lu, W Feng, F Ju, J Wang, J Zhu, Y Min, ...
Nature Machine Intelligence, 1-10, 2024
Cited by 59* · 2024
Stable, Fast and Accurate: Kernelized Attention with Relative Positional Encoding
S Luo, S Li, T Cai, D He, D Peng, S Zheng, G Ke, L Wang, TY Liu
Advances in Neural Information Processing Systems (NeurIPS 2021), 2021
Cited by 47 · 2021
Your transformer may not be as powerful as you expect
S Luo, S Li, S Zheng, TY Liu, L Wang, D He
Advances in Neural Information Processing Systems 35, 4301-4315, 2022
Cited by 45 · 2022
Molecule generation for target protein binding with structural motifs
Z Zhang, Y Min, S Zheng, Q Liu
The Eleventh International Conference on Learning Representations, 2023
Cited by 40 · 2023
G-SGD: Optimizing ReLU Neural Networks in its Positively Scale-Invariant Space
Q Meng, S Zheng, H Zhang, W Chen, Q Ye, ZM Ma, TY Liu
Proceedings of the 7th International Conference on Learning Representations …, 2018
Cited by 34 · 2018
The impact of large language models on scientific discovery: a preliminary study using GPT-4
Microsoft Research AI4Science, Microsoft Azure Quantum
arXiv preprint arXiv:2311.07361, 2023
Cited by 28 · 2023
Modeling Lost Information in Lossy Image Compression
Y Wang, M Xiao, C Liu, S Zheng, TY Liu
arXiv preprint arXiv:2006.11999, 2020
Cited by 26 · 2020
Capacity control of ReLU neural networks by basis-path norm
S Zheng, Q Meng, H Zhang, W Chen, N Yu, TY Liu
Proceedings of the 33rd AAAI Conference on Artificial Intelligence, 2019
Cited by 24 · 2018
Invertible rescaling network and its extensions
M Xiao, S Zheng, C Liu, Z Lin, TY Liu
International Journal of Computer Vision 131 (1), 134-159, 2023
Cited by 21 · 2023
MC-BERT: Efficient language pre-training via a meta controller
Z Xu, L Gong, G Ke, D He, S Zheng, L Wang, J Bian, TY Liu
arXiv preprint arXiv:2006.05744, 2020
Cited by 17 · 2020
Quantized training of gradient boosting decision trees
Y Shi, G Ke, Z Chen, S Zheng, TY Liu
Advances in Neural Information Processing Systems 35, 18822-18833, 2022
Cited by 16 · 2022
Articles 1–20