Kaiyuan Gao

Title · Cited by · Year

NAGphormer: A tokenized graph transformer for node classification in large graphs
J Chen, K Gao, G Li, K He
International Conference on Learning Representations (ICLR 2022), 2022
Cited by 62 · 2022

BioT5: Enriching cross-modal integration in biology with chemical knowledge and natural language associations
Q Pei, W Zhang, J Zhu, K Wu, K Gao, L Wu, Y Xia, R Yan
arXiv preprint arXiv:2310.07276, 2023
Cited by 15 · 2023

Pre-training Antibody Language Models for Antigen-Specific Computational Antibody Design
K Gao, L Wu, J Zhu, T Peng, Y Xia, L He, S Xie, T Qin, H Liu, K He, TY Liu
Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 2023
Cited by 8* · 2023

FABind: Fast and accurate protein-ligand binding
Q Pei, K Gao, L Wu, J Zhu, Y Xia, S Xie, T Qin, K He, TY Liu, R Yan
Advances in Neural Information Processing Systems 36, 2024
Cited by 4 · 2024

BioT5+: Towards Generalized Biological Understanding with IUPAC Integration and Multi-task Tuning
Q Pei, L Wu, K Gao, X Liang, Y Fang, J Zhu, S Xie, T Qin, R Yan
arXiv preprint arXiv:2402.17810, 2024
Cited by 2 · 2024

Examining user-friendly and open-sourced large GPT models: A survey on language, multimodal, and scientific GPT models
K Gao, S He, Z He, J Lin, QZ Pei, J Shao, W Zhang
arXiv preprint arXiv:2308.14149, 2023
Cited by 2 · 2023

Tokenized Graph Transformer with Neighborhood Augmentation for Node Classification in Large Graphs
J Chen, C Liu, K Gao, G Li, K He
arXiv preprint arXiv:2305.12677, 2023
Cited by 1 · 2023

Revisiting language encoding in learning multilingual representations
S Luo, K Gao, S Zheng, G Ke, D He, L Wang, TY Liu
arXiv preprint arXiv:2102.08357, 2021
Cited by 1 · 2021

FABind+: Enhancing Molecular Docking through Improved Pocket Prediction and Pose Generation
K Gao, Q Pei, J Zhu, T Qin, K He, TY Liu, L Wu
arXiv preprint arXiv:2403.20261, 2024
2024

Leveraging Biomolecule and Natural Language through Multi-Modal Learning: A Survey
Q Pei, L Wu, K Gao, J Zhu, Y Wang, Z Wang, T Qin, R Yan
arXiv preprint arXiv:2403.01528, 2024
2024