
Zhilin Yang

PI, Shanghai Qi Zhi Institute (July 2021 to June 2024)

Biography

Former PI at the Shanghai Qi Zhi Institute.

Zhilin Yang works toward cognitive intelligence, with natural language processing as the bridge. As first author he published Transformer-XL and XLNet, which have had a major impact on natural language processing; his Google Scholar citation count exceeds 9,600. He received his bachelor's degree from Tsinghua University and his Ph.D. from Carnegie Mellon University.


Honors:

Forbes China 30 Under 30

Forbes Asia 30 Under 30

Research Interests

Large-scale pretraining

Natural language processing

Few-shot / zero-shot learning

Generative models

Sequence models

Multimodal learning

Publications

8. Jing Zhou, Zongyu Lin, Yanan Zheng, Jian Li, Zhilin Yang, Not All Tasks Are Born Equal: Understanding Zero-Shot Generalization, International Conference on Learning Representations (ICLR), 2023. View PDF


7. Nan Shao, Zefan Cai, Chonghua Liao, Yanan Zheng, Zhilin Yang, Compositional Task Representations for Large Language Models, International Conference on Learning Representations (ICLR), 2023. View PDF


6. Haike Xu, Zongyu Lin, Jing Zhou, Yanan Zheng, Zhilin Yang, A Universal Discriminator for Zero-Shot Generalization, Annual Meeting of the Association for Computational Linguistics (ACL), 2023. View PDF


5. Xingcheng Yao, Yanan Zheng, Xiaocong Yang, Zhilin Yang, NLP From Scratch Without Large-Scale Pretraining: A Simple and Efficient Framework, International Conference on Machine Learning (ICML), 2023. View PDF


4. Zhengxiao Du, Yujie Qian, Xiao Liu, Ming Ding, Jiezhong Qiu, Zhilin Yang, Jie Tang, GLM: General Language Model Pretraining with Autoregressive Blank Infilling, Annual Meeting of the Association for Computational Linguistics (ACL), 2022. View PDF


3. Xiao Liu, Kaixuan Ji, Yicheng Fu, Weng Lam Tam, Zhengxiao Du, Zhilin Yang, Jie Tang, P-Tuning v2: Prompt Tuning Can Be Comparable to Fine-tuning Universally Across Scales and Tasks, Annual Meeting of the Association for Computational Linguistics (ACL), 2022. View PDF


2. Yanan Zheng, Jing Zhou, Yujie Qian, Ming Ding, Chonghua Liao, Jian Li, Ruslan Salakhutdinov, Jie Tang, Sebastian Ruder, Zhilin Yang, FewNLU: Benchmarking State-of-the-Art Methods for Few-Shot Natural Language Understanding, Annual Meeting of the Association for Computational Linguistics (ACL), 2022. View PDF


1. Jing Zhou, Yanan Zheng, Jie Tang, Jian Li, Zhilin Yang, FlipDA: Effective and Robust Data Augmentation for Few-Shot Learning, Annual Meeting of the Association for Computational Linguistics (ACL), 2022. View PDF