I am currently working on R&D for large language models and multi-modal models at Meituan. I earned my Ph.D. from Shanghai Jiao Tong University in 2024, advised by Kenny Q. Zhu. Before that, I received my bachelor's degree in Computer Science from Tongji University in 2019.
Research Interests
I am broadly interested in natural language processing. My current interests include:
- Efficient Language Models
- Model Compression
- Knowledge Distillation
- Pruning
- Inference Acceleration
Publications
* denotes co-first authors
Symbol-LLM: Towards Foundational Symbol-centric Interface For Large Language Models
Fangzhi Xu, Zhiyong Wu, Qiushi Sun, Siyu Ren, Fei Yuan, Shuai Yuan, Qika Lin, Yu Qiao, Jun Liu
ACL 2024. [arxiv] [github]
On the Efficacy of Eviction Policy for Key-Value Constrained Generative Language Model Inference
Siyu Ren, Kenny Q. Zhu
arXiv preprint. [arxiv] [github]
EMO: Earth Mover Distance Optimization for Auto-regressive Language Modeling
Siyu Ren, Zhiyong Wu, Kenny Q. Zhu
ICLR 2024. [arxiv] [github]
Context Compression for Auto-regressive Transformers with Sentinel Tokens
Siyu Ren, Qi Jia, Kenny Q. Zhu
EMNLP 2023. [github]
Zero-shot Faithfulness Evaluation for Text Summarization with Foundation Language Model
Qi Jia, Siyu Ren, Yizhu Liu, Kenny Q. Zhu
EMNLP 2023. [github]
Combating Short Circuit Behavior in Natural Language Reasoning: Crossover and Mutation Operations for Enhanced Robustness
Shanshan Huang, Siyu Ren, Kenny Q. Zhu
ECAI 2023. [arxiv]
Pruning Pre-trained Language Models with Principled Importance and Self-regularization
Siyu Ren, Kenny Q. Zhu
ACL 2023 (Findings). [arxiv] [github]
Low-Rank Prune-And-Factorize for Language Model Compression
Siyu Ren, Kenny Q. Zhu
COLING 2024. [arxiv]
Taxonomy of Abstractive Dialogue Summarization: Scenarios, Approaches and Future Directions
Qi Jia, Siyu Ren, Yizhu Liu, Kenny Q. Zhu
ACM Computing Surveys. [arxiv]
Specializing Pre-trained Language Models for Better Relational Reasoning via Network Pruning
Siyu Ren, Kenny Q. Zhu
NAACL-HLT 2022 (Findings). [arxiv] [github]
Leaner and Faster: Two-Stage Model Compression for Lightweight Text-Image Retrieval
Siyu Ren, Kenny Q. Zhu
NAACL-HLT 2022. [arxiv] [github]
Knowledge-Driven Distractor Generation for Cloze-Style Multiple Choice Questions
Siyu Ren, Kenny Q. Zhu
AAAI 2021. [arxiv] [github]
Multi-turn Response Selection using Dialogue Dependency Relations
Qi Jia, Yizhu Liu, Siyu Ren, Kenny Q. Zhu
EMNLP 2020. [arxiv] [github]
Service
Reviewer: ICLR, COLM, ACL, NAACL, EMNLP, AAAI
Awards
- 2021-2022 GuangHua Scholarship
- 2017-2018 Undergraduate Excellent Student Second-Class Scholarship
- 2016-2017 Undergraduate Excellent Student Second-Class Scholarship