About Me
I am currently a postdoctoral researcher at Tsinghua University, working with Prof. Jun Zhu in the Department of Computer Science and Technology. Before that, I received my PhD from Tsinghua University (Aug 2019 - Jul 2023), co-advised by Prof. Yi Zhong at the School of Life Sciences and Prof. Jun Zhu at the Department of Computer Science and Technology. I also received my BS from Tsinghua University (Aug 2013 - Jul 2017), majoring in biological science with a minor in computer science. I was a visiting scholar at the Massachusetts Institute of Technology (Jun 2016 - Sep 2016), advised by Prof. Yingxi Lin in the Department of Brain and Cognitive Sciences. I was also a participant in the Tsinghua-Huawei Large Granularity Long Term Cooperation Project (Sep 2019 - May 2022), working with Dr. Lanqing Hong and Dr. Zhenguo Li at Huawei Noah’s Ark Lab.
I have an interdisciplinary background in neuroscience and machine learning. My primary research interest lies in developing bio-inspired machine learning methodologies and generic computational models for neuroscience. My current focus includes continual / incremental / lifelong learning and transfer learning, exploring the “natural algorithms” underlying biological learning and memory.
Selected Publications
2023
Liyuan Wang$^{\ast}$, Xingxing Zhang$^{\ast}$, Qian Li, Mingtian Zhang, Hang Su, Jun Zhu, Yi Zhong. Incorporating Neuro-Inspired Adaptability for Continual Learning in Artificial Intelligence. Nature Machine Intelligence, 2023.
Liyuan Wang, Jingyi Xie, Xingxing Zhang, Hang Su, Jun Zhu. Towards a General Framework for Continual Learning with Pre-training. In NeurIPS Intrinsically Motivated Open-ended Learning (IMOL) Workshop (NeurIPSW), 2023.
Liyuan Wang, Jingyi Xie, Xingxing Zhang, Mingyi Huang, Hang Su, Jun Zhu. Hierarchical Decomposition of Prompt-Based Continual Learning: Rethinking Obscured Sub-optimality. In Neural Information Processing Systems (NeurIPS), Spotlight, 2023.
Yilin Lyu$^{\ast}$, Liyuan Wang$^{\ast}$, Xingxing Zhang, Zicheng Sun, Hang Su, Jun Zhu, Liping Jing. Overcoming Recency Bias of Normalization Statistics in Continual Learning: Balance and Adaptation. In Neural Information Processing Systems (NeurIPS), 2023.
Gengwei Zhang$^{\ast}$, Liyuan Wang$^{\ast}$, Guoliang Kang, Ling Chen, Yunchao Wei. SLCA: Slow Learner with Classifier Alignment for Continual Learning on a Pre-trained Model. In International Conference on Computer Vision (ICCV), 2023.
Jianjian Zhao, Xuchen Zhang, Bohan Zhao, Liyuan Wang, Wantong Hu, Yi Zhong, Qian Li. Genetic Dissection of Mutual Interference between Two Consecutively Learned Tasks in Drosophila. eLife, 2023, 12:e83516.
Liyuan Wang, Xingxing Zhang, Hang Su, Jun Zhu. A Comprehensive Survey of Continual Learning: Theory, Method and Application. arXiv preprint arXiv:2302.00487, 2023. Under revision at IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI). Reported by Zhuanzhi, AI Era, Zhihu, and Twitter.
2022
Liyuan Wang$^{\ast}$, Xingxing Zhang$^{\ast}$, Qian Li, Jun Zhu, Yi Zhong. CoSCL: Cooperation of Small Continual Learners is Stronger than a Big One. In European Conference on Computer Vision (ECCV), 2022.
Xingxing Zhang, Zhizhe Liu, Weikai Yang, Liyuan Wang, Jun Zhu. The More, The Better? Active Silencing of Non-Positive Transfer for Efficient Multi-Domain Few-Shot Classification. In ACM Multimedia (MM), 2022.
Liyuan Wang$^{\ast}$, Xingxing Zhang$^{\ast}$, Kuo Yang, Longhui Yu, Chongxuan Li, Lanqing Hong, Shifeng Zhang, Zhenguo Li, Yi Zhong, Jun Zhu. Memory Replay with Data Compression for Continual Learning. In International Conference on Learning Representations (ICLR), 2022.
2021
Liyuan Wang, Mingtian Zhang, Zhongfan Jia, Qian Li, Chenglong Bao, Kaisheng Ma, Jun Zhu, Yi Zhong. AFEC: Active Forgetting of Negative Transfer in Continual Learning. In Neural Information Processing Systems (NeurIPS), 2021.
Liyuan Wang, Kuo Yang, Chongxuan Li, Lanqing Hong, Zhenguo Li, Jun Zhu. ORDisCo: Effective and Efficient Usage of Incremental Unlabeled Data for Semi-supervised Continual Learning. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2021.
Liyuan Wang, Bo Lei, Qian Li, Hang Su, Jun Zhu, Yi Zhong. Triple-Memory Networks: A Brain-Inspired Method for Continual Learning. IEEE Transactions on Neural Networks and Learning Systems (TNNLS), 2021, 33(5):1925-34.
Academic Services
- Conference Reviewer: NeurIPS, ICLR, CVPR, ICCV, ECCV, ACM MM, CoLLAs
- Journal Reviewer: TPAMI, TNNLS, TCSVT, Artificial Intelligence, Neural Networks
Honors and Awards
- Shuimu Tsinghua Scholar for Postdoc Researcher, 2023.
- National Scholarship for Graduate Student, 2022.
- Beijing Outstanding Graduate Award, 2017.
- Tsinghua Outstanding Graduate Award, 2017.
- National Scholarship for Undergraduate Student, 2014, 2015, 2016.