Yiping Wang 王宜平

Yiping Wang
Ph.D. student
Paul G. Allen School of Computer Science & Engineering,
University of Washington
Email: ypwang61@cs.washington.edu

Google Scholar / Twitter / Github / LinkedIn

About me

I'm a second-year Ph.D. student in the Paul G. Allen School of Computer Science & Engineering at the University of Washington. I feel very fortunate to have been working under the guidance of Prof. Simon Shaolei Du since the summer of 2022.

My main research interests span machine learning theory and foundation models. On the theoretical side, I care about understanding the foundations of deep learning and representation learning, especially the training dynamics of basic components like the Transformer. On the empirical side, I am keen on developing efficient algorithms backed by strong theoretical guarantees or insightful observations. In this direction, I'm currently working on data selection/scheduling for multimodal pretraining and on improving the inference efficiency of LLMs. I'm also working on some projects related to video generation. In addition, I have a long-standing enthusiasm for understanding the essence of intelligence and for exploring the intersections of mathematics, physics, and AGI, such as using LLMs for mathematical proofs and the pursuit of scientific truth.

I'm grateful to all my collaborators and mentors along the way. I'm privileged to have been working closely with Dr. Yuandong Tian since the spring of 2023. I have also been interning at Microsoft since June 2024, where I am fortunate to be advised by Yelong Shen and Shuohang Wang. During my undergraduate studies, I was fortunate to work closely with Prof. Huaxiu Yao and Prof. Linjun Zhang.

Previously, I studied Computer Science and Mathematics at Zhejiang University, where I received an honors degree from Chu Kochen Honors College.

News

  • 09/2024: Attending MoDL 2024 in New York, sponsored by the Simons Foundation, and presenting our negCLIPLoss poster!

  • 09/2024: Our negCLIPLoss paper is accepted by NeurIPS 2024 as a spotlight!

  • 06/2024: Started my internship at Microsoft!

  • 01/2024: One paper (JoMA) is accepted by ICLR 2024!

  • 12/2023: Attended NeurIPS 2023 in New Orleans!

  • 09/2023: One paper (Scan&Snap) is accepted by NeurIPS 2023!

  • 09/2023: Became a Husky at UW!

My Favourite Papers

(* denotes equal contribution or alphabetical ordering.)

Data Selection Algorithm

We studied how to efficiently select data for multimodal pretraining tasks, drawing inspiration from both empirical observations and theoretical insights.

CLIPLoss and Norm-Based Data Selection Methods for Multimodal Contrastive Learning [Arxiv] [Code] [Poster] [Twitter] [Previous Versions]
Yiping Wang*, Yifang Chen*, Wendan Yan, Alex Fang, Wenjing Zhou, Kevin Jamieson, Simon S. Du
NeurIPS 2024 (Spotlight)

tl;dr: We design universal data selection methods for CLIP pretraining and achieve near-SOTA results with less than 10% of the preprocessing resources. Combined with other approaches, our method sets a new SOTA on the DataComp benchmark.

Training Dynamics of Transformer

We attempt to analyze the training dynamics of transformers in a mathematically rigorous way.

Scan and Snap: Understanding Training Dynamics and Token Composition in 1-layer Transformer [Arxiv] [Poster] [Twitter]
Yuandong Tian, Yiping Wang, Beidi Chen, Simon S. Du
NeurIPS 2023
Oral presentation at the High-dimensional Learning Dynamics Workshop @ ICML 2023

tl;dr: We analyze a 1-layer transformer trained with next-token prediction loss, rigorously characterize its training process, and reveal how tokens are combined via the self-attention layer and the nature of its inductive bias.

JoMA: Demystifying Multilayer Transformers via JOint Dynamics of MLP and Attention [Arxiv] [Twitter]
Yuandong Tian, Yiping Wang, Zhenyu Zhang, Beidi Chen, Simon S. Du
ICLR 2024

tl;dr: We analyze the training dynamics of multilayer transformers, characterizing the role of self-attention, the MLP nonlinearity, and the learning procedure of hierarchical structure when the data follow hierarchical generative models.