I am Ruomin Huang (黄若民 in Chinese), a 2nd-year CS Ph.D. student at Duke University. I am very fortunate to be advised by Prof. Rong Ge. Previously, I received an M.S. in Data Science and a B.S. in Computational Mathematics from USTC, where I worked with Prof. Hu Ding.
Previously I worked on the algorithmic aspects of ML. Now I am interested in the mechanisms and physics of transformers.
* denotes equal contribution.
Task Descriptors Help Transformers Learn Linear Models In-Context
Ruomin Huang and Rong Ge
ICLR 2025
Short version appears at 1st ICML Workshop on In-Context Learning (ICL@ICML 2024)
To Tackle Adversarial Transferability: A Novel Ensemble Training Method with Fourier Transformation
Wanlin Zhang, Weichen Lin, Ruomin Huang, Shihong Song and Hu Ding
ICLR 2025
ReCaLL: Membership Inference via Relative Conditional Log-Likelihoods
Roy Xie, Junlin Wang, Ruomin Huang, Minxing Zhang, Rong Ge, Jian Pei, Neil Gong and Bhuwan Dhingra
EMNLP 2024
An Effective Dynamic Gradient Calibration Method for Continual Learning
Weichen Lin, Jiaxiang Chen, Ruomin Huang and Hu Ding
ICML 2024
Coresets for Wasserstein Distributionally Robust Optimization Problems
Ruomin Huang, Jiawei Huang, Wenjie Liu and Hu Ding
NeurIPS 2022 (spotlight)
Coresets for Relational Data and The Applications
Jiaxiang Chen, Qingyuan Yang, Ruomin Huang and Hu Ding
NeurIPS 2022 (spotlight)
A Novel Sequential Coreset Method for Gradient Descent Algorithms
Jiawei Huang*, Ruomin Huang*, Wenjie Liu*, Nikolaos M. Freris and Hu Ding
ICML 2021 (spotlight)
Randomized Greedy Algorithms and Composable Coreset for k-Center Clustering with Outliers
Hu Ding, Ruomin Huang, Kai Liu, Haikuo Yu and Zixiu Wang
Preprint