I am Ruomin Huang (黄若民 in Chinese), a first-year CS Ph.D. student at Duke University. I am fortunate to be advised by Prof. Rong Ge. Previously, I received an M.S. in Data Science and a B.S. in Computational Mathematics from USTC, where I worked with Prof. Hu Ding.
My earlier work focused on algorithmic aspects of machine learning. More recently, I am interested in the mechanisms of in-context learning.
* denotes equal contribution.
Ruomin Huang, Rong Ge, Task Descriptors Help Transformers Learn Linear Models In-Context, ICML 2024 Workshop on In-Context Learning.
Roy Xie, Junlin Wang, Ruomin Huang, Minxing Zhang, Rong Ge, Jian Pei, Neil Gong, Bhuwan Dhingra, ReCaLL: Membership Inference via Relative Conditional Log-Likelihoods, preprint.
Weichen Lin*, Jiaxiang Chen*, Ruomin Huang, Hu Ding, An Effective Dynamic Gradient Calibration Method for Continual Learning, ICML 2024.
Hu Ding, Ruomin Huang, Kai Liu, Haikuo Yu, Zixiu Wang, Randomized Greedy Algorithms and Composable Coreset for k-Center Clustering with Outliers, preprint.
Ruomin Huang, Jiawei Huang, Wenjie Liu, Hu Ding, Coresets for Wasserstein Distributionally Robust Optimization Problems, NeurIPS 2022 (spotlight).
Jiaxiang Chen, Qingyuan Yang, Ruomin Huang, Hu Ding, Coresets for Relational Data and The Applications, NeurIPS 2022 (spotlight).
Jiawei Huang*, Ruomin Huang*, Wenjie Liu*, Nikolaos M. Freris, Hu Ding, A Novel Sequential Coreset Method for Gradient Descent Algorithms, ICML 2021 (spotlight).
I am always open to discussions; feel free to reach out.