Publications and Preprints
My research studies the foundations of machine learning for stochastic optimization, focusing on the performance of data-driven optimization algorithms across the real-world "data -- model -- evaluation" pipeline. This includes developing:
fundamental tools for evaluating and selecting data-driven optimization models [J3, C4, W3];
efficient data integration models with performance guarantees [J5, J2, J1, C1];
empirical benchmarks and data-driven tools for understanding distribution shifts [J5, J4, W2].
Practically, I have worked with several companies and institutions, including the Fire Department of the City of New York (FDNY, [W4]) and Merck Sharp & Dohme (MSD, [J5]), to help design and implement data-driven optimization models in real operations. In Summer 2023, I worked on uncertainty attribution for inventory production control simulation systems as a research scientist intern on the Supply Chain Optimization Technologies team at Amazon.
For the papers listed below, * indicates that authors are listed in alphabetical order, and + indicates equal contribution (authors listed alphabetically).
Journal Articles Published or Under Revision
[J5] Optimizing Pharmaceutical Control with Multi-Task Contextual Bandits: Addressing Batch Heterogeneity for Improved Manufacturing Efficiency.
Tianyu Wang, Naz Pinar Taskiran, and Garud Iyengar
Major revision at Manufacturing and Service Operations Management.
Finalist, MSOM Data-Driven Research Challenge, 2025.
[J4] Rethinking Distribution Shifts: Empirical Analysis and Inductive Modeling for Tabular Data
Jiashuo Liu+, Tianyu Wang+, Peng Cui, Hongseok Namkoong
Major revision at Management Science.
Preliminary version appeared in NeurIPS 2023 [C2].
[J3] Optimizer's Information Criterion: Dissecting and Correcting Bias in Data-Driven Optimization
Garud Iyengar, Henry Lam, Tianyu Wang*
Major revision at Management Science.
Honorable Mention, Dupačová-Prékopa Best Student Paper Prize in Stochastic Programming, 2025.
[J2] Hedging Complexity in Generalization via a Parametric Distributionally Robust Optimization Framework
Garud Iyengar, Henry Lam, Tianyu Wang*
Major revision at Management Science.
Preliminary version appeared in AISTATS 2023 [C1].
[J1] Data-Driven Distributionally Robust CVaR Portfolio Optimization Under A Regime-Switching Ambiguity Set [Code]
Chi Seng Pun, Tianyu Wang, Zhenzhen Yan*
Manufacturing and Service Operations Management, 25(5): 1779-1795, 2023.
Refereed Conference Publications
[C4] Is Cross-validation the Gold Standard to Estimate Out-of-sample Model Performance? [Code]
Garud Iyengar, Henry Lam, Tianyu Wang*
Advances in Neural Information Processing Systems (NeurIPS) 2024.
[C3] Geometry-Calibrated DRO: Combating Over-Pessimism with Free Energy Implications
Jiashuo Liu, Jiayun Wu, Tianyu Wang, Hao Zou, Peng Cui
International Conference on Machine Learning (ICML) 2024.
Preliminary version appeared in NeurIPS 2023 Workshop on Distribution Shifts.
[C2] On the Need for a Language Describing Distribution Shifts: Illustrations on Tabular Datasets [Code] [Python package]
Jiashuo Liu+, Tianyu Wang+, Peng Cui, Hongseok Namkoong
Advances in Neural Information Processing Systems (NeurIPS) 2023, Datasets and Benchmarks Track.
Highlighted as a NeurIPS 2023 Favorite Paper by Two Sigma (9 of 3500+ papers) [Link]
[C1] Hedging against Complexity: Distributionally Robust Optimization with Parametric Approximation [Code]
Garud Iyengar, Henry Lam, Tianyu Wang*
International Conference on Artificial Intelligence and Statistics (AISTATS) 2023.
Notable Paper (oral presentation), top 1.9% of submissions (32/1689)
Working Papers