ECE PhD Prospectus Defense: Jiujia Zhang

  • Starts: 10:00 am on Monday, November 24, 2025
  • Ends: 11:30 am on Monday, November 24, 2025

Title: Robustifying Online Convex Optimization via Gradient Clipping and Regularization

Presenter: Jiujia Zhang

Advisor: Professor Ashok Cutkosky

Chair: Professor Kayhan Batmanghelich

Committee: Professor Kayhan Batmanghelich, Professor Brian Kulis, Professor Ashok Cutkosky

Google Scholar Link: https://scholar.google.com/citations?hl=en&user=eiOVT-8AAAAJ

Abstract: Contemporary machine learning models are largely trained iteratively with first-order optimization methods that rely on gradients as feedback. Online Convex Optimization (OCO) provides a general framework for designing such algorithms for convex losses under varying degrees of prior knowledge about the problem. Existing methods typically assume access to exact gradients, but in practice gradients are often corrupted. For example, model training typically relies on mini-batch data at each iteration, producing heavy-tailed noisy gradients. In addition, mislabeling or gradual distributional shifts during data collection translate into adversarially contaminated gradients throughout the training process. This prospectus will present the combination of gradient clipping and regularization as a unified framework that extends existing OCO algorithms to handle corrupted feedback. With appropriately chosen clipping thresholds and regularizers, the approach guarantees tight dependence on the degree of corruption in both the heavy-tailed and adversarial cases. The shared principles underlying both scenarios highlight how simple, theoretically grounded modifications can systematically improve the robustness of OCO algorithms in noisy or potentially adversarial environments.
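To make the core mechanism concrete, the sketch below layers gradient clipping and an L2 regularizer onto plain online gradient descent in Python. It is a minimal illustration under assumed fixed hyperparameters, not the presenter's algorithm: the names clipped_ogd and gradient_fn are hypothetical, and the thresholds here are fixed in advance, whereas the prospectus concerns choosing the clipping threshold and regularizer to match the corruption level.

    import numpy as np

    def clipped_ogd(gradient_fn, dim, T, lr=0.1, clip_threshold=1.0, reg=1e-3):
        # Online gradient descent hardened with gradient clipping and an
        # L2 regularizer. Illustrative sketch only: lr, clip_threshold,
        # and reg are fixed here, while a corruption-robust method would
        # tune them to the (unknown) degree of corruption.
        w = np.zeros(dim)
        for t in range(T):
            g = gradient_fn(w, t)            # possibly noisy or corrupted gradient
            norm = np.linalg.norm(g)
            if norm > clip_threshold:        # clip: cap the gradient's norm
                g = g * (clip_threshold / norm)
            g = g + reg * w                  # add the regularization term
            w = w - lr * g                   # standard OGD update
        return w

    # Example: quadratic loss whose gradients suffer rare heavy-tailed noise.
    rng = np.random.default_rng(0)

    def noisy_quadratic_grad(w, t):
        g = 2.0 * (w - 1.0)                  # gradient of ||w - 1||^2
        if rng.random() < 0.05:              # occasional large corruption
            g = g + rng.standard_cauchy(w.shape) * 10.0
        return g

    print(clipped_ogd(noisy_quadratic_grad, dim=5, T=2000))

The intuition behind the pairing: clipping caps the influence any single corrupted gradient can exert on an update, while the regularizer keeps the iterates from drifting on the resulting biased feedback.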