Speaker: Fnu Suya

Date: Feb 26, 11:45am–12:45pm

Abstract: Machine learning models are vulnerable to attacks during both the training and test phases, yet the risks they face in adversarial environments are frequently misjudged. In this talk, I will first demonstrate that black-box test-time attacks, which require only API access to the victim model, are more potent than previously believed when assessed against realistic attacker goals. Second, I will address the overestimation of threats from training-time attacks, particularly data poisoning. By designing an empirical attack with state-of-the-art performance, I will show that such attacks, even with strong knowledge of the victim, are inherently limited in their ability to degrade model performance across subpopulations or entire distributions. Through rigorous theoretical analysis, I will further show that certain subpopulations and distributions possess natural resistance to any poisoning attack. This insight suggests that future defenses can be strengthened by improving distributional quality. I will conclude by discussing my future research plans for designing machine learning models that remain trustworthy in adversarial environments.
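For readers unfamiliar with the data-poisoning threat model mentioned in the abstract, the following minimal sketch (a hypothetical label-flipping baseline, not the speaker's attack) makes it concrete: an attacker who controls a small fraction of the training set flips the labels of those points, and we compare models trained on clean versus poisoned data. The synthetic dataset, logistic-regression victim, and 5% poisoning budget are all illustrative assumptions.

    # Illustrative poisoning sketch (hypothetical baseline, not the
    # speaker's attack): flip labels on 5% of the training set and
    # compare clean vs. poisoned test accuracy.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5,
                                              random_state=0)

    # Clean baseline model.
    clean = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

    # Poisoning: attacker flips labels on a 5% subset of the training data.
    idx = rng.choice(len(X_tr), size=int(0.05 * len(X_tr)), replace=False)
    y_poisoned = y_tr.copy()
    y_poisoned[idx] = 1 - y_poisoned[idx]
    poisoned = LogisticRegression(max_iter=1000).fit(X_tr, y_poisoned)

    print("clean test accuracy:   ", clean.score(X_te, y_te))
    print("poisoned test accuracy:", poisoned.score(X_te, y_te))

At small budgets, a weak baseline like this typically moves overall accuracy only slightly, which is consistent with the abstract's point that distribution-wide poisoning threats can be overestimated.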

Biographical Sketch: Fnu Suya is an MC2 postdoctoral fellow at the Maryland Cybersecurity Center, University of Maryland, College Park. He received his PhD in computer science from the University of Virginia in 2023, advised by Prof. David Evans and Prof. Yuan Tian. His research focuses on trustworthy machine learning and machine learning for security, with a particular interest in realistic analyses of the risks of deploying machine learning models in real-world scenarios, especially in contaminated training environments. His work has been published in top-tier venues, including USENIX Security, IEEE S&P, CVPR, ICML, and NeurIPS. He received a Best Paper Award at the VISxAI workshop in 2022 and has been recognized as a top reviewer for ICLR and NeurIPS.

Location and Zoom link: 307 Love, or https://fsu.zoom.us/j/92298043290