College of Physics, Zhejiang University of Technology
Boxuetang Lecture Series
Stochastic Optimization for AUC Maximization in Machine Learning (Lecture No. 486)
Posted: 2019-12-23 21:37:00

Title: Stochastic Optimization for AUC Maximization in Machine Learning

Speaker: Prof. Yiming Ying, Department of Mathematics and Statistics, State University of New York at Albany

Time: 9:30-10:30 a.m., December 27, 2019

Venue: Faculty Activity Center (教师活动中心)

Abstract: Stochastic optimization algorithms such as stochastic gradient descent (SGD) update the model sequentially with cheap per-iteration costs, making them well suited to large-scale streaming data analysis. However, most existing studies focus on classification accuracy and cannot be applied directly to the important problems of maximizing the area under the ROC curve (AUC) in imbalanced classification and bipartite ranking.
    In this talk, I will present our recent work on developing novel SGD-type algorithms for AUC maximization. The new algorithms accommodate general loss functions and penalty terms, a flexibility achieved through innovative interactions between machine learning and applied mathematics. In contrast to previous approaches, which require high storage and per-iteration costs, our algorithms have space and per-iteration costs of a single datum while achieving optimal convergence rates.
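
For readers less familiar with the setting, the pairwise nature of AUC is the key obstacle mentioned in the abstract. The sketch below is a standard surrogate-loss formulation with a linear scoring function, a convex surrogate loss $\ell$ (e.g. squared or hinge loss), and a penalty term $\Omega$; it illustrates the general setup and is not necessarily the exact objective studied in the talk.

\[
  \mathrm{AUC}(\mathbf{w}) \;=\; \Pr\bigl(\mathbf{w}^{\top}\mathbf{x} > \mathbf{w}^{\top}\mathbf{x}' \;\big|\; y = +1,\ y' = -1\bigr),
\]
\[
  \min_{\mathbf{w}}\;
  \frac{1}{n_{+}\,n_{-}}
  \sum_{i:\,y_i = +1} \;\sum_{j:\,y_j = -1}
  \ell\bigl(\mathbf{w}^{\top}(\mathbf{x}_i - \mathbf{x}_j)\bigr)
  \;+\; \Omega(\mathbf{w}),
\]

where $n_{+}$ and $n_{-}$ are the numbers of positive and negative training examples. Because the objective sums over all positive-negative pairs rather than over individual examples, a naive stochastic gradient step involves pairs of data points, so accuracy-oriented SGD does not transfer directly; this is what makes algorithms with space and per-iteration costs of a single datum nontrivial.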
