Tianyuan Mathematical Center in Central China Academic Talk | Associate Professor Zhenyu Liao (Huazhong University of Science and Technology)

Posted: 2023-05-26 20:11

Title: Random Matrix Methods for Machine Learning

Time: 2023-06-01, 09:30 - 10:30

Speaker: Zhenyu Liao, Associate Professor, Huazhong University of Science and Technology

Venue: Conference Room, 3rd Floor, Northeast Building, School of Sciences

Abstract: Numerous and large-dimensional data are now the default setting in modern machine learning (ML). Standard ML algorithms such as kernel methods and neural networks were, however, initially designed from small-dimensional intuitions and tend to misbehave, if not completely collapse, when dealing with real-world large-dimensional problems. Random matrix theory has recently developed a broad spectrum of tools to help understand this new "curse of dimensionality," to help repair or completely recreate suboptimal algorithms, and, most importantly, to provide new intuitions for modern data mining.

In this talk, we will start with the example of covariance estimation and the "curse of dimensionality" phenomenon in high dimensions, and highlight many counterintuitive phenomena that arise when large-dimensional data are considered. Specifically, focusing on the urgent need to compress large and deep neural networks, we will show how the proposed random matrix framework can be applied to design efficient DNN compression schemes with strong performance guarantees.
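
The following short Python sketch (our illustration, not material from the talk) makes the covariance-estimation example concrete: even when the population covariance is the identity, the eigenvalues of the sample covariance matrix spread over the Marchenko-Pastur interval once the dimension p is comparable to the sample size n. The values n = 2000, p = 1000 are arbitrary choices for illustration.

# Minimal sketch of the high-dimensional covariance phenomenon (assumed example,
# not from the talk): for i.i.d. standard Gaussian data with true covariance = I,
# the sample covariance eigenvalues spread over the Marchenko-Pastur support
# [(1 - sqrt(c))^2, (1 + sqrt(c))^2] with c = p/n, instead of concentrating at 1.
import numpy as np

rng = np.random.default_rng(0)
n, p = 2000, 1000              # sample size and dimension, c = p/n = 0.5
X = rng.standard_normal((n, p))
C_hat = X.T @ X / n            # sample covariance estimate of the p x p identity

eigs = np.linalg.eigvalsh(C_hat)
c = p / n
print(f"empirical eigenvalue range: [{eigs.min():.3f}, {eigs.max():.3f}]")
print(f"Marchenko-Pastur support:   [{(1 - np.sqrt(c))**2:.3f}, {(1 + np.sqrt(c))**2:.3f}]")
# Although every true eigenvalue equals 1, the empirical spectrum spans roughly [0.086, 2.914].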