National Tianyuan Mathematical Center in Central China, High Performance Computing Seminar Series | Prof. Shaogao Lv (Nanjing Audit University)

Posted: 2021-10-14 16:51

Title: Nonparametric Optimality for Large Compressible Deep Neural Networks under Quadratic Loss Functions

Time: 2021-10-15, 10:00-11:30

Speaker: Prof. Shaogao Lv, Nanjing Audit University

Tencent Meeting ID: 165 369 418

Join via this link, or add it to your meeting list: https://meeting.tencent.com/dm/EmwdDHeescFP

Abstract: Establishing theoretical analyses that explain the empirical success of deep learning has attracted increasing attention in the modern learning literature. In this direction, we evaluate the excess risk of a deep learning estimator based on fully connected neural networks with the ReLU activation function. In this paper, we establish optimal excess risk bounds under the quadratic loss and composite structures of the true function. The obtained bounds are built upon so-called compressibility conditions on over-parameterized neural networks, which include the widely used sparse networks and low-rank weight-matrix networks as special cases. The core proofs rely on advanced empirical process theory and new approximation results for deep neural networks.
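For readers unfamiliar with the terminology, the excess risk analyzed in the talk can be sketched as follows; the notation below is a standard textbook formulation and is not taken from the speaker's paper.

```latex
% Regression setup: data (X, Y), candidate function f, quadratic loss.
% The risk of f is the expected squared prediction error:
\[
  R(f) \;=\; \mathbb{E}\bigl[(Y - f(X))^2\bigr].
\]
% The regression function f^*(x) = E[Y \mid X = x] minimizes R over all
% measurable f. The excess risk of an estimator \hat{f} (here, a deep
% ReLU network fitted to the data) is the gap to this minimum:
\[
  \mathcal{E}(\hat{f}) \;=\; R(\hat{f}) - R(f^*).
\]
% Under the quadratic loss, a standard bias-variance decomposition shows
% the excess risk equals a squared L^2 distance to the truth:
\[
  \mathcal{E}(\hat{f})
  \;=\; \mathbb{E}\bigl[(\hat{f}(X) - f^*(X))^2\bigr]
  \;=\; \|\hat{f} - f^*\|_{L^2(P_X)}^2 .
\]
```

"Optimal excess bounds" in the abstract refers to upper bounds on $\mathcal{E}(\hat{f})$ that match the nonparametric minimax rate for the assumed (composite) structure of $f^*$, up to logarithmic factors.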