Academic Paper

Lower Bounds on the Expected Excess Risk Using Mutual Information
Document Type
Conference
Source
2021 IEEE Information Theory Workshop (ITW), Oct. 2021, pp. 1-6
Subject
Communication, Networking and Broadcast Technologies
Conferences
Noise measurement
Mutual information
Language
English
Abstract
The expected excess risk of a learning algorithm is the average suboptimality of the learning algorithm's output relative to the optimal hypothesis in the hypothesis class. In this work, we lower bound the expected excess risk of a learning algorithm using the mutual information between the input and the noisy output of the learning algorithm. The setting we consider is one in which the hypothesis class is the set of real numbers and the true risk function has a local strong convexity property. Our main results also lead to asymptotic lower bounds on the expected excess risk, which do not require knowledge of the local strong convexity constants of the true risk function.
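A minimal sketch of the quantities the abstract refers to, with notation assumed here rather than taken from the paper (S the input sample, W the noisy output of the learning algorithm, L the true risk, and \lambda a local strong convexity constant):

\[ \text{expected excess risk:} \quad \mathbb{E}\left[ L(W) \right] - \inf_{w \in \mathbb{R}} L(w) \]
\[ \text{local strong convexity near the minimizer } w^\ast: \quad L(w) \;\ge\; L(w^\ast) + \tfrac{\lambda}{2}\,(w - w^\ast)^2 \]

Per the abstract, the lower bounds are then expressed in terms of the mutual information I(S; W) between the input and the noisy output.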