J. Korean Math. Soc. 2022; 59(2): 217-233
Online first article: February 17, 2022. Printed: March 1, 2022.
https://doi.org/10.4134/JKMS.j200406
Copyright © The Korean Mathematical Society.
Hi Jun Choe (Yonsei University), Hayeong Koh (Telecommunication Technology Association), Jimin Lee (Yonsei University)
Although machine learning shows state-of-the-art performance in a variety of fields, it lacks a theoretical understanding of how it works. Theoretical approaches have recently been studied actively, and among them are results on the margin and its distribution. In this paper, we focus in particular on the role of the margin under perturbations of inputs and parameters. We prove generalization bounds for two cases, a linear model for binary classification and neural networks for multi-classification, when the inputs carry normally distributed random noise. For binary classification, the additional generalization term caused by the random noise is related to the margin and is inversely exponential in the noise level. For neural networks, the additional generalization term depends on (input dimension) $\times$ (norms of input and weights). These results are obtained within the PAC-Bayesian framework. By treating random noise and the margin together, this paper contributes to a better understanding of model sensitivity and to the construction of robust generalization bounds.
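For reference, the margin loss named in the keywords has a standard form, and a classical PAC-Bayesian bound of McAllester type underlies bounds of this kind; the following is a sketch in standard notation, and the paper's exact definitions and constants may differ. For a score function $f$, a margin $\gamma \ge 0$, and a sample of size $m$, the empirical margin loss is
$$\widehat{L}_\gamma(f) \;=\; \frac{1}{m}\sum_{i=1}^{m} \mathbf{1}\!\left[\, f(x_i)[y_i] \,\le\, \gamma + \max_{j \ne y_i} f(x_i)[j] \,\right],$$
and, with probability at least $1-\delta$ over the sample, for every posterior $Q$ over predictors and any fixed prior $P$,
$$\mathbb{E}_{f \sim Q}\!\left[L_\gamma(f)\right] \;\le\; \mathbb{E}_{f \sim Q}\!\left[\widehat{L}_\gamma(f)\right] + \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{2\sqrt{m}}{\delta}}{2m}}.$$
Since the margin loss upper-bounds the $0$-$1$ loss pointwise, $L_0(f) \le L_\gamma(f)$, the same right-hand side also controls the expected classification error; margin-based analyses of perturbed models proceed by bounding the $\mathrm{KL}$ term for noise-perturbed posteriors.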
Keywords: Generalization bound, PAC-Bayesian, margin loss function
MSC numbers: Primary 68Q32, 65Y20
Supported by: Hi Jun Choe and Hayeong Koh were supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Korea government (No. 2015R1A5A1009350, No. 20181A2A3074566). Jimin Lee was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Korea government (No. 2020R1I1A1A01071731).