# Journal of the Korean Mathematical Society (JKMS)

ISSN(Print) 0304-9914 ISSN(Online) 2234-3008

• ### 2022-01-01

#### Synchronized components of a subshift

Manouchehr Shahamat

Abstract : We introduce the notion of a minimal synchronizing word, that is, a synchronizing word none of whose proper subwords is synchronizing. We use this notion to give a new, shorter proof of a theorem in [6]. We also characterize the common synchronized components of a subshift and its derived set.

• ### 2022-01-01

#### Parabolic quaternionic Monge-Amp\`{e}re equation on compact manifolds with a flat hyperK\"ahler metric

Jiaogen Zhang

Abstract : The quaternionic Calabi conjecture was introduced by Alesker-Verbitsky as an analogue of the K\"ahler case raised by Calabi. On a compact connected hypercomplex manifold admitting a flat hyperK\"ahler metric compatible with the underlying hypercomplex structure, we consider the parabolic quaternionic Monge-Amp\`{e}re equation. Our goal is to prove long time existence and $C^{\infty}$ convergence of normalized solutions as $t\rightarrow\infty$. As a consequence, we show that the limit function is exactly the solution of the quaternionic Monge-Amp\`{e}re equation; this gives a parabolic proof of the quaternionic Calabi conjecture in this special setting.

• ### 2022-03-01

#### Margin-based generalization for classifications with input noise

Hi Jun Choe, Hayeong Koh, Jimin Lee

Abstract : Although machine learning shows state-of-the-art performance in a variety of fields, a theoretical understanding of how machine learning works is still lacking. Recently, theoretical approaches have been actively studied, and among them there are results on the margin and its distribution. In this paper, we focus on the role of the margin under perturbations of inputs and parameters. We prove a generalization bound for two cases, a linear model for binary classification and neural networks for multi-class classification, when the inputs carry normally distributed random noise. For binary classification, the additional generalization term caused by the random noise is related to the margin and is exponentially inversely proportional to the noise level. For neural networks, the additional generalization term depends on (input dimension) $\times$ (norms of input and weights). These results are obtained within the PAC-Bayesian framework. By considering random noise and the margin together, this paper contributes to a better understanding of model sensitivity and the construction of robust generalization bounds.


## Current Issue

• ### Synchronized components of a subshift

Manouchehr Shahamat

J. Korean Math. Soc. 2022; 59(1): 1-12
https://doi.org/10.4134/JKMS.j200112

• ### Regularity of the generalized Poisson operator

Pengtao Li, Zhiyong Wang, Kai Zhao

J. Korean Math. Soc. 2022; 59(1): 129-150
https://doi.org/10.4134/JKMS.j210224

• ### Construction for self-orthogonal codes over a certain non-chain Frobenius ring

Boran Kim

J. Korean Math. Soc. 2022; 59(1): 193-204
https://doi.org/10.4134/JKMS.j210357

• ### Parabolic quaternionic Monge-Amp\`{e}re equation on compact manifolds with a flat hyperK\"ahler metric

Jiaogen Zhang

J. Korean Math. Soc. 2022; 59(1): 13-33
https://doi.org/10.4134/JKMS.j200626

• ### Margin-based generalization for classifications with input noise

Hi Jun Choe, Hayeong Koh, Jimin Lee

J. Korean Math. Soc. 2022; 59(2): 217-233
https://doi.org/10.4134/JKMS.j200406

• ### Hardy type estimates for Riesz transforms associated with Schr\"{o}dinger operators on the Heisenberg group

Chunfang Gao

J. Korean Math. Soc. 2022; 59(2): 235-254
https://doi.org/10.4134/JKMS.j200484