13 results found for this scholar

Upload date

March 23, 2021

[Journal Article] Margin Error Bounds for Support Vector Machines on Reproducing Kernel Banach Spaces

Neural Computation, 2017, 29(11): 3078–3093

November 1, 2017

Abstract

Support vector machines, which maximize the margin from patterns to the separating hyperplane subject to correct classification, have achieved remarkable success in machine learning. Margin error bounds based on Hilbert spaces have been introduced in the literature to justify the strategy of maximizing the margin in SVMs. Recently, there has been much interest in developing Banach space methods for machine learning. Large margin classification in Banach spaces is a focus of such attempts. In this letter we establish a margin error bound for SVMs on reproducing kernel Banach spaces, thus supplying statistical justification for large-margin classification in Banach spaces.
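
As background, the Hilbert-space setting that this margin analysis generalizes is the classical hard-margin SVM; the notation below ($\mathcal{H}$, feature map $\Phi$, $m$ training pairs) is generic and not taken from the letter:

\[
\min_{w\in\mathcal{H},\, b\in\mathbb{R}}\ \tfrac{1}{2}\|w\|_{\mathcal{H}}^{2}
\quad\text{subject to}\quad
y_i\bigl(\langle w,\Phi(x_i)\rangle_{\mathcal{H}}+b\bigr)\ge 1,\quad i=1,\dots,m.
\]

The geometric margin of the resulting hyperplane is $1/\|w\|_{\mathcal{H}}$, so minimizing the norm maximizes the margin, and margin error bounds control the misclassification probability in terms of that margin. The letter establishes an analogous bound when $\mathcal{H}$ is replaced by a reproducing kernel Banach space.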

Upload date

March 23, 2021

[Journal Article] Sharp exponential bounds for the Gaussian regularized Whittaker–Kotelnikov–Shannon sampling series

Journal of Approximation Theory, 2019, 245: 73–82

September 1, 2019

Abstract

Fast reconstruction of a bandlimited function from its finite oversampling data has been a fundamental problem in sampling theory. As the number of sample data increases to infinity, exponentially-decaying reconstruction errors can be achieved by many methods in the literature. In fact, it is generally conjectured that when the optimal method is used, the dominant term in the error of reconstructing a function bandlimited to () from its data sampled at the integer points on is . By far, the best estimate for the constant among regularization methods is and is achieved by the highly efficient Gaussian regularized Whittaker–Kotelnikov–Shannon sampling series. We prove in this paper that the exponential constant is optimal for this method. Moreover, the optimal variance of the Gaussian regularizer is provided.

Keywords: Bandlimited functions; the Paley–Wiener space; sampling theorems; Gaussian regularization; error bounds
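
As a concrete illustration of the series itself, a minimal Python sketch follows: the truncated Shannon series is damped by a Gaussian factor. The bandlimit, truncation length, and variance chosen below are arbitrary illustrative values, not the optimal variance derived in the paper.

import numpy as np

def gauss_reg_wks(samples, n, t, sigma):
    # Gaussian regularized Whittaker-Kotelnikov-Shannon series:
    #   sum_{|k| <= n} f(k) * sinc(t - k) * exp(-(t - k)^2 / (2 * sigma^2)),
    # where np.sinc(x) = sin(pi * x) / (pi * x).
    k = np.arange(-n, n + 1)
    return np.sum(samples * np.sinc(t - k) * np.exp(-(t - k) ** 2 / (2.0 * sigma ** 2)))

def f(t):
    # Test function bandlimited to [-0.5*pi, 0.5*pi], so it is oversampled at the integers.
    return np.sinc(0.5 * t)

n = 20                                                    # samples on each side of the origin
samples = f(np.arange(-n, n + 1).astype(float))           # f(-n), ..., f(n)
t = 0.3                                                   # point near the center of the window
approx = gauss_reg_wks(samples, n, t, sigma=np.sqrt(n))   # sigma ~ sqrt(n): illustrative choice
print(abs(approx - f(t)))                                 # reconstruction error at t

The error at points near the center of the sampling window decays exponentially as n grows; the sharp constant in that exponential rate for this method is what the paper pins down.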

Upload date

March 23, 2021

[Journal Article] Reproducing Kernel Banach Spaces for Machine Learning

The Journal of Machine Learning Research, 2009, 10: 2741–2775

2009

Abstract

We introduce the notion of reproducing kernel Banach spaces (RKBS) and study special semi-inner-product RKBS by making use of semi-inner-products and the duality mapping. Properties of an RKBS and its reproducing kernel are investigated. As applications, we develop in the framework of RKBS standard learning schemes including minimal norm interpolation, regularization network, support vector machines, and kernel principal component analysis. In particular, existence, uniqueness and representer theorems are established.
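
The learning schemes mentioned above follow the familiar regularization pattern; written schematically (with generic notation, not the paper's), one seeks

\[
\min_{f\in\mathcal{B}}\ \frac{1}{m}\sum_{i=1}^{m} L\bigl(f(x_i),y_i\bigr)+\lambda\,\|f\|_{\mathcal{B}}^{2},
\]

where $\mathcal{B}$ is the hypothesis space and $L$ a loss function. In a reproducing kernel Hilbert space the representer theorem gives a minimizer of the form $f=\sum_{j=1}^{m} c_j K(x_j,\cdot)$; the paper establishes analogous representer theorems when $\mathcal{B}$ is a semi-inner-product RKBS, with the semi-inner-product and duality mapping playing the role of the inner product.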

Upload date

March 23, 2021

[Journal Article] Existence of the Bedrosian identity for Fourier multiplier operators

Forum Mathematicum, 2016, 28(4): 749–759

2016

Abstract

The Hilbert transform $H$ satisfies the Bedrosian identity $H(fg)=fHg$ whenever the supports of the Fourier transforms of $f,g\in L^{2}(\mathbb{R})$ are respectively contained in $A=[-a,b]$ and $B=\mathbb{R}\setminus(-b,a)$, where $0\le a,b\le+\infty$. Attracted by this interesting result arising from time-frequency analysis, we investigate the existence of such an identity for a general bounded Fourier multiplier operator on $L^{2}(\mathbb{R}^{d})$ and for general support sets $A$ and $B$. A geometric characterization of the support sets for the existence of the Bedrosian identity is established. Moreover, the support sets for the partial Hilbert transforms are all found. In particular, for the Hilbert transform to satisfy the Bedrosian identity, the support sets must be given as above.
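
For context, the operators involved can be written out explicitly; these are standard definitions, with notation chosen here for illustration:

\[
\widehat{Hf}(\xi)=-i\,\operatorname{sgn}(\xi)\,\hat{f}(\xi),
\qquad
\widehat{T_{m}f}(\xi)=m(\xi)\,\hat{f}(\xi),
\]

so the Hilbert transform is the Fourier multiplier operator with symbol $m(\xi)=-i\,\operatorname{sgn}(\xi)$. The Bedrosian identity $H(fg)=fHg$ asks when the multiplier acts on a product as though only the second factor were transformed; the paper characterizes, for a general bounded multiplier $T_{m}$ on $L^{2}(\mathbb{R}^{d})$, the pairs of support sets $A$ and $B$ of $\hat{f}$ and $\hat{g}$ for which $T_{m}(fg)=fT_{m}g$ holds.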

Upload date

March 23, 2021

[Journal Article] Refinable Kernels

Journal of Machine Learning Research, 2007, 8(71): 2083–2120

2007

Abstract

Motivated by mathematical learning from training data, we introduce the notion of refinable kernels. Various characterizations of refinable kernels are presented. The concept of refinable kernels leads to the introduction of wavelet-like reproducing kernels. We also investigate a refinable kernel that forms a Riesz basis. In particular, we characterize refinable translation invariant kernels, and refinable kernels defined by refinable functions. This study leads to multiresolution analysis of reproducing kernel Hilbert spaces.
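
For background, the refinable functions mentioned above are those satisfying the classical two-scale refinement equation; the example below is standard wavelet-theory material, not the paper's kernel-level definition of refinability:

\[
\varphi(x)=\sum_{k\in\mathbb{Z}} c_k\,\varphi(2x-k),
\]

e.g. the hat function supported on $[0,2]$ satisfies $\varphi(x)=\tfrac{1}{2}\varphi(2x)+\varphi(2x-1)+\tfrac{1}{2}\varphi(2x-2)$. Kernels built from such functions inherit this two-scale structure, which underlies the wavelet-like reproducing kernels and the multiresolution analysis of reproducing kernel Hilbert spaces described in the abstract.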

Collaborating Scholars

  • No co-authors yet