13 results found for this scholar

Upload date: March 23, 2021

[Journal Article] Refinement of Operator-valued Reproducing Kernels

Journal of Machine Learning Research, 13(4): 91–136

Abstract

This paper studies the construction of a refinement kernel for a given operator-valued reproducing kernel such that the vector-valued reproducing kernel Hilbert space of the refinement kernel contains that of the given kernel as a subspace. The study is motivated by the need to update the current operator-valued reproducing kernel in multi-task learning when underfitting or overfitting occurs. Numerical simulations confirm that the established refinement kernel method is able to meet this need. Various characterizations are provided based on feature maps and vector-valued integral representations of operator-valued reproducing kernels. Concrete examples of refining translation-invariant and finite Hilbert–Schmidt operator-valued reproducing kernels are provided. Other examples include refinement of the Hessian of scalar-valued translation-invariant kernels and of transformation kernels. Existence and properties of operator-valued reproducing kernels preserved during the refinement process are also investigated.
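
As a quick reference for the notion described above, the refinement relation can be written as follows; the norm condition in parentheses is an assumption carried over from the scalar-valued refinement-kernel literature and may differ in detail from the paper's precise definition:

\[
K' \text{ refines } K \quad\Longleftrightarrow\quad \mathcal{H}_{K} \subseteq \mathcal{H}_{K'} \quad \bigl(\text{with } \|f\|_{\mathcal{H}_{K'}} = \|f\|_{\mathcal{H}_{K}} \text{ for all } f \in \mathcal{H}_{K}\bigr),
\]

where \(\mathcal{H}_{K}\) and \(\mathcal{H}_{K'}\) denote the vector-valued reproducing kernel Hilbert spaces of \(K\) and \(K'\), respectively.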


Upload date: March 23, 2021

[Journal Article] Vector-valued reproducing kernel Banach spaces with applications to multi-task learning

Journal of Complexity, 2013, 29(2): 195–215

Published: April 1, 2013

Abstract

Motivated by multi-task machine learning with Banach spaces, we propose the notion of vector-valued reproducing kernel Banach spaces (RKBSs). Basic properties of the spaces and the associated reproducing kernels are investigated. We also present feature map constructions and several concrete examples of vector-valued RKBSs. The theory is then applied to multi-task machine learning. In particular, the representer theorem and characterization equations for the minimizer of regularized learning schemes in vector-valued RKBSs are established.

Keywords: Vector-valued reproducing kernel Banach spaces; Feature maps; Regularized learning; The representer theorem; Characterization equations
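
For orientation, the regularized learning schemes mentioned in the abstract generically take the following form over a vector-valued RKBS \(\mathcal{B}\); the concrete loss \(L\), the regularizer \(\phi\), and the resulting representer theorem and characterization equations are the subject of the paper and are not reproduced here:

\[
\min_{f \in \mathcal{B}} \; \sum_{i=1}^{n} L\bigl(f(x_i), y_i\bigr) + \lambda\, \phi\bigl(\|f\|_{\mathcal{B}}\bigr), \qquad \lambda > 0,
\]

where \((x_i, y_i)\), \(1 \le i \le n\), are the training samples of the multi-task problem.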


Upload date: March 23, 2021

[Journal Article] Reproducing kernel Banach spaces with the ℓ¹ norm

Applied and Computational Harmonic Analysis, 2013, 34(1): 96–116

Published: January 1, 2013

Abstract

Targeting sparse learning, we construct Banach spaces of functions on an input space X with the following properties: (1) the space possesses an ℓ¹ norm in the sense that it is isometrically isomorphic to the Banach space of integrable functions on X with respect to the counting measure; (2) point evaluations are continuous linear functionals on the space and are representable through a bilinear form with a kernel function; and (3) regularized learning schemes on the space satisfy the linear representer theorem. Examples of kernel functions admissible for the construction of such spaces are given.

Keywords: Reproducing kernel Banach spaces; Sparse learning; Lasso; Basis pursuit; Regularization; The representer theorem; The Brownian bridge kernel; The exponential kernel
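
The linear representer theorem mentioned in the abstract expresses a regularized minimizer as a finite linear combination of kernel functions at the sample points, which pairs naturally with an ℓ¹ penalty to promote sparsity. The sketch below is a generic kernel-lasso illustration of that idea; the exponential kernel, the ISTA solver, and all parameter values are choices made for this example, not constructions taken from the paper.

```python
# Generic kernel-lasso sketch (illustrative only, not the paper's construction):
# by a linear representer theorem, a regularized minimizer can be written as a
# sparse combination f = sum_j c_j K(., x_j) over the sample points x_j.
import numpy as np

def exponential_kernel(x, y, gamma=5.0):
    """Exponential (Laplacian-type) kernel on the real line."""
    return np.exp(-gamma * np.abs(x[:, None] - y[None, :]))

def kernel_lasso_ista(K, y, lam=0.05, n_iter=2000):
    """Solve min_c 0.5*||K c - y||^2 + lam*||c||_1 by proximal gradient (ISTA)."""
    c = np.zeros(K.shape[1])
    step = 1.0 / np.linalg.norm(K, 2) ** 2        # 1/L, L = Lipschitz constant of the gradient
    for _ in range(n_iter):
        z = c - step * (K.T @ (K @ c - y))        # gradient step on the quadratic part
        c = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-thresholding
    return c

# Toy data: noisy samples of a smooth function on [0, 1].
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(x.size)
K = exponential_kernel(x, x)
c = kernel_lasso_ista(K, y)
print("nonzero coefficients:", np.count_nonzero(np.abs(c) > 1e-8), "of", c.size)
```

The ℓ¹ penalty drives most coefficients to exactly zero, so the learned function is supported on only a few kernel translates.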


Upload date: March 23, 2021

[Journal Article] Existence of the Bedrosian identity for Fourier multiplier operators

Forum Mathematicum, 28(4): 749–759

Abstract

The Hilbert transform $H$ satisfies the Bedrosian identity $H(fg) = fHg$ whenever the supports of the Fourier transforms of $f, g \in L^2(\mathbb{R})$ are respectively contained in $A = [-a, b]$ and $B = \mathbb{R} \setminus (-b, a)$, where $0 \le a, b \le +\infty$. Attracted by this interesting result arising from time-frequency analysis, we investigate the existence of such an identity for a general bounded Fourier multiplier operator on $L^2(\mathbb{R}^d)$ and for general support sets $A$ and $B$. A geometric characterization of the support sets for the existence of the Bedrosian identity is established. Moreover, the support sets for the partial Hilbert transforms are all found. In particular, for the Hilbert transform to satisfy the Bedrosian identity, the support sets must be given as above.
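
The classical identity recalled at the beginning of the abstract is easy to check numerically for band-limited discrete signals. The snippet below is an illustrative FFT-based sanity check of that special case only; it does not touch the paper's general Fourier multiplier setting.

```python
# Numerical check of the classical Bedrosian identity H(fg) = f * H(g)
# for a low-frequency f and a high-frequency g with disjoint spectral supports.
import numpy as np
from scipy.signal import hilbert   # returns the analytic signal x + i*H(x)

N = 1024
t = np.arange(N) / N
f = 1.0 + 0.5 * np.cos(2 * np.pi * 2 * t)    # spectrum contained in {0, +-2}
g = np.cos(2 * np.pi * 100 * t)              # spectrum contained in {+-100}

Hg  = np.imag(hilbert(g))                    # discrete Hilbert transform of g
Hfg = np.imag(hilbert(f * g))                # discrete Hilbert transform of the product

print(np.max(np.abs(Hfg - f * Hg)))          # ~1e-13: the identity holds to machine precision
```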


Upload date: March 23, 2021

[Journal Article] Optimal sampling points in reproducing kernel Hilbert spaces

Journal of Complexity, 2016, 34: 129–151

Published: June 1, 2016

Abstract

The recent development of compressed sensing seeks to extract information from as few samples as possible. In such applications, since the number of samples is restricted, one should deploy the sampling points wisely. We are motivated to study the optimal distribution of finitely many sampling points in reproducing kernel Hilbert spaces, the natural background function spaces for sampling. Formulation under the framework of optimal reconstruction yields a minimization problem. In the discrete measure case, we estimate the distance between the optimal subspace resulting from a general Karhunen–Loève transform and the kernel space to obtain another algorithm that is computationally favorable. Numerical experiments are then presented to illustrate the effectiveness of the algorithms in searching for optimal sampling points.

Keywords: Sampling points; Optimal distribution; Reproducing kernels; The Karhunen–Loève transform
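
The paper formulates the placement of sampling points as a minimization problem and derives, in the discrete measure case, a Karhunen–Loève-based algorithm. As a rough, generic illustration of what it means to deploy sampling points wisely in a reproducing kernel Hilbert space, the sketch below uses a standard greedy power-function criterion; it is a stand-in chosen for this note, not the algorithm proposed in the paper.

```python
# Greedy selection of sampling points in an RKHS by maximizing the power function,
# i.e. the worst-case pointwise error of kernel interpolation on the chosen points.
# Illustrative heuristic only; not the Karhunen-Loeve-based algorithm of the paper.
import numpy as np

def gaussian_kernel(x, y, sigma=0.2):
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * sigma ** 2))

def greedy_power_function_points(candidates, n_points, kernel):
    chosen = []
    diag = np.diag(kernel(candidates, candidates))
    for _ in range(n_points):
        if not chosen:
            power2 = diag.copy()
        else:
            X = candidates[chosen]
            Kxx = kernel(X, X)                    # Gram matrix of the chosen points
            kcx = kernel(candidates, X)           # cross-kernel of candidates vs. chosen
            # squared power function: K(x, x) - k(x)^T K_XX^{-1} k(x)
            power2 = diag - np.sum(kcx * np.linalg.solve(Kxx, kcx.T).T, axis=1)
        chosen.append(int(np.argmax(power2)))     # next point = largest remaining error
    return candidates[chosen]

grid = np.linspace(0.0, 1.0, 200)
points = greedy_power_function_points(grid, n_points=6, kernel=gaussian_kernel)
print(np.sort(points))
```

The selected points spread out over the interval because, once a point is chosen, the power function drops to zero there and stays small in its neighborhood.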


Collaborators

  • No collaborating authors yet