10 results found for this scholar

Uploaded: 2006-01-26

[Journal Article] A New Training Algorithm for a Fuzzy Perceptron and its Convergence

Jie Yang, Wei Wu, and Zhiqiong Shao


Abstract

In this paper, we present a new training algorithm for a fuzzy perceptron. In the case where the dimension of the input vectors is two and the training examples are separable, we prove finite convergence, i.e., the training procedure for the network weights stops after finitely many steps. When the dimension is greater than two, stronger conditions are needed to guarantee finite convergence.
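The fuzzy update rule itself is not reproduced in this abstract. As an illustrative sketch only, the classical crisp perceptron exhibits the same finite-convergence behaviour on separable two-dimensional data; the function and data below are ours, not the paper's:

```python
def train_perceptron(examples, max_epochs=100):
    """Classical perceptron training on linearly separable data.

    Each example is (x, y) with x a tuple of floats and y in {-1, +1}.
    On separable data the loop stops after finitely many updates
    (the classical perceptron convergence theorem).
    """
    dim = len(examples[0][0])
    w = [0.0] * dim
    b = 0.0
    for _ in range(max_epochs):
        errors = 0
        for x, y in examples:
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * activation <= 0:          # misclassified: update
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
                errors += 1
        if errors == 0:                      # a full error-free pass: done
            break
    return w, b

# Two-dimensional separable data, matching the dim-2 case of the theorem
data = [((0.0, 1.0), 1), ((1.0, 2.0), 1), ((0.0, -1.0), -1), ((-1.0, -2.0), -1)]
w, b = train_perceptron(data)
```

On this data the loop terminates after two passes with every example correctly classified.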

Uploaded: 2006-01-26

[Journal Article] Deterministic Convergence of an Online Gradient Method for BP Neural Networks

Wei Wu, Guorui Feng, Zhengxue Li, and Yuesheng Xu

IEEE Transactions on Neural Networks, Vol. 16, No. 3, May 2005


Abstract

Online gradient methods are widely used for training feedforward neural networks. We prove in this paper a convergence theorem for an online gradient method with variable step size for backward propagation (BP) neural networks with a hidden layer. Unlike most of the convergence results that are of probabilistic and nonmonotone nature, the convergence result that we establish here has a deterministic and monotone nature.
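As a rough sketch of the setting (not the paper's exact scheme or proof conditions): a one-hidden-layer network trained by per-example gradient steps with a variable, diminishing step size. All function names, constants, and the toy data below are illustrative assumptions:

```python
import math

def loss(samples, V, W):
    """Sum of squared errors of the 1-hidden-layer tanh network."""
    total = 0.0
    for x, y in samples:
        h = [math.tanh(v[0] * x[0] + v[1] * x[1]) for v in V]
        out = sum(wi * hi for wi, hi in zip(W, h))
        total += 0.5 * (out - y) ** 2
    return total

def train_online(samples, hidden=4, epochs=200):
    """Online BP: one gradient step per example, diminishing step size."""
    # Deterministic small initial weights (for reproducibility)
    V = [[0.05 * (i + 1), 0.1 * (i + 1)] for i in range(hidden)]  # input -> hidden
    W = [0.1 * (i + 1) for i in range(hidden)]                    # hidden -> output
    k = 0
    for _ in range(epochs):
        for x, y in samples:
            eta = 0.2 / (1.0 + 0.01 * k)   # variable (diminishing) step size
            h = [math.tanh(v[0] * x[0] + v[1] * x[1]) for v in V]
            out = sum(wi * hi for wi, hi in zip(W, h))
            err = out - y                  # d/d(out) of 0.5*(out-y)**2
            for i in range(hidden):
                dh = err * W[i] * (1.0 - h[i] ** 2)   # backprop through tanh
                W[i] -= eta * err * h[i]
                V[i][0] -= eta * dh * x[0]
                V[i][1] -= eta * dh * x[1]
            k += 1
    return V, W

# Fit y = x0 + x1 on a few points
samples = [((0.0, 0.0), 0.0), ((1.0, 0.0), 1.0), ((0.0, 1.0), 1.0), ((0.5, 0.5), 1.0)]
V0 = [[0.05 * (i + 1), 0.1 * (i + 1)] for i in range(4)]
W0 = [0.1 * (i + 1) for i in range(4)]
V, W = train_online(samples)
```

Because the initial weights and the visiting order are fixed, the run is deterministic, loosely echoing the deterministic (rather than probabilistic) flavour of the convergence result.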

Uploaded: 2006-01-26

[Journal Article] Convergence of an Online Gradient Method for BP Neural Networks with Stochastic Inputs

Zhengxue Li, Wei Wu, Guorui Feng, and Huifang Lu


Abstract

An online gradient method for BP neural networks is presented and discussed. The input training examples are permuted stochastically in each cycle of iteration. A monotonicity result and a weak convergence result of deterministic nature are proved for the method.
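The stochastic-input ingredient, a fresh random permutation of the training set at the start of each cycle, can be sketched as follows (function and parameter names are ours, for illustration only):

```python
import random

def shuffled_cycles(n_examples, n_cycles, seed=0):
    """Yield the index order for online training where the training set
    is permuted stochastically at the start of each cycle (epoch)."""
    rng = random.Random(seed)
    for _ in range(n_cycles):
        order = list(range(n_examples))
        rng.shuffle(order)   # fresh random permutation each cycle
        yield order

orders = list(shuffled_cycles(5, 3))
```

Within each cycle every example is still visited exactly once, which is what distinguishes this setting from sampling examples independently at random.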

Uploaded: 2006-01-26

[Journal Article] Bifurcation from local steady-states to global dynamics

Wei Wu, Zhengxue Li


Abstract

A survey is given of some recent developments on bifurcation from local steady-states to global dynamics governed by nonlinear ODEs with Z2 or O(2) symmetries. In particular, we are mainly concerned with a double singular point, which is a Z2-symmetric steady-state possessing a Jacobian with two zero eigenvalues. There exist, under suitable conditions, Hopf points and heteroclinic cycles bifurcating from the double singular point. We also consider a triple zero point with an O(2) symmetry, from which bifurcate standing waves, rotating waves and modulated rotating waves.

Bifurcations, local steady-states, global dynamics, Hopf points, heteroclinic points, Z2 and O(2) symmetries, rotating waves
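For orientation, the Z2-equivariance condition and the double singular point can be written as follows (our notation, not necessarily the paper's):

```latex
% Z_2-equivariance of the vector field (S an involution, S^2 = I):
\dot{x} = f(x,\lambda), \qquad f(Sx,\lambda) = S\,f(x,\lambda).
% A double singular point (x_0, \lambda_0) is a symmetric steady state,
%   f(x_0,\lambda_0) = 0, \quad S x_0 = x_0,
% whose Jacobian D_x f(x_0,\lambda_0) has a zero eigenvalue of
% algebraic multiplicity two.
```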

Uploaded: 2006-01-26

[Journal Article] Convergence of an online gradient method for feedforward neural networks with stochastic inputs

Zhengxue Li, Wei Wu, Yulong Tian


Abstract

In this paper, we study the convergence of an online gradient method for feedforward neural networks. The input training examples are permuted stochastically in each cycle of iteration. A monotonicity result and a weak convergence result of deterministic nature are proved.

Feedforward neural networks, Online gradient method, Convergence, Stochastic inputs

Collaborating Scholars

  • 吴微 (Wei Wu)

    Dalian University of Technology, Liaoning