
Wei Wu (吴微)


Journal Article

Deterministic Convergence of an Online Gradient Method for BP Neural Networks

Wei Wu, Guorui Feng, Zhengxue Li, and Yuesheng Xu

IEEE Transactions on Neural Networks, Vol. 16, No. 3, May 2005


Abstract

Online gradient methods are widely used for training feedforward neural networks. We prove in this paper a convergence theorem for an online gradient method with variable step size for backward propagation (BP) neural networks with a hidden layer. Unlike most of the convergence results that are of probabilistic and nonmonotone nature, the convergence result that we establish here has a deterministic and monotone nature.


