
Wu Wei


Journal Article

Deterministic Convergence of an Online Gradient Method for Neural Networks

Wei Wu and Yuesheng Xu


Abstract / Description

The online gradient method has been widely used as a learning algorithm for neural networks. We establish the deterministic convergence of the online gradient method for training a class of nonlinear feedforward neural networks when the training examples are linearly independent. The learning rate is chosen to be a constant throughout the training procedure. The monotonicity of the error function during the iteration is proved, and a criterion for choosing the learning rate is provided to guarantee convergence. Under conditions similar to those for classical gradient methods, an optimal convergence rate for the online gradient method is proved.
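As a minimal sketch of the setting described in the abstract, the following code trains a single sigmoid output unit with the online (per-example) gradient method at a constant learning rate, recording the total squared error after each pass. The architecture, loss, and all parameter names here are illustrative assumptions, not the paper's exact model or proof conditions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def online_gradient_train(X, y, eta=0.05, epochs=200, seed=0):
    """Online gradient descent for one sigmoid unit (illustrative sketch).

    X : (m, n) array of training inputs; the abstract's setting assumes
        the m examples are linearly independent.
    eta : constant learning rate, fixed for the whole training procedure.
    Returns the final weights and the per-epoch total squared error.
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=X.shape[1])
    errors = []
    for _ in range(epochs):
        for xi, yi in zip(X, y):      # update immediately after each example
            out = sigmoid(w @ xi)
            # gradient of 0.5*(out - yi)^2 w.r.t. w for a sigmoid unit
            grad = (out - yi) * out * (1.0 - out) * xi
            w -= eta * grad           # constant learning rate throughout
        preds = sigmoid(X @ w)
        errors.append(0.5 * np.sum((preds - y) ** 2))
    return w, errors

# Example with linearly independent training examples (rows of the identity):
X = np.eye(3)
y = np.array([1.0, 0.0, 1.0])
w, errs = online_gradient_train(X, y)
```

With a sufficiently small constant learning rate, the recorded error sequence decreases over training, which is the monotonicity property the paper establishes rigorously for its class of networks.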

[Disclaimer] All of the content below was uploaded by [Wu Wei] on [January 25, 2006 at 23:57:09]; copyright belongs to the original creator. This document represents only the author's own views and is unrelated to this website. The website remains neutral with respect to the statements and judgments contained herein and makes no express or implied guarantee of the accuracy, reliability, or completeness of the content. Readers should treat it as a reference only and assume full responsibility for its use.
