Improving the Activation-Function Pair in Dual Neural Networks
First published: 2018-03-29
Abstract: Compared with traditional numerical integration methods, the dual-neural-network method for computing multiple definite integrals has the advantage of yielding an approximate antiderivative of the integrand. In practice, however, its training accuracy and efficiency still fall short of expectations. To address this, this paper constructs a new activation-function pair for the dual neural network, introducing sigmoid()/softplus() as the new pair. Simulation examples show that, compared with a dual neural network using the traditional activation-function pair, the network with the sigmoid()/softplus() pair achieves higher accuracy and converges faster.
Keywords: dual neural network; multiple definite integral; sigmoid()/softplus(); activation-function pair
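The property underlying the sigmoid()/softplus() pair is that softplus is an exact antiderivative of sigmoid: d/dx log(1 + e^x) = 1/(1 + e^-x). The sketch below illustrates the idea, not the paper's training procedure: the hidden-layer weights are fixed at random values and only the output weights are fitted by least squares, and the integrand cos(x) on [0, 1] is an assumed example. Once a sigmoid-activated network fits the integrand, the dual network with the same weights but softplus activations gives an antiderivative, and the definite integral falls out by evaluating it at the endpoints.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softplus(z):
    # numerically stable log(1 + e^z); note d/dz softplus(z) = sigmoid(z)
    return np.logaddexp(0.0, z)

rng = np.random.default_rng(0)
n_hidden = 50                        # assumed network size, not from the paper
a, b = 0.0, 1.0                      # integration interval (assumed example)

# Fixed random hidden-layer weights; fitting only the output layer replaces
# the paper's actual training procedure, purely for illustration.
w = rng.uniform(1.0, 5.0, n_hidden) * rng.choice([-1.0, 1.0], n_hidden)
c = rng.uniform(-2.0, 2.0, n_hidden)

x = np.linspace(a, b, 200)
f = np.cos(x)                        # example integrand; exact integral is sin(1)

# Primal network f(x) ~ sum_j v_j * sigmoid(w_j * x + c_j):
# solve for the output weights v by linear least squares.
Phi = sigmoid(np.outer(x, w) + c)
v, *_ = np.linalg.lstsq(Phi, f, rcond=None)

# Dual network: since the antiderivative of sigmoid(w*x + c) is
# softplus(w*x + c) / w, an antiderivative of the fitted network is
# F(x) = sum_j (v_j / w_j) * softplus(w_j * x + c_j).
F = lambda t: float(np.sum((v / w) * softplus(w * t + c)))

estimate = F(b) - F(a)
print(estimate)                      # close to sin(1) = 0.84147...
```

Because the derivative relationship between the two activations is exact, the error of the integral estimate is governed only by how well the sigmoid network fits the integrand; no quadrature rule is involved.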