曾晓勤, Yingfeng Wang
LETTER, communicated by Andries P. Engelbrecht. Neural Computation 18, 2854-2877 (2006)
The sensitivity of a neural network’s output to its input and weight perturbations is an important measure for evaluating the network’s performance. In this letter, we propose an approach to quantify the sensitivity of Madalines. The sensitivity is defined as the probability of output deviation due to input and weight perturbations with respect to overall input patterns. Based on the structural characteristics of Madalines, a bottomup strategy is followed, along which the sensitivity of single neurons, that is, Adalines, is considered first and then the sensitivity of the entire Madaline network. By means of probability theory, an analytical formula is derived for the calculation of Adalines’ sensitivity, and an algorithm is designed for the computation of Madalines’ sensitivity. Computer simulations are run to verify the effectiveness of the formula and algorithm. The simulation results are in good agreement with the theoretical results.
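The sensitivity defined above can also be estimated empirically, which is essentially what the letter's verification simulations do. As an illustrative sketch only (the letter derives an analytical formula; the perturbation model here, component flips plus additive weight noise, and all parameter names are assumptions), a Monte Carlo estimate for a single Adaline could look like this:

```python
import numpy as np

def adaline_sensitivity(weights, n_patterns=10000, input_flip_prob=0.05,
                        weight_noise=0.05, rng=None):
    """Monte Carlo estimate of an Adaline's sensitivity: the probability
    that its output deviates under input and weight perturbations,
    averaged over random bipolar input patterns."""
    rng = np.random.default_rng(rng)
    n = weights.shape[0]
    # Sample bipolar input patterns uniformly from {-1, +1}^n.
    X = rng.choice([-1.0, 1.0], size=(n_patterns, n))
    # Perturb inputs by flipping each component with a small probability.
    flips = rng.random((n_patterns, n)) < input_flip_prob
    X_pert = np.where(flips, -X, X)
    # Perturb weights with additive Gaussian noise (one draw per pattern).
    W_pert = weights + weight_noise * rng.standard_normal((n_patterns, n))
    # An output deviation is a change in the sign of the weighted sum.
    y = np.sign(X @ weights)
    y_pert = np.sign(np.sum(X_pert * W_pert, axis=1))
    return float(np.mean(y != y_pert))
```

The Madaline-level computation in the letter then combines such per-Adaline sensitivities layer by layer, following the bottom-up strategy the abstract describes.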
曾晓勤, Xiaoqin Zeng
IEEE Transactions on Neural Networks, Vol. 17, No. 2, March 2006
Adaline,
[Journal Article] A sensitivity-based approach for pruning architecture of Madalines
曾晓勤, Xiaoqin Zeng, Jing Shao, Yingfeng Wang, Shuiming Zhong
Neural Comput & Applic (2009) 18: 957-965
Architecture design is a very important issue in neural network research. One popular way to find the proper size of a network is to prune an oversized trained network to a smaller one while keeping the established performance. This paper presents a sensitivity-based approach to pruning hidden Adalines from a Madaline while causing as little performance loss as possible, so that the loss is easy to compensate for. The approach is novel in setting up a relevance measure, by means of an Adaline's sensitivity measure, to locate the least relevant Adaline in a Madaline. The sensitivity measure is the probability of an Adaline's output inversions due to input variation with respect to overall input patterns, and the relevance measure is defined as the product of the Adaline's sensitivity value and the sum of the absolute values of the Adaline's outgoing weights. Based on the relevance measure, a pruning algorithm can be simply programmed, which iteratively prunes the Adaline with the least relevance value from the hidden layer of a given Madaline and then conducts some compensation, until no more Adalines can be removed under a given performance requirement. The effectiveness of the pruning approach is verified by experimental results.
Adaline, Madaline, Architecture pruning, Sensitivity measure, Relevance measure
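The core of the pruning loop is ranking hidden Adalines by the relevance measure the abstract defines: relevance = sensitivity × sum of the absolute outgoing weights. A minimal sketch of that ranking step (the function name and argument layout are assumptions; the paper's full algorithm additionally compensates the remaining network after each removal):

```python
import numpy as np

def least_relevant_adaline(sensitivities, outgoing_weights):
    """Rank hidden Adalines by relevance_i = sensitivity_i * sum_j |w_out[i][j]|
    and return the index of the least relevant one (the pruning candidate).
    `sensitivities` holds one value per hidden Adaline; `outgoing_weights[i]`
    holds Adaline i's weights into the next layer."""
    sensitivities = np.asarray(sensitivities, dtype=float)
    # Sum of absolute outgoing weights, per hidden Adaline.
    outgoing = np.abs(np.asarray(outgoing_weights, dtype=float)).sum(axis=1)
    relevance = sensitivities * outgoing
    return int(np.argmin(relevance)), relevance
```

For example, an Adaline with high sensitivity but tiny outgoing weights (so its inversions barely reach the next layer) can end up as the pruning candidate ahead of a low-sensitivity unit with large outgoing weights.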
曾晓勤, Xiaoqin Zeng
LETTER, communicated by Terrence Sejnowski. Neural Computation 15, 183-212 (2003)
The sensitivity of a neural network’s output to its input perturbation is an important issue with both theoretical and practical values. In this article, we propose an approach to quantify the sensitivity of the most popular and general feedforward network: the multilayer perceptron (MLP). The sensitivity measure is defined as the mathematical expectation of output deviation due to expected input deviation with respect to overall input patterns in a continuous interval. Based on the structural characteristics of the MLP, a bottom-up approach is adopted. A single neuron is considered first, and algorithms with approximately derived analytical expressions that are functions of expected input deviation are given for the computation of its sensitivity. Then another algorithm is given to compute the sensitivity of the entire MLP network. Computer simulations are used to verify the derived theoretical formulas. The agreement between theoretical and experimental results is quite good. The sensitivity measure can be used to evaluate the MLP’s performance.
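The simulation side of this measure is straightforward to reproduce: average the output deviation over many input patterns drawn from a continuous interval, each perturbed by a fixed expected deviation. A minimal sketch under those assumptions (sigmoid activations, a uniform input interval, and all names are illustrative choices, not the article's analytical algorithms):

```python
import numpy as np

def mlp_forward(x, layers):
    """Forward pass of a simple MLP with sigmoid activations.
    `layers` is a list of (W, b) pairs, one per layer."""
    for W, b in layers:
        x = 1.0 / (1.0 + np.exp(-(x @ W + b)))
    return x

def mlp_sensitivity(layers, n_inputs, input_dev=0.01, n_patterns=5000, rng=None):
    """Monte Carlo estimate of the MLP sensitivity measure: the expected
    magnitude of output deviation when each input component is perturbed
    by an expected deviation `input_dev`, averaged over input patterns
    drawn from a continuous interval."""
    rng = np.random.default_rng(rng)
    # Input patterns from a continuous interval, here [-1, 1]^n.
    X = rng.uniform(-1.0, 1.0, size=(n_patterns, n_inputs))
    # Perturbations of fixed magnitude and random sign per component.
    dX = input_dev * rng.choice([-1.0, 1.0], size=X.shape)
    y = mlp_forward(X, layers)
    y_pert = mlp_forward(X + dX, layers)
    return float(np.mean(np.abs(y_pert - y)))
```

The article's contribution is replacing this sampling with approximate analytical expressions computed bottom-up, neuron by neuron and then layer by layer; a simulation like the above serves only as the verification baseline.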