[1] Sun Feng, Gong Xiaoling, Zhang Bingjie, et al. An Efficient Generalized Single Hidden Layer Neural Networks Based on Conjugate Gradient Method[J]. Journal of Zhengzhou University (Engineering Science), 2018, 39(02): 28-32.

A Generalized Single Hidden Layer Neural Network Based on the Conjugate Gradient Method

Journal of Zhengzhou University (Engineering Science) [ISSN: 1671-6833 / CN: 41-1339/T]

Volume:
39
Issue:
2018(02)
Pages:
28-32
Publication Date:
2018-03-30

Article Information

Title:
An Efficient Generalized Single Hidden Layer Neural Networks Based on Conjugate Gradient Method
Authors:
Sun Feng, Gong Xiaoling, Zhang Bingjie, Liu Yusong, Wang Yanjiang
Document Code:
A
Abstract (translated from the Chinese):
The single hidden layer feedforward neural network is efficient and structurally simple. One of its typical learning algorithms is the Error Back Propagation (BP) algorithm, which is based on the steepest descent method and whose main drawback is slow learning. The Extreme Learning Machine (ELM) greatly improves the training speed of single hidden layer networks, but it requires many more hidden units to match the accuracy of a BP network, which inevitably makes the network structure redundant and lengthens testing time. Inspired by the USA (Upper-layer-Solution-Aware) algorithm, which combines ideas from ELM and the steepest descent method, this paper proposes a fast conjugate-gradient-based algorithm for single hidden layer neural networks and applies it to several data sets. Experimental results show that, for networks of the same structure, the proposed algorithm outperforms ELM and the USA algorithm.
Abstract:
The single hidden layer feedforward neural network is efficient and has a simple structure. The Error Back Propagation (BP) algorithm is one of its typical learning algorithms; its main shortcoming is slow learning speed, a consequence of the steepest descent method it is built on. The Extreme Learning Machine (ELM), which can greatly accelerate network training, was later put forward. However, ELM demands many more hidden neurons than the BP algorithm to reach comparable accuracy, which leads to a redundant network structure and longer testing time. Motivated by the USA (Upper-layer-Solution-Aware) algorithm, which combines the steepest descent method and ELM, in this paper we propose an algorithm based on the conjugate gradient method and train the network on different data sets. Simulation results show that our algorithm performs better than USA and ELM for networks of the same structure.
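The abstract describes training a single hidden layer network with the conjugate gradient method in place of steepest descent. The paper's exact algorithm is not given on this page; as a rough illustration only, the sketch below uses ELM-style random hidden weights and fits the output-layer weights by running a plain conjugate gradient loop on the regularized least-squares normal equations. All function names, the toy task, and the regularization constant are invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def hidden(X, Win, b):
    """Random sigmoid hidden layer (ELM-style: Win and b are not trained)."""
    return 1.0 / (1.0 + np.exp(-(X @ Win + b)))

def cg_solve(A, B, iters=200, tol=1e-12):
    """Conjugate gradient for A X = B, with A symmetric positive definite."""
    X = np.zeros_like(B)
    R = B - A @ X          # residual
    P = R.copy()           # search direction
    rs = np.sum(R * R)
    for _ in range(iters):
        AP = A @ P
        alpha = rs / np.sum(P * AP)   # exact step length along P
        X += alpha * P
        R -= alpha * AP
        rs_new = np.sum(R * R)
        if rs_new < tol:
            break
        P = R + (rs_new / rs) * P     # conjugate direction update
        rs = rs_new
    return X

# Toy regression task: learn y = sin(x) on [-3, 3].
X = np.linspace(-3, 3, 200).reshape(-1, 1)
T = np.sin(X)

n_hidden = 40
Win = rng.normal(size=(1, n_hidden))
b = rng.normal(size=n_hidden)
H = hidden(X, Win, b)

# Output weights: solve (H^T H + lam I) W = H^T T by conjugate gradient.
lam = 1e-4
W = cg_solve(H.T @ H + lam * np.eye(n_hidden), H.T @ T)
mse = np.mean((H @ W - T) ** 2)
```

Because the normal-equation matrix is symmetric positive definite, conjugate gradient converges in at most `n_hidden` iterations in exact arithmetic, which is the speed advantage over steepest descent that the abstract alludes to.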
Last Update: 2018-04-02