WASD neural network activated by bipolar sigmoid functions together with subsequent iterations
Acta Scientiarum Naturalium Universitatis SunYatseni, Vol. 55, Issue 4, Pages: 1-10 (2016)
Author affiliations:
1. School of Data and Computer Science, Sun Yat-sen University, Guangzhou 510006, Guangdong, China
2. Key Laboratory of Autonomous Systems and Networked Control, Ministry of Education, South China University of Technology, Guangzhou 510640, Guangdong, China
3. SYSU-CMU Shunde International Joint Research Institute, Foshan 528300, Guangdong, China
Published Online: 25 July 2016
ZHANG Yunong, XIAO Zhengli, DING Sitong, et al. WASD neural network activated by bipolar sigmoid functions together with subsequent iterations [J]. Acta Scientiarum Naturalium Universitatis SunYatseni, 2016, 55(4): 1-10.
Abstract: A weights-and-structure-determination (WASD) algorithm is proposed for the neural network activated by bipolar sigmoid functions together with subsequent iterations; the algorithm combines the Levenberg-Marquardt algorithm with the weights-direct-determination (WDD) method for neural-network training. Working in combination with the Neural Network Toolbox of MATLAB, the proposed algorithm aims to remedy common weaknesses of traditional artificial neural networks, such as long training time, difficulty in determining the network structure, and unsatisfactory learning and generalization performance. Moreover, the WASD algorithm offers good flexibility and operability. Taking the data fitting of nonlinear functions as an example, numerical experiments and comparison results illustrate the superiority of the WASD algorithm in determining the optimal number and the optimal weights of hidden neurons, and the resultant neural network shows excellent learning and generalization performance.
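The procedure summarized in the abstract can be sketched in a few lines. Below is a minimal Python/NumPy illustration, not the authors' code (the paper works with the MATLAB Neural Network Toolbox, and every name and parameter here is an illustrative assumption): the hidden neurons use the bipolar sigmoid activation, the output weights are obtained in a single step by the weights-direct-determination (pseudoinverse) method, and the hidden-layer size is grown while a validation error is monitored to pick the structure.

```python
import numpy as np

def bipolar_sigmoid(x):
    # Bipolar sigmoid activation, (1 - exp(-x)) / (1 + exp(-x)); range (-1, 1).
    return (1.0 - np.exp(-x)) / (1.0 + np.exp(-x))

def wdd_output_weights(H, y):
    # Weights-direct-determination: obtain the output weights in one step
    # with the Moore-Penrose pseudoinverse instead of iterative training.
    return np.linalg.pinv(H) @ y

def wasd_train(X_tr, y_tr, X_val, y_val, max_hidden=60, seed=0):
    # Structure determination (simplified assumption): grow the hidden layer
    # one neuron at a time; for each candidate size, fix random input-side
    # weights, compute output weights directly, and keep the structure with
    # the lowest validation error.
    rng = np.random.default_rng(seed)
    best = None
    for n_hidden in range(1, max_hidden + 1):
        W = rng.uniform(-1.0, 1.0, size=(X_tr.shape[1], n_hidden))
        b = rng.uniform(-1.0, 1.0, size=n_hidden)
        H_tr = bipolar_sigmoid(X_tr @ W + b)
        w_out = wdd_output_weights(H_tr, y_tr)
        H_val = bipolar_sigmoid(X_val @ W + b)
        val_mse = np.mean((H_val @ w_out - y_val) ** 2)
        if best is None or val_mse < best[0]:
            best = (val_mse, n_hidden, W, b, w_out)
    return best

# Example: data fitting of a nonlinear target function (illustrative only).
X = np.linspace(-2.0, 2.0, 400).reshape(-1, 1)
y = np.sin(3.0 * X[:, 0]) * np.exp(-X[:, 0] ** 2)
X_tr, y_tr, X_val, y_val = X[::2], y[::2], X[1::2], y[1::2]
val_mse, n_hidden, W, b, w_out = wasd_train(X_tr, y_tr, X_val, y_val)
print(f"hidden neurons: {n_hidden}, validation MSE: {val_mse:.2e}")
```

In the paper, the directly determined weights are further refined by the subsequent Levenberg-Marquardt iterations; in a Python setting one could, for example, pass all weights to scipy.optimize.least_squares with method='lm' for that refinement step.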