神經(jīng)網(wǎng)絡理論及應用(英文版)
定 價:198 元
- 作者:范欽偉
- 出版時間:2025/6/1
- ISBN:9787030778574
- 出 版 社:科學出版社
- 中圖法分類:TP183
- 頁碼:215
- 紙張:
- Edition: 1
- Format: B5
本書闡述了基于群組及Lp正則項稀疏神經(jīng)元反應的多種神經(jīng)網(wǎng)絡建模的思想,介紹了以大腦神經(jīng)元反應能耗為約束的多目標稀疏優(yōu)化模型,并針對傳統(tǒng)前饋神經(jīng)網(wǎng)絡、Pi-Sigma神經(jīng)網(wǎng)絡,Sigma-Pi-Sigma 神經(jīng)網(wǎng)絡、遞歸神經(jīng)網(wǎng)絡、嶺多項式神經(jīng)網(wǎng)絡、Elman 神經(jīng)網(wǎng)絡等給出了相關的稀疏優(yōu)化方案,并針對每種具體的神經(jīng)網(wǎng)絡模型給出權值的有界性、算法的強、弱收斂性分析,同時通過大量的回歸、分類實例驗證了算法理論的正確性及算法的有效性。
更多科學出版社服務,請掃碼獲取。
主要從事神經(jīng)網(wǎng)絡結構稀疏化及算法收斂性研究。在《Neural Networks》等著名雜志發(fā)表論文10余篇。
Contents
“博士后文庫”序言
Preface
Chapter 1 Batch gradient method with smoothing L1/2 regularization for training of feedforward neural network 1
1.1 Introduction 1
1.2 Batch gradient method with smoothing L1/2 regularization 3
1.2.1 Batch gradient method with L1/2 regularization 3
1.2.2 Smoothing L1/2 regularization 5
1.3 Convergence results 6
1.4 Numerical examples 7
1.4.1 XOR problem and Parity problem 7
1.4.2 Sonar problem 10
1.5 Conclusions 10
1.6 Appendix 11
Chapter 2 Convergence of online gradient method for feedforward neural network with smoothing L1/2 regularization penalty 17
2.1 Introduction 17
2.2 Algorithm description 19
2.2.1 Online gradient method with L2 regularization 20
2.2.2 Online gradient method with L1/2 regularization 21
2.2.3 Online gradient method with smoothing L1/2 regularization 22
2.3 Convergence results 24
2.4 Numerical examples 24
2.4.1 Example 1: Parity problem 24
2.4.2 Example 2: function regression problem 26
2.5 Conclusions 28
2.6 Appendix 28
Chapter 3 Deterministic convergence analysis via smoothing group lasso regularization and adaptive momentum for Sigma-Pi-Sigma neural network 41
3.1 Introduction 41
3.2 Network structure and learning algorithm 44
3.2.1 Batch gradient algorithm with group lasso regularization and adaptive momentum 44
3.2.2 Batch gradient algorithm with smoothing group lasso regularization and adaptive momentum 47
3.3 Convergence results 48
3.4 Simulation results 49
3.4.1 Function approximation problems 49
3.4.2 K-dimensional Parity problem 52
3.4.3 Classification problems 54
3.5 Conclusions 56
3.6 Appendix 56
Chapter 4 Convergence analysis for Sigma-Pi-Sigma neural network based on some relaxed conditions 70
4.1 Introduction 70
4.2 Sigma-Pi-Sigma neural network and batch gradient algorithm 73
4.2.1 Sigma-Pi-Sigma neural network 73
4.2.2 Batch gradient algorithm for Sigma-Pi-Sigma neural network with regularization 74
4.3 Main results 75
4.4 Simulation results 77
4.4.1 Function approximation problems 77
4.4.2 Parity problems 79
4.4.3 Classification problems 79
4.5 Conclusions 82
4.6 Appendix 82
Chapter 5 Recurrent neural networks with smoothing regularization for regression and multiclass classification problems 95
5.1 Introduction 95
5.2 The model structure of RNN 97
5.3 Gradient learning method in RNN with smoothing L1/2 regularization 100
5.4 Numerical simulations 103
5.4.1 XOR problems 103
5.4.2 Function approximation problems 104
5.4.3 Classification problems 105
5.5 Conclusions 107
Chapter 6 Weak and strong convergence analysis of Elman neural network via weight decay regularization 108
6.1 Introduction 108
6.2 Algorithm description 110
6.2.1 Batch gradient algorithm for Elman neural network 112
6.2.2 Batch gradient algorithm for Elman neural network with L2 regularization 113
6.2.3 Introduction of related definitions 114
6.3 Convergence results 115
6.4 Numerical examples 116
6.4.1 Function approximation 117
6.4.2 Real-world classification 118
6.4.3 XOR with two-cycle delay problem 120
6.5 Conclusions 122
6.6 Appendix 122
Chapter 7 Convergence of a gradient-based learning algorithm with penalty for ridge polynomial neural network 131
7.1 Introduction 131
7.2 Network structure description 133
7.2.1 Pi-Sigma neural network 133
7.2.2 Ridge polynomial neural network 134
7.3 Batch gradient learning algorithm for RPNN 135
7.3.1 The original group lasso regularization algorithm 135
7.3.2 The smoothing group lasso regularization algorithm 137
7.4 Theorems of monotonicity and convergence 138
7.5 Numerical examples 139
7.5.1 Example 1: function approximation problem 139
7.5.2 Example 2: Parity problem 141
7.6 Conclusions 143
7.7 Appendix 143
Chapter 8 Regression and multiclass classification using sparse extreme learning machine via smoothing group L1/2 regularizer 152
8.1 Introduction 152
8.2 The preliminary ELM 154
8.3 Algorithm description 156
8.3.1 Group L1/2 ELM for hidden nodes and output weights 156
8.3.2 Smoothing group L1/2 ELM for hidden nodes and output weights 157
8.4 Experimental results 159
8.4.1 Experimental setup 159
8.4.2 Benchmarking with regression problems 161
8.4.3 Benchmarking with classification problems 166
8.5 Conclusions 168
Chapter 9 Smooth L0 regularization for extreme learning machine 170
9.1 Introduction 170
9.2 Extreme learning machine 172
9.3 Extreme learning machine with L0 regularization 174
9.4 Description of sparsity 175
9.5 Simulation results 177
9.5.1 Function regression problems 177
9.5.2 Real-world classification problems 181
9.6 Conclusions 183
Chapter 10 A hybrid model of extreme learning machine based on bat and cuckoo search algorithm for regression and multiclass classification 184
10.1 Introduction 184
10.2 The preliminary of ELM 186
10.3 Algorithm description 188
10.3.1 Bat algorithm 188
10.3.2 Cuckoo search algorithm 189
10.3.3 Bat cuckoo hybrid algorithm 190
10.4 Hybrid algorithm of extreme learning machine based on bat cuckoo algorithm 191
10.5 Experimental results 194
10.5.1 Function fitting 194
10.5.2 Classification problems 196
10.6 Conclusions 199
References 200
編后記 216