AdaBoost_Neural_Network-master
Category: Artificial Intelligence / Neural Networks / Deep Learning
Development tool: Python
File size: 7KB
Upload date: 2019-12-19 20:53:19
Uploader: 甩帅
Description: ensemble learning applied to neural networks, for time-series processing
(ensemble learning and deep learning)
File list:
DataExtraction (0, 2016-11-15)
DataExtraction\Extraction.py (1595, 2016-11-15)
License (1074, 2016-11-15)
MNIST_Analysis.py (2087, 2016-11-15)
NeuralNetwork (0, 2016-11-15)
NeuralNetwork\Activation.py (435, 2016-11-15)
NeuralNetwork\Adaboost.py (1835, 2016-11-15)
NeuralNetwork\Cost.py (152, 2016-11-15)
NeuralNetwork\Network.py (4166, 2016-11-15)
# AdaBoost_Neural_Network
Test the AdaBoost algorithm on multiple neural networks.
## Theory
To improve backpropagation, it is more effective to assign a stronger weight to the inputs that contribute the most information, in other words, to inputs for which the network currently computes a wrong output.
To do so, we develop an algorithm inspired by AdaBoost.
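The reweighting idea can be sketched as one round of the classic discrete AdaBoost update: samples the current network misclassifies gain weight, so the next network focuses on them. This is an illustrative sketch, not the repo's `NeuralNetwork/Adaboost.py`; the function and variable names are assumptions.

```python
import numpy as np

def adaboost_round(predict, X, y, w):
    """One AdaBoost round (hypothetical helper, not the repo's API).

    Given a trained classifier's predict function and the current sample
    distribution w, return the classifier's vote weight alpha and the
    reweighted, renormalized distribution for the next weak learner.
    """
    pred = predict(X)
    miss = (pred != y).astype(float)                   # 1 where misclassified
    eps = np.clip(np.dot(w, miss), 1e-10, 1 - 1e-10)   # weighted error rate
    alpha = 0.5 * np.log((1 - eps) / eps)              # classifier vote weight
    w = w * np.exp(alpha * np.where(miss == 1, 1.0, -1.0))
    return alpha, w / w.sum()                          # renormalize to sum to 1

# Toy demo: labels in {-1, +1}, a "weak learner" that always predicts +1.
X = np.arange(6)
y = np.array([1, 1, 1, 1, -1, -1])
w0 = np.full(6, 1 / 6)
alpha, w1 = adaboost_round(lambda X: np.ones(6), X, y, w0)
# The two misclassified samples now carry twice the weight of correct ones.
```

After the round, the misclassified samples' weights rise from 1/6 to 1/4 each while correct ones fall to 1/8, which is exactly the "stronger weight to inputs with a wrong computed output" described above.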
## Execution
```
python3.5 MNIST_Analysis.py
```
## Results
The current example uses three neural networks, each with 25 hidden nodes and 10 backpropagation iterations:
Training set: 54,831 / 60,000
Testing set: 9,091 / 10,000
The result is slightly less impressive than a single neural network trained for 30 iterations; however, the theory shows that with a large number of weak classifiers the training error should decrease exponentially.
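The exponential claim is the standard AdaBoost training-error bound. Assuming each weak learner achieves weighted error $\varepsilon_t = \tfrac{1}{2} - \gamma_t$ with edge $\gamma_t > 0$, the training error of the combined classifier $H$ over $m$ samples satisfies:

```latex
\[
\frac{1}{m}\sum_{i=1}^{m}\mathbf{1}\{H(x_i)\neq y_i\}
\;\le\; \prod_{t=1}^{T} 2\sqrt{\varepsilon_t(1-\varepsilon_t)}
\;=\; \prod_{t=1}^{T}\sqrt{1-4\gamma_t^2}
\;\le\; \exp\!\Big(-2\sum_{t=1}^{T}\gamma_t^2\Big),
\]
```

so as long as each network beats random guessing by some fixed margin, adding more weak classifiers drives the training error down exponentially in $T$.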
The second test uses 5 identical neural networks:
Training set: 55,031 / 60,000
Testing set: 9,176 / 10,000
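At prediction time, AdaBoost combines the networks by a weighted vote: each network votes for its predicted digit with its vote weight alpha, and the class with the largest total wins. This is a sketch of that combination step for MNIST's 10 classes; the function name and arguments are illustrative, not the repo's API.

```python
import numpy as np

def ensemble_predict(alphas, all_preds, n_classes=10):
    """Weighted majority vote over weak learners (illustrative sketch).

    alphas: one vote weight per network (from its AdaBoost round).
    all_preds: one array of predicted class labels per network.
    Returns the class with the highest total vote weight per sample.
    """
    n = len(all_preds[0])
    votes = np.zeros((n, n_classes))
    for alpha, pred in zip(alphas, all_preds):
        votes[np.arange(n), pred] += alpha   # each network votes with weight alpha
    return votes.argmax(axis=1)

# Toy demo: three "networks"; they agree on sample 0 and disagree on sample 1,
# where the single higher-weight network outvotes the two lower-weight ones.
preds = [np.array([3, 7]), np.array([3, 1]), np.array([3, 1])]
labels = ensemble_predict([0.9, 0.4, 0.4], preds)
```

Note that a stronger network (larger alpha) can overrule several weaker ones, which is why the ensemble can outperform any single member.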
## Libraries
Requires struct, urllib.request, io, gzip, numpy, and os. Tested with Python 3.5.