backPROPnetwork
Category: Artificial Intelligence / Neural Networks / Deep Learning
Development tool: C++
File size: 210KB
Downloads: 14
Upload date: 2007-01-16 09:50:28
Uploader:
neuo2001
Description: Complete C++ source code for back-propagation neural network pattern recognition, written by an authoritative original author; well worth studying.
File list:
BACKPROP\BPROP.CPP (22488, 1995-08-20)
BACKPROP\BPROP.DEF (264, 1993-07-23)
BACKPROP\BPROP.EXE (65072, 1995-08-20)
BACKPROP\DIGIT0.TRN (2891, 1995-08-20)
BACKPROP\DIGIT1.TRN (2579, 1995-08-20)
BACKPROP\DIGIT2.TRN (1476, 1995-08-20)
BACKPROP\DIGIT3.TRN (2480, 1995-08-20)
BACKPROP\DIGIT4.TRN (2300, 1995-08-20)
BACKPROP\DIGIT5.TRN (1541, 1995-08-20)
BACKPROP\DIGIT6.TRN (1808, 1995-08-20)
BACKPROP\DIGIT7.TRN (1814, 1995-08-20)
BACKPROP\DIGIT8.TRN (2612, 1995-08-20)
BACKPROP\DIGIT9.TRN (2888, 1995-08-20)
BACKPROP\GRID1.CPP (48311, 1995-08-20)
BACKPROP\GRID1.DEF (279, 1995-08-14)
BACKPROP\GRID1.EXE (98304, 1995-08-20)
BACKPROP\GRID1.H (1997, 1995-08-14)
BACKPROP\GRID1.RC (4001, 1995-08-14)
BACKPROP\GRID1.RES (2106, 1995-08-14)
BACKPROP\H16.GBX (111, 1993-08-02)
BACKPROP\H16.PRM (41, 1995-08-20)
BACKPROP\H8.GBL (101, 1993-08-02)
BACKPROP\H8.PRM (41, 1995-08-20)
BACKPROP\H8D0.WGT (16248, 1995-08-20)
BACKPROP\H8D1.WGT (16164, 1995-08-20)
BACKPROP\H8D2.WGT (16342, 1995-08-20)
BACKPROP\H8D3.WGT (16379, 1995-08-20)
BACKPROP\H8D4.WGT (16381, 1995-08-20)
BACKPROP\H8D5.WGT (16513, 1995-08-20)
BACKPROP\H8D6.WGT (16393, 1995-08-20)
BACKPROP\H8D7.WGT (16435, 1995-08-20)
BACKPROP\H8D8.WGT (16414, 1995-08-20)
BACKPROP\H8D9.WGT (16377, 1995-08-20)
BACKPROP\LOAD.HPP (1033, 1995-08-20)
BACKPROP\M12CHARS.H (21538, 1995-08-14)
BACKPROP\MAKEFILE (1077, 1995-08-20)
BACKPROP\MISCLIB.H (2395, 1995-08-20)
BACKPROP\PNET.CPP (23404, 1995-08-16)
BACKPROP\QNET.CPP (15760, 1995-08-20)
... ...
There are two parts to this directory:
1) Training by backprop
2) The GRID1 user interface for testing
Both source and executables are provided.
1) TRAINING BY BACKPROP.
Usage:
bprop TrainingFile ParmFile WeightFile
Parm file format:
ETA - learning rate
ALPHA - momentum
MAXITER - max iteration
ERRTOL - error tolerance for convergence
NumLayers - number of layers
N(input) - Number of neurons in input layer
N(hidden1) - Number of neurons in 1st hidden layer
N(hidden2) - Number of neurons in 2nd hidden layer (if present)
.
.
.
N(hiddenK) - Number of neurons in Kth hidden layer (if present)
N(output) - Number of neurons in output layer
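Based on the field list above, a parm file might look like the following. The values are hypothetical, and the exact layout should be checked against the supplied H8.PRM before use:

```
0.25
0.9
1000
0.001
3
64
8
10
```

Here the eight numbers would correspond, in order, to ETA, ALPHA, MAXITER, ERRTOL, NumLayers (3), and the neuron counts for the input (64), hidden (8), and output (10) layers.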
Training file format:
NumPatterns - Number of patterns in training set
I0 I1 ... In D - one hex byte per input, followed by the desired value, for pattern 1
I0 I1 ... In D - one hex byte per input, followed by the desired value, for pattern 2
. . . .
. . . .
. . . .
I0 I1 ... In D - one hex byte per input, followed by the desired value, for pattern P
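A small hypothetical training file with two patterns of four inputs each might look like this (the actual sample files, e.g. DIGIT0.TRN, should be consulted for the real dimensions):

```
2
0A FF 3C 00 1
00 7E 7E 00 0
```

The first line gives NumPatterns; each following line gives one hex byte per input and ends with the desired output value for that pattern.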
Example usage may be found in the command files provided to train networks on the sample
training data. The samples likewise provide examples of parm and training files. Note
that after training, a .GBL file must be created so that GRID1 will be able to access the
trained weights and network parameters. (For the sample data these files are provided
with a .GBX extension; after training, rename the .GBX file to give it a .GBL extension.)
2) GRID1
To run the program, simply type GRID1 on the command line.