MELBP
lbp 

Category: Microcontroller Development
Development tool: C/C++
File size: 121KB
Downloads: 4
Upload date: 2006-09-19 16:25:02
Uploader: haifeng6666
Description: C-language source code for a variety of optimization programs; convenient and practical, and a great help for programming.

File list:
MELBP (0, 2006-09-19)
MELBP\BP.C (29059, 1989-12-30)
MELBP\BP.EXE (29080, 1989-12-30)
MELBP\BP1.DAT (69, 1989-12-30)
MELBP\BP2.DAT (13, 1989-12-30)
MELBP\BP3.DAT (0, 2006-09-19)
MELBP\BP_H.EXE (41000, 1989-12-30)
MELBP\CG299_0I.001 (1, 1989-12-30)
MELBP\CONVER.C (12568, 1989-12-30)
MELBP\CREWS1.EX (491, 1989-12-30)
MELBP\CREWS2.EX (723, 1989-12-30)
MELBP\CREWS3.EX (664, 1989-12-30)
MELBP\CREWS4.EX (899, 1989-12-30)
MELBP\CREWS5.EX (7323, 1989-12-30)
MELBP\ERROR.C (2076, 1989-12-30)
MELBP\ERROR.H (647, 1989-12-30)
MELBP\INPUT (307, 1989-12-30)
MELBP\MEL.ART (21726, 1989-12-30)
MELBP\MEL.C (42467, 1989-12-30)
MELBP\MEL.DOC (7502, 1989-12-30)
MELBP\MEL.H (17925, 1989-12-30)
MELBP\MEL.OBJ (10297, 1989-12-30)
MELBP\MEL_TEST.C (9917, 1989-12-30)
MELBP\MEL_TEST.EXE (37574, 1989-12-30)
MELBP\MEL_TEST.OBJ (3931, 1989-12-30)
MELBP\OUTPUT (363, 1989-12-30)
MELBP\PARITY3.DAT (157, 1989-12-30)
MELBP\PARITY4.DAT (372, 1989-12-30)
MELBP\PLOT.C (10255, 1989-12-30)
MELBP\PLOT.H (653, 1989-12-30)
MELBP\RANDOM.C (3227, 1989-12-30)
MELBP\RANDOM.H (617, 1989-12-30)
MELBP\SUBMIT.DSK (3403, 1991-08-15)
MELBP\TEST.C (3712, 1989-12-30)
MELBP\XOR.DAT (69, 1989-12-30)

Back Propagation Generalized Delta Rule Learning Program
(c) 1989 by Ronald Michaels. This program may be freely copied, modified, transmitted, or used for any non-commercial purpose.

Ronald Michaels
3 Wilmar Close, Greendale
Harare, Zimbabwe

after 1 January 1990:
Ronald Michaels
231 Tipton Station Rd.
Knoxville, Tn. 37920 USA

I. Contents of this disk:

bp.c         the main file
error.c      handles error messages for bp.c
error.h      header file for error.c
random.c     a state of the art random number generator
random.h     header file for random.c
plot.c       contains functions to plot the error graph for bp learning
plot.h       header file for plot.c
bp.exe       basic program without graphics
bp_h.exe     program with Hercules graphics
bp1.dat      bp looks here for the training patterns
bp2.dat      bp looks here for program parameters
xor.dat      training pattern set illustrating the exclusive-or problem, the problem used by Minsky and Papert to prove the limitations of the single-level perceptron
parity3.dat  training pattern set illustrating a more difficult version of the xor problem
parity4.dat  training pattern set illustrating an even more difficult version of the xor problem
test.c       a test driver for random.c; use it to verify the random number generator if the program is ported to a non-80X86 environment
readme       this file

II. Purpose of this program

The purpose of this program is to illustrate the Back Propagation algorithm as outlined in the article: Rumelhart, Hinton & Williams. "Learning representations by back-propagating errors". Nature, vol. 323, 9 October 1986, pages 533-536.

To this end, I have chosen to use the array representation rather than the linked list representation. Vectors and matrices are allocated dynamically as one-dimensional arrays, and pointer arithmetic is used to access elements. This creates a fair amount of loop-invariant code which the Zortech compiler evidently hoists; optimization cuts execution time in half. For the sake of simplicity I have used floating point representations for all patterns, weights, deltas, etc. This means that without a math coprocessor this program will be SLOW.

III. How it works

Back Propagation is a non-linear pattern recognition algorithm which overcomes one of the limitations of the single-layer perceptron: the inability to learn to distinguish groups of data points which cannot be separated by a straight line or flat plane (or a hyper-plane in hyper-space).

This program looks in the file bp1.dat for the set of patterns to be learned. Several pattern sets are provided, for example parity3.dat. To use the algorithm to learn this set of patterns, do: copy parity3.dat bp1.dat. The file bp2.dat contains the learning rate and momentum parameters for the algorithm. These may be altered to observe their effect on the algorithm.

bp.c contains the back propagation algorithm. The function learn() is the heart of the program, and the various steps in the algorithm are implemented as functions to be called from learn(). Output of the program can be character based or graphic depending on the value defined for GRAPH at the top of bp.c. The graphics are for Hercules screens using the Zortech graphics library but can be rewritten easily. See the bp.c source code for descriptions of the individual functions which are called from learn(). In the bp.c file there are several activation functions which may be tried in lieu of the sigmoid function used by Rumelhart, et al.
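For reference, the standard logistic sigmoid and its derivative, which drive the generalized delta rule, can be written as the following minimal C sketch (illustrative only; the names and layout are not taken from bp.c):

    #include <math.h>

    /* Logistic sigmoid activation: maps any real net input into (0, 1). */
    static double sigmoid(double net)
    {
        return 1.0 / (1.0 + exp(-net));
    }

    /* Derivative of the sigmoid expressed in terms of its output,
       out * (1 - out), as used when back-propagating the error. */
    static double sigmoid_deriv(double out)
    {
        return out * (1.0 - out);
    }

An alternative activation function only needs to supply the same two pieces: the forward mapping and its derivative.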
Since we are using finite precision arithmetic, there is no such thing as real continuity; it is of interest to observe the effects of function shape.

IV. Compiling the program

There are two options presented here: graphics output to Hercules graphics, and ASCII output. To compile the program and incorporate graphics, set GRAPH to 1 in file bp.c, then compile and link bp.c, error.c, plot.c, and random.c. To compile the program with only ASCII output, set GRAPH to 0 in bp.c, then compile and link bp.c, error.c, and random.c.

This program compiles under the Zortech C compiler V. 1.07 as is. With minor changes it will compile under Ecosoft V. 4.07. All functions are prototyped, so older compilers will require changes to the function declarations. Note that graphics function calls and coordinates in plot.c must be revised to suit the graphics library employed. If a display other than Hercules graphics is used for graphics, the function prt_scr in plot.c must be revised to reflect the mapping of video memory.

V. Running the program

First the program asks for a seed for the pseudorandom number generator. The number of learning cycles required for convergence is sensitive to the initial values of the weights, so the use of random.c should allow predictable pseudorandom weights independent of any specific compiler library and allow results from one implementation to be directly compared to results from another. Next the program asks for the upper and lower limits for the random numbers used to initialize the weights. As mentioned above, the algorithm is sensitive to initial weights.

Once the random weight parameters are selected, the program enters its main control loop. Commands are selected by pressing the first letter of the command.

Learn:     Run the learning algorithm. The user is asked to specify the number of cycles. From, say, 50 to 5000 cycles may be required to reach convergence depending upon the starting point, learning rate, and patterns to be learned.
Recognize: Present input patterns to the weight matrices and calculate outputs. Compare the outputs to the target outputs and calculate the error.
Dump:      Dump program variables to the file bp3.dat for later review. Caution: this command can build big files.
Quit:      Terminate the program.

VI. Further Reading

Jones, William P. and Hoskins, Josiah. "Back-Propagation: A generalized delta learning rule". BYTE, October 1987, pages 155-162.

Rumelhart, David E. and McClelland, James L. Parallel Distributed Processing: Explorations in the Microstructure of Cognition. Cambridge: MIT Press, 1986.

For an overview of artificial neural systems, see the March 1988 issue of Computer, published by the Computer Society of the IEEE.

For a historical review of pattern recognition, see: Tou, Julius T. and Gonzalez, Rafael C. Pattern Recognition Principles. Reading, Mass.: Addison-Wesley Publishing Company.

For a good, inexpensive introduction to Neural Networks, see: Vemuri, V. (editor). Artificial Neural Networks: Theoretical Concepts. Washington, D.C.: Computer Society Press of the IEEE.
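As a supplement to sections III and V, the learning rate and momentum parameters kept in bp2.dat enter the generalized delta rule weight update, which in outline looks like the following minimal C sketch (variable and function names are illustrative, not code taken from bp.c):

    /* Generalized delta rule update for one weight with momentum.
       eta     - learning rate  (kept in bp2.dat)
       alpha   - momentum       (kept in bp2.dat)
       delta_j - back-propagated error term of the downstream unit
       out_i   - output of the upstream unit
       prev_dw - weight change applied on the previous cycle */
    void update_weight(double *w, double *prev_dw,
                       double delta_j, double out_i,
                       double eta, double alpha)
    {
        double dw = eta * delta_j * out_i + alpha * *prev_dw;
        *w += dw;           /* apply the step to the weight */
        *prev_dw = dw;      /* remember it for the momentum term next cycle */
    }

Raising eta takes larger steps per cycle, while alpha carries part of the previous step forward; both influence how many cycles are needed to reach convergence, as noted under the Learn command above.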
