IntroNeuralJavaExamplesEdition2
Category: Artificial Intelligence / Neural Networks / Deep Learning
Development tool: Java
File size: 4785KB
Downloads: 67
Upload date: 2009-12-31 15:49:02
Uploader: huiweics
Description: Source code from the book IntroNeuralJavaExamplesEdition, organized by chapter; of great reference value.
File list:
JavaIntroNeuralNetworkEdition2\.checkstyle (304, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.classpath (357, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.project (389, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.settings (0, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.settings\.svn (0, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.settings\.svn\all-wcprops (416, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.settings\.svn\entries (600, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.settings\.svn\prop-base (0, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.settings\.svn\props (0, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.settings\.svn\text-base (0, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.settings\.svn\text-base\org.eclipse.jdt.core.prefs.svn-base (21606, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.settings\.svn\text-base\org.eclipse.jdt.ui.prefs.svn-base (250, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.settings\.svn\tmp (0, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.settings\.svn\tmp\prop-base (0, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.settings\.svn\tmp\props (0, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.settings\.svn\tmp\text-base (0, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.settings\org.eclipse.jdt.core.prefs (21606, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.settings\org.eclipse.jdt.ui.prefs (250, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn (0, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\all-wcprops (2089, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\entries (3040, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\prop-base (0, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\prop-base\build.xml.svn-base (30, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\prop-base\MinMax.net.svn-base (53, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\prop-base\random.net.svn-base (53, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\prop-base\sp500.net.svn-base (53, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\prop-base\whenborn.hst.svn-base (53, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\prop-base\whenborn.net.svn-base (53, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\props (0, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\text-base (0, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\text-base\.checkstyle.svn-base (304, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\text-base\.classpath.svn-base (357, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\text-base\.project.svn-base (389, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\text-base\bornTrainingBad.txt.svn-base (4669624, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\text-base\bornTrainingGood.txt.svn-base (106563, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\text-base\build.xml.svn-base (1255, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\text-base\common.csv.svn-base (6846, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\text-base\famous.csv.svn-base (1969, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\text-base\MinMax.net.svn-base (2187, 2009-10-29)
JavaIntroNeuralNetworkEdition2\.svn\text-base\prime.csv.svn-base (5412, 2009-10-29)
... ...
Introduction to Neural Networks with Java, 2nd Edition
by Jeff Heaton
ISBN: 1-60439-008-5
===============================================================================
===============================================================================
This archive contains the Java source code from the book "Introduction to
Neural Networks with Java". If you would like to purchase this book you may
do so at the following URL:
http://www.heatonresearch.com/book/
You can also view much of the book online, at the following URL:
http://www.heatonresearch.com/articles/series/1/
===============================================================================
Table of Contents from "Introduction to Neural Networks with Java"
Introduction to Neural Networks with Java, Second Edition, introduces the Java programmer
to the world of neural networks and artificial intelligence. Neural network architectures,
such as the feedforward, Hopfield, and self-organizing map architectures, are discussed.
Training techniques, such as backpropagation, genetic algorithms, and simulated annealing,
are also introduced. Practical examples are given for each neural network. Examples
include the traveling salesman problem, handwriting recognition, financial prediction,
game strategy, mathematical functions, and Internet bots. All Java source code is available
online for easy downloading.
Chapter 1 provides an overview of neural networks. You will be introduced to the mathematical
underpinnings of neural networks and how to calculate their values manually. You will also
see how neural networks use weights and thresholds to determine their output. Matrix math plays
a central role in neural network processing.
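The weights-and-threshold calculation described above can be sketched in a few lines of Java. This is an illustrative example, not code from the book's archive; the class and method names are invented for this sketch:

```java
// Computes a single neuron's output: fire (1.0) when the weighted
// sum of the inputs meets or exceeds the threshold, otherwise 0.0.
public class NeuronOutput {

    public static double compute(double[] inputs, double[] weights, double threshold) {
        double sum = 0.0;
        for (int i = 0; i < inputs.length; i++) {
            sum += inputs[i] * weights[i];
        }
        return sum >= threshold ? 1.0 : 0.0;
    }

    public static void main(String[] args) {
        // Two active inputs, weights of 0.5 each, threshold 0.7:
        // weighted sum is 1.0, which exceeds 0.7, so the neuron fires.
        double out = compute(new double[] {1, 1}, new double[] {0.5, 0.5}, 0.7);
        System.out.println("Output: " + out);
    }
}
```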
Chapter 2 introduces matrix operations and demonstrates how to implement them in Java. The
mathematical concepts of matrix operations used later in this book are discussed. Additionally,
Java classes are provided which accomplish each of the required matrix operations. One of the
most basic neural networks is the Hopfield neural network.
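The central matrix operation for neural network processing is multiplication, since feeding an input vector through a layer of weights is a matrix product. A minimal sketch (names are illustrative, not the book's classes):

```java
// Minimal matrix multiplication: result = a x b, where a is
// (rows x inner) and b is (inner x cols).
public class MatrixMultiply {

    public static double[][] multiply(double[][] a, double[][] b) {
        int rows = a.length, cols = b[0].length, inner = b.length;
        double[][] result = new double[rows][cols];
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                for (int k = 0; k < inner; k++) {
                    result[r][c] += a[r][k] * b[k][c];
                }
            }
        }
        return result;
    }

    public static void main(String[] args) {
        // A 1x2 input row vector times a 2x1 weight column vector.
        double[][] input = {{1.0, 2.0}};
        double[][] weights = {{3.0}, {4.0}};
        System.out.println(multiply(input, weights)[0][0]); // 1*3 + 2*4 = 11.0
    }
}
```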
Chapter 3 demonstrates how to use a Hopfield Neural Network. You will be shown how to
construct a Hopfield neural network and how to train it to recognize patterns.
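The Hopfield idea can be sketched briefly: store a bipolar pattern (+1/-1 values) by taking its outer product with itself (zeroing the diagonal), then recall by thresholding the weighted sums. This is a simplified single-pattern sketch, not the archive's Hopfield classes:

```java
// A tiny Hopfield-style sketch: train a weight matrix on one bipolar
// pattern, then recall the stored pattern from a noisy input.
public class HopfieldSketch {

    // Weight matrix is the pattern's outer product with zero diagonal.
    public static int[][] train(int[] pattern) {
        int n = pattern.length;
        int[][] w = new int[n][n];
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                w[i][j] = (i == j) ? 0 : pattern[i] * pattern[j];
            }
        }
        return w;
    }

    // Each neuron outputs the sign of its weighted input sum.
    public static int[] recall(int[][] w, int[] input) {
        int n = input.length;
        int[] out = new int[n];
        for (int i = 0; i < n; i++) {
            int sum = 0;
            for (int j = 0; j < n; j++) {
                sum += w[i][j] * input[j];
            }
            out[i] = sum >= 0 ? 1 : -1;
        }
        return out;
    }

    public static void main(String[] args) {
        int[] stored = {1, -1, 1, -1};
        int[][] w = train(stored);
        // A noisy version of the stored pattern (first bit flipped)
        // is corrected back to the stored pattern.
        int[] noisy = {-1, -1, 1, -1};
        System.out.println(java.util.Arrays.toString(recall(w, noisy)));
    }
}
```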
Chapter 4 introduces the concept of machine learning. To train a neural network, the weights
and thresholds are adjusted until the network produces the desired output. There are many
different ways training can be accomplished. This chapter introduces the different training
methods.
Chapter 5 introduces perhaps the most common neural network architecture, the feedforward
backpropagation neural network. This type of neural network is the central focus of this book.
In this chapter, you will see how to construct a feedforward neural network and how to train
it using backpropagation. Backpropagation may not always be the optimal training algorithm.
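The forward pass of such a network can be sketched as repeated weighted sums passed through a sigmoid activation; backpropagation would then adjust these weights by propagating the output error backward, layer by layer. A minimal 2-2-1 sketch with invented weight values:

```java
// Forward pass of a tiny 2-2-1 feedforward network with sigmoid
// activations (bias terms omitted for brevity).
public class FeedforwardSketch {

    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // One layer: output[j] = sigmoid(sum over i of input[i] * weights[i][j])
    static double[] layer(double[] input, double[][] weights) {
        double[] out = new double[weights[0].length];
        for (int j = 0; j < out.length; j++) {
            double sum = 0.0;
            for (int i = 0; i < input.length; i++) {
                sum += input[i] * weights[i][j];
            }
            out[j] = sigmoid(sum);
        }
        return out;
    }

    public static double predict(double[] input, double[][] hidden, double[][] output) {
        return layer(layer(input, hidden), output)[0];
    }

    public static void main(String[] args) {
        double[][] hidden = {{0.1, 0.4}, {0.8, 0.6}};
        double[][] output = {{0.3}, {0.9}};
        // Sigmoid outputs always fall strictly between 0 and 1.
        System.out.println(predict(new double[] {1.0, 0.0}, hidden, output));
    }
}
```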
Chapter 6 expands upon backpropagation by showing how to train a network using a genetic
algorithm. A genetic algorithm creates a population of neural networks and only allows the
best networks to mate and produce offspring. Simulated annealing can also be a very
effective means of training a feedforward neural network.
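The "mating" step of genetic training can be sketched as single-point crossover over the flattened weight vectors of two fit networks, followed by a light mutation. This is an illustrative sketch of the technique, not the book's genetic-algorithm classes; fitness evaluation (running each network and measuring its error) is left out:

```java
import java.util.Random;

// One reproduction step of a genetic algorithm over weight vectors:
// single-point crossover of two parents, then random mutation.
public class GeneticCrossover {

    // Child takes mom's genes before the cut point, dad's after.
    public static double[] crossover(double[] mom, double[] dad, int cut) {
        double[] child = new double[mom.length];
        for (int i = 0; i < child.length; i++) {
            child[i] = (i < cut) ? mom[i] : dad[i];
        }
        return child;
    }

    // Perturb each gene with the given probability.
    public static void mutate(double[] genes, double rate, double amount, Random rng) {
        for (int i = 0; i < genes.length; i++) {
            if (rng.nextDouble() < rate) {
                genes[i] += (rng.nextDouble() * 2 - 1) * amount;
            }
        }
    }

    public static void main(String[] args) {
        double[] mom = {0.1, 0.2, 0.3, 0.4};
        double[] dad = {0.5, 0.6, 0.7, 0.8};
        double[] child = crossover(mom, dad, 2);
        mutate(child, 0.1, 0.05, new Random(42));
        System.out.println(java.util.Arrays.toString(child));
    }
}
```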
Chapter 7 continues the discussion of training methods by introducing simulated annealing.
Simulated annealing simulates the heating and cooling of a metal to produce an optimal solution.
Neural networks may contain unnecessary neurons.
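The heart of simulated annealing is its acceptance rule: an improvement is always accepted, while a worse candidate is accepted only with probability exp(-delta / temperature), so risky jumps become rarer as the system cools. A minimal sketch of that rule (illustrative names, not the archive's annealing classes):

```java
import java.util.Random;

// Simulated-annealing acceptance: always take an improvement; take a
// worse solution with probability exp(-delta / temperature).
public class AnnealingSketch {

    public static boolean accept(double currentError, double candidateError,
                                 double temperature, Random rng) {
        if (candidateError < currentError) {
            return true;
        }
        double probability = Math.exp(-(candidateError - currentError) / temperature);
        return rng.nextDouble() < probability;
    }

    public static void main(String[] args) {
        Random rng = new Random(1);
        double temperature = 10.0;
        // Cool the temperature geometrically; worse moves get rarer.
        for (int cycle = 0; cycle < 5; cycle++) {
            System.out.printf("T=%.3f accept worse(+1.0)? %b%n",
                    temperature, accept(2.0, 3.0, temperature, rng));
            temperature *= 0.5;
        }
    }
}
```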
Chapter 8 explains how to prune a neural network to its optimal size. Pruning allows unnecessary
neurons to be removed from the neural network without adversely affecting the error
rate of the network. The neural network will process information more quickly with fewer
neurons. Prediction is another popular use for neural networks.
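Mechanically, pruning a hidden neuron amounts to deleting its column from the incoming weight matrix (and the matching row from the outgoing one). A sketch of the column-removal step; choosing which neuron can be removed without hurting the error rate is the part the chapter addresses:

```java
// Pruning sketch: drop one hidden neuron by removing its column
// from the layer's incoming weight matrix.
public class PruneSketch {

    public static double[][] removeColumn(double[][] weights, int col) {
        double[][] result = new double[weights.length][weights[0].length - 1];
        for (int r = 0; r < weights.length; r++) {
            for (int c = 0, k = 0; c < weights[0].length; c++) {
                if (c != col) {
                    result[r][k++] = weights[r][c];
                }
            }
        }
        return result;
    }

    public static void main(String[] args) {
        // 2 inputs x 3 hidden neurons; neuron 1 carries near-zero
        // weights, so removing it barely changes the network.
        double[][] hidden = {{0.5, 0.001, 0.7}, {0.6, 0.002, 0.8}};
        System.out.println(java.util.Arrays.deepToString(removeColumn(hidden, 1)));
    }
}
```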
Chapter 9 introduces temporal neural networks, which attempt to predict the future. Prediction
networks can be applied to many different problems, such as the prediction of sunspot cycles,
weather, and the financial markets.
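Temporal networks are typically trained on sliding windows: each fixed-size slice of the series is an input, and the value that follows it is the desired output. A sketch of building such training pairs (invented names, not the book's prediction classes):

```java
// Build training pairs for a temporal network by sliding a window
// over a series: window -> the next value after the window.
public class SlidingWindow {

    public static double[][] inputs(double[] series, int windowSize) {
        int count = series.length - windowSize;
        double[][] result = new double[count][windowSize];
        for (int i = 0; i < count; i++) {
            System.arraycopy(series, i, result[i], 0, windowSize);
        }
        return result;
    }

    public static double[] targets(double[] series, int windowSize) {
        int count = series.length - windowSize;
        double[] result = new double[count];
        for (int i = 0; i < count; i++) {
            result[i] = series[i + windowSize];
        }
        return result;
    }

    public static void main(String[] args) {
        double[] series = {1, 2, 3, 4, 5};
        // Windows of size 3: [1,2,3] -> 4 and [2,3,4] -> 5.
        System.out.println(java.util.Arrays.deepToString(inputs(series, 3)));
        System.out.println(java.util.Arrays.toString(targets(series, 3)));
    }
}
```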
Chapter 10 builds upon chapter 9 by demonstrating how to apply temporal neural networks to
the financial markets. The resulting neural network attempts to predict the direction of
the S&P 500. Another neural network architecture is the self-organizing map (SOM). SOMs
are often used to group input into categories and are generally trained with an unsupervised
training algorithm. An SOM uses a winner-takes-all strategy, in which the output is provided
by the winning neuron alone; output is not produced by each of the neurons.
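The winner-takes-all step can be sketched directly: the winning neuron is the one whose weight vector lies closest, by Euclidean distance, to the input vector. An illustrative sketch, not the archive's SOM classes:

```java
// Winner-takes-all selection for an SOM: return the index of the
// neuron whose weight vector is nearest the input (squared
// Euclidean distance; the square root is unnecessary for comparison).
public class SomWinner {

    public static int findWinner(double[] input, double[][] neuronWeights) {
        int winner = 0;
        double best = Double.MAX_VALUE;
        for (int n = 0; n < neuronWeights.length; n++) {
            double dist = 0.0;
            for (int i = 0; i < input.length; i++) {
                double d = input[i] - neuronWeights[n][i];
                dist += d * d;
            }
            if (dist < best) {
                best = dist;
                winner = n;
            }
        }
        return winner;
    }

    public static void main(String[] args) {
        double[][] weights = {{0.9, 0.1}, {0.1, 0.9}};
        // Input (0.2, 0.8) lies closest to neuron 1's weight vector.
        System.out.println("Winner: " + findWinner(new double[] {0.2, 0.8}, weights));
    }
}
```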
Chapter 11 provides an introduction to SOMs and demonstrates how to use them.
Handwriting recognition is a popular use for SOMs.
Chapter 12 continues where chapter 11 leaves off, by demonstrating how to use an SOM to
read handwritten characters. The neural network must be provided with a sample of the handwriting
that it is to analyze. This handwriting is categorized using the 26 characters of the
Latin alphabet. The neural network is then able to recognize new characters.
Chapter 13 introduces bot programming and explains how to use a neural network to help
identify data. Bots are computer programs that perform repetitive tasks. An HTTP bot is a
special type of bot that uses the web much like a human uses it. The neural network is
trained to recognize the specific types of data for which the bot is searching.
The book ends with chapter 14, which discusses the future of neural networks, including
quantum computing and how it may apply to neural networks. The Encog neural network
framework is also introduced.