GNU/Linux AI & Alife HOWTO, by John Eikenberry
Balancing task
· NeuroEvolution of Augmenting Topologies (NEAT) software
for evolving neural networks using structure
Various (C++) Neural Networks
· Web site: www.dontveter.com/nnsoft/nnsoft.html
Example neural network code from the book The Pattern
Recognition Basics of AI. These are simple example
implementations of the various neural nets, and they work well
as a starting point for simple experimentation and for learning
what the code behind the simulators looks like. The types of
networks available on this site (implemented in C++) are:
· The Backprop Package
· The Nearest Neighbor Algorithms
· The Interactive Activation Algorithm
· The Hopfield and Boltzmann machine Algorithms
· The Linear Pattern Classifier
· ART I
· BiDirectional Associative Memory
· The Feedforward Counter-Propagation Network
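To give a feel for what such example code looks like behind the simulators, here is a Hopfield net sketched in a few lines of Python. This is my own illustrative sketch, not the site's C++ code; the function names are made up.

```python
def train_hopfield(patterns):
    """Build a Hopfield weight matrix with the Hebbian outer-product rule."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:           # patterns are lists of +1/-1 values
        for i in range(n):
            for j in range(n):
                if i != j:       # no self-connections
                    w[i][j] += p[i] * p[j]
    return w

def recall(w, state, steps=5):
    """Synchronously update all units; the net settles into a stored pattern."""
    n = len(state)
    for _ in range(steps):
        state = [1 if sum(w[i][j] * state[j] for j in range(n)) >= 0 else -1
                 for i in range(n)]
    return state

pattern = [1, -1, 1, -1, 1, -1]
w = train_hopfield([pattern])
noisy = [1, 1, 1, -1, 1, -1]      # stored pattern with one bit flipped
print(recall(w, noisy))            # -> [1, -1, 1, -1, 1, -1]
```

The same Hebbian-storage/iterated-recall structure underlies the C++ versions, just with more bookkeeping.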
3.2. Connectionist software kits/applications
These are various applications, software kits, etc., meant for
research in the field of Connectionism. Their ease of use will
vary, as most were designed to serve a particular research
interest rather than to be easy-to-use commercial packages.
Aspirin - MIGRAINES
(am6.tar.Z on ftp site)
· FTP site: sunsite.unc.edu/pub/academic/computer-science/neural-networks/programs/Aspirin/
The software that we are releasing now is for creating, and
evaluating, feedforward networks such as those used with the
backpropagation learning algorithm. The software is aimed both
at the expert programmer/neural network researcher who may wish
to tailor significant portions of the system to his/her precise
needs, as well as at casual users who will wish to use the
system with an absolute minimum of effort.
DDLab
· Web site: www.santafe.edu/~wuensch/ddlab.html
· FTP site: ftp.santafe.edu/pub/wuensch/
DDLab is an interactive graphics program for research into the
dynamics of finite binary networks, relevant to the study of
complexity, emergent phenomena, neural networks, and aspects of
theoretical biology such as gene regulatory networks. A network
can be set up with any architecture between regular CA (1d or
2d) and “random Boolean networks” (networks with arbitrary
connections and heterogeneous rules). The network may also have
heterogeneous neighborhood sizes.
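The "random Boolean networks" DDLab explores generalize cellular automata: each node gets its own arbitrary wiring and its own Boolean rule table. A minimal sketch of one synchronous update step (illustrative names, not DDLab's interface):

```python
import random

def random_boolean_network(n, k, seed=0):
    """n nodes, each reading k arbitrary nodes through its own rule table."""
    rng = random.Random(seed)
    wiring = [[rng.randrange(n) for _ in range(k)] for _ in range(n)]
    rules = [[rng.randrange(2) for _ in range(2 ** k)] for _ in range(n)]
    return wiring, rules

def step(state, wiring, rules):
    """Synchronously update every node from its k inputs."""
    new = []
    for node, inputs in enumerate(wiring):
        index = 0
        for src in inputs:              # pack the k input bits into a table index
            index = (index << 1) | state[src]
        new.append(rules[node][index])
    return new

wiring, rules = random_boolean_network(n=8, k=3)
state = [0, 1, 0, 1, 1, 0, 0, 1]
print(step(state, wiring, rules))
```

A regular 1d CA is the special case where every node reads its immediate neighbours and all nodes share one rule table, which is exactly the spectrum of architectures the description above refers to.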
GENESIS
· Web site: www.genesis-sim.org/GENESIS/
· FTP site: genesis-sim.org/pub/genesis/
GENESIS (short for GEneral NEural SImulation System) is a
general purpose simulation platform which was developed to
support the simulation of neural systems ranging from complex
models of single neurons to simulations of large networks made
up of more abstract neuronal components. GENESIS has provided
the basis for laboratory courses in neural simulation at both
Caltech and the Marine Biological Laboratory in Woods Hole, MA,
as well as several other institutions. Most current GENESIS
applications involve realistic simulations of biological neural
systems. Although the software can also model more abstract
networks, other simulators are more suitable for backpropagation
and similar connectionist modeling.
JavaBayes
· Web site: www.cs.cmu.edu/People/javabayes/index.html/
The JavaBayes system is a set of tools, containing a graphical
editor, a core inference engine and a parser. JavaBayes can
produce:
· the marginal distribution for any variable in a network.
· the expectations for univariate functions (for example,
expected value for variables).
· configurations with maximum a posteriori probability.
· configurations with maximum a posteriori expectation for
univariate functions.
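The first of these outputs, a variable's marginal, is just a sum over the joint distribution; on a small network it can be computed by brute-force enumeration. A toy two-variable sketch in Python (a hypothetical rain/wet-grass network, nothing to do with JavaBayes' file format):

```python
# P(Rain) and P(WetGrass | Rain) for a two-node Bayesian network
p_rain = {True: 0.2, False: 0.8}
p_wet_given_rain = {True: {True: 0.9, False: 0.1},
                    False: {True: 0.3, False: 0.7}}

def marginal_wet():
    """P(WetGrass=True) = sum over rain states of P(rain) * P(wet | rain)."""
    return sum(p_rain[r] * p_wet_given_rain[r][True] for r in (True, False))

print(marginal_wet())   # 0.2*0.9 + 0.8*0.3 = 0.42
```

Real inference engines like JavaBayes avoid this exponential enumeration with algorithms such as variable elimination, but the quantity computed is the same.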
Jbpe
· Web site: cs.felk.cvut.cz/~koutnij/studium/jbpe.html
Jbpe is a backpropagation neural network editor/simulator.
Features
· Standard backpropagation network creation.
· Saving the network as a text file, which can be edited and
loaded back.
· Saving/loading a binary file.
· Learning from a text file (with structure specified below);
the number of learning periods / desired network energy can be
specified as a criterion.
· Network recall.
Neural Network Generator
· Web site: www.idsia.ch/~rafal/research.html
· FTP site: ftp.idsia.ch/pub/rafal
The Neural Network Generator is a genetic algorithm for the
topological optimization of feedforward neural networks. It
implements the Semantic Changing Genetic Algorithm and the
Unit-Cluster Model. The Semantic Changing Genetic Algorithm is
an extended genetic algorithm that allows fast dynamic
adaptation of the genetic coding through population analysis.
The Unit-Cluster Model is an approach to the construction of
modular feedforward networks with a “backbone” structure.
NOTE: To compile this on Linux requires one change in the
Makefiles. You will need to change ‘-ltermlib’ to ‘-ltermcap’.
Neureka ANS (nn/xnn)
· FTP site: ftp.ii.uib.no/pub/neureka/
nn is a high-level neural network specification language. The
current version is best suited for feedforward nets, but
recurrent models can and have been implemented, e.g. Hopfield
nets, Jordan/Elman nets, etc. In nn, it is easy to change
network dynamics. The nn compiler can generate C code or
executable programs (so there must be a C compiler available),
with a powerful command line interface (but everything may also
be controlled via the graphical interface, xnn). It is possible
for the user to write C routines that can be called from inside
the nn specification, and to use the nn specification as a
function that is called from a C program. Please note that no
programming is necessary in order to use the network models that
come with the system (‘netpack’).
xnn is a graphical front end to networks generated by the nn
compiler, and to the compiler itself. The xnn graphical
interface is intuitive and easy to use for beginners, yet
powerful, with many possibilities for visualizing network data.
NOTE: You have to run the install program that comes with this
to get the license key installed. It gets put (by default) in
/usr/lib. If you (like myself) want to install the package
somewhere other than in the /usr directory structure (the
install program gives you this option), you will have to set up
some environment variables (NNLIBDIR & NNINCLUDEDIR are
required). You can read about these (and a few other optional
variables) in appendix A of the documentation (p. 113).
NEURON
· Web site: www.neuron.yale.edu/
NEURON is an extensible nerve modeling and simulation program.
It allows you to create complex nerve models by connecting
multiple one-dimensional sections together to form arbitrary
cell morphologies, and allows you to insert multiple membrane
properties into these sections (including channels, synapses,
ionic concentrations, and counters). The interface was designed
to present the neural modeler with an intuitive environment and
to hide the details of the numerical methods used in the
simulation.
PDP++
· Web site: www.cnbc.cmu.edu/Resources/PDP++/
· FTP site (US): cnbc.cmu.edu/pub/pdp++/
· FTP mirror (US): grey.colorado.edu/pub/oreilly/pdp++/
As the field of Connectionist modeling has grown, so has the
need for a comprehensive simulation environment for the
development and testing of Connectionist models. Our goal in
developing PDP++ has been to integrate several powerful software
development and user interface tools into a general purpose
simulation environment that is both user friendly and user
extensible. The simulator is built in the C++ programming
language, and incorporates a state of the art script interpreter
with the full expressive power of C++. The graphical user
interface is built with the Interviews toolkit, and allows full
access to the data structures and processing modules out of
which the simulator is built. We have constructed several useful
graphical modules for easy interaction with the structure and
the contents of neural networks, and we’ve made it possible to
change and adapt many things. At the programming level, we have
set things up in such a way as to make user extensions as
painless as possible. The programmer creates new C++ objects,
which might be new kinds of units or new kinds of processes;
once compiled and linked into the simulator, these new objects
can then be accessed and used like any other.
RNS
· Web site: www.cs.cmu.edu/afs/cs/project/ai-repository/ai/areas/neural/systems/rns/
RNS (Recurrent Network Simulator) is a simulator for recurrent
neural networks. Regular neural networks are also supported. The
program uses a derivative of the backpropagation algorithm, but
also includes other (not that well tested) algorithms.
Features include
· freely choosable connections, with no restrictions besides
memory or CPU constraints
· delayed links for recurrent networks
· fixed values or thresholds can be specified for weights
· (recurrent) backpropagation, Hebb, differential Hebb,
simulated annealing, and more
· patterns can be specified with bits, floats, characters, or
numbers, and random bit patterns with chosen Hamming distances
can be generated for you
· user-definable error functions
· output results can be used without modification as input
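Of the learning rules listed, Hebbian learning is the simplest to sketch: the weight between two units grows in proportion to their correlated activity. A few illustrative lines of Python (my own sketch, not RNS's data structures):

```python
def hebb_update(w, pre, post, eta=0.1):
    """Hebb's rule: w[i][j] += eta * pre[j] * post[i] for every connection."""
    return [[w[i][j] + eta * pre[j] * post[i]
             for j in range(len(pre))]
            for i in range(len(post))]

w = [[0.0, 0.0],
     [0.0, 0.0]]
w = hebb_update(w, pre=[1.0, 0.0], post=[1.0, 1.0])
print(w)   # only weights from the active input unit grow
```

Differential Hebb, also supported by RNS, replaces the raw activities with their changes over time, so that weights track correlated *changes* rather than correlated levels.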
Simple Neural Net (in Python)
· Web site: http://www.amk.ca/python/unmaintained/
Simple neural network code, which implements a class for 3-level
networks (input, hidden, and output layers). The only learning
rule implemented is simple backpropagation. No documentation (or
even comments) at all, because this is simply code that I use to
experiment with. Includes modules containing sample datasets
from Carl G. Looney’s NN book. Requires the Numeric extensions.
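A self-contained sketch of the same idea, a 3-level (input, hidden, output) network trained with plain backpropagation; the class and method names here are illustrative, not taken from that module:

```python
import math, random

class ThreeLayerNet:
    def __init__(self, n_in, n_hidden, n_out, seed=1):
        rng = random.Random(seed)
        # one extra weight per unit acts as a bias (constant input of 1.0)
        self.w1 = [[rng.uniform(-1, 1) for _ in range(n_in + 1)]
                   for _ in range(n_hidden)]
        self.w2 = [[rng.uniform(-1, 1) for _ in range(n_hidden + 1)]
                   for _ in range(n_out)]

    @staticmethod
    def _sig(x):
        return 1.0 / (1.0 + math.exp(-x))

    def forward(self, x):
        self.x = list(x) + [1.0]
        self.h = [self._sig(sum(w * v for w, v in zip(row, self.x)))
                  for row in self.w1] + [1.0]
        self.o = [self._sig(sum(w * v for w, v in zip(row, self.h)))
                  for row in self.w2]
        return self.o

    def train(self, x, target, lr=0.5):
        """One backprop step; returns the squared error before the update."""
        o = self.forward(x)
        d_out = [(t - oi) * oi * (1 - oi) for t, oi in zip(target, o)]
        d_hid = [self.h[j] * (1 - self.h[j]) *
                 sum(d_out[k] * self.w2[k][j] for k in range(len(d_out)))
                 for j in range(len(self.h) - 1)]
        for k, row in enumerate(self.w2):
            for j in range(len(row)):
                row[j] += lr * d_out[k] * self.h[j]
        for j, row in enumerate(self.w1):
            for i in range(len(row)):
                row[i] += lr * d_hid[j] * self.x[i]
        return sum((t - oi) ** 2 for t, oi in zip(target, o))

net = ThreeLayerNet(2, 4, 1)
data = [([0, 0], [0]), ([0, 1], [1]), ([1, 0], [1]), ([1, 1], [0])]
for _ in range(2000):              # learn XOR by repeated presentation
    for x, t in data:
        net.train(x, t)
```

With enough epochs the total error over the four XOR patterns drops well below its initial value, which is all a sketch like this is meant to demonstrate.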
SCNN
· Web site: www.uni-frankfurt.de/fb13/iap/e_ag_rt/SCNN/
SCNN is a universal simulation system for Cellular Neural
Networks (CNN). CNN are analog processing neural networks with
regular and local interconnections, governed by a set of
nonlinear ordinary differential equations. Due to their local
connectivity, CNN can be realized as VLSI chips which operate at
very high speed.
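The cell dynamics can be sketched by Euler-integrating the ODE for a one-dimensional ring of cells. This is a simplified rendering of the standard Chua-Yang CNN equations with illustrative template coefficients, not SCNN's input format:

```python
def output(x):
    """Standard CNN piecewise-linear output: clamp the state to [-1, 1]."""
    return max(-1.0, min(1.0, x))

def cnn_step(state, inputs, a=(0.0, 2.0, 0.0), b=(0.0, 1.0, 0.0),
             bias=0.0, dt=0.1):
    """One Euler step of dx_i/dt = -x_i + sum_k a_k*f(x_{i+k}) + sum_k b_k*u_{i+k} + bias."""
    n = len(state)
    new = []
    for i in range(n):
        feedback = sum(a[k + 1] * output(state[(i + k) % n]) for k in (-1, 0, 1))
        control = sum(b[k + 1] * inputs[(i + k) % n] for k in (-1, 0, 1))
        dx = -state[i] + feedback + control + bias
        new.append(state[i] + dt * dx)
    return new

state = [0.0] * 5
inputs = [0.0, 0.0, 1.0, 0.0, 0.0]
for _ in range(50):
    state = cnn_step(state, inputs)
print([round(s, 2) for s in state])   # the driven cell saturates, the rest stay at rest
```

The local 3-cell templates `a` and `b` are what make VLSI realization natural: each cell only ever talks to its immediate neighbours.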
Semantic Networks in Python
· Web site: strout.net/info/coding/python/ai/index.html
The semnet.py module defines several simple classes for building
and using semantic networks. A semantic network is a way of
representing knowledge, and it enables the program to do simple
reasoning with very little effort on the part of the programmer.
The following classes are defined:
· Entity: This class represents a noun; it is something which
can be related to other things, and about which you can store
facts.
· Relation: A Relation is a type of relationship which may
exist between two entities. One special relation, “IS_A”, is
predefined because it has special meaning (a sort of logical
inheritance).
· Fact: A Fact is an assertion that a relationship exists
between two entities.
With these three object types, you can very quickly define
knowledge about a set of objects, and query them for logical
conclusions.
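The three classes, including the IS_A inheritance mentioned above, can be sketched in a few lines. This is a simplified illustration of the idea, not semnet.py's actual interface:

```python
class Entity:
    """A noun: something facts can be stored about."""
    def __init__(self, name):
        self.name = name

class Relation:
    """A type of relationship that may hold between two entities."""
    def __init__(self, name):
        self.name = name

class Fact:
    """An assertion that a relation holds between two entities."""
    facts = []
    def __init__(self, subject, relation, obj):
        self.subject, self.relation, self.obj = subject, relation, obj
        Fact.facts.append(self)

IS_A = Relation("IS_A")

def holds(subject, relation, obj):
    """True if the fact is stored directly or follows via IS_A inheritance."""
    for f in Fact.facts:
        if f.subject is subject and f.relation is relation and f.obj is obj:
            return True
    # logical inheritance: anything true of a parent is true of the child
    for f in Fact.facts:
        if f.subject is subject and f.relation is IS_A:
            if holds(f.obj, relation, obj):
                return True
    return False

bird, canary, animal = Entity("bird"), Entity("canary"), Entity("animal")
can, fly = Relation("can"), Entity("fly")
Fact(canary, IS_A, bird)
Fact(bird, IS_A, animal)
Fact(bird, can, fly)
print(holds(canary, can, fly))   # True: inherited from bird via IS_A
```

This is the "simple reasoning with very little effort" the description promises: one stored fact about birds yields conclusions about every entity that IS_A bird.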
SNNS
· Web site: www-ra.informatik.uni-tuebingen.de/SNNS/
· FTP site: ftp.informatik.uni-stuttgart.de/pub/SNNS/
Stuttgart Neural Net Simulator (version 4.1). An awesome neural
net simulator. Better than any commercial simulator I’ve seen.
The simulator kernel is written in C (it’s fast!). It supports
over 20 different network architectures, has 2D and 3D X-based
graphical representations, the 2D GUI has an integrated network
editor, and can generate a separate NN program in C. SNNS is
very powerful, though a bit difficult to learn at first. To help
with this it comes with example networks and tutorials for many
of the architectures. ENZO, a supplementary system, allows you
to evolve your networks with genetic algorithms.
SPRLIB/ANNLIB
· Web site: www.ph.tn.tudelft.nl/~sprlib/
SPRLIB (Statistical Pattern Recognition Library) was developed
to support the easy construction and simulation of pattern
classifiers. It consists of a library of functions (written in
C) that can be called from your own program. Most of the
well-known classifiers are present (k-NN, Fisher, Parzen, …),
as well as error estimation and dataset generation routines.
ANNLIB (Artificial Neural Networks Library) is a neural network
simulation library based on the data architecture laid down by
SPRLIB. The library contains numerous functions for creating,
training and testing feedforward networks. Training algorithms
include backpropagation, pseudo-Newton, Levenberg-Marquardt,
conjugate gradient descent, BFGS, and more. Furthermore, because
the data structures are generally applicable, it is possible to
build Kohonen maps and other more exotic network architectures
using the same data types.
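The k-NN classifier mentioned above is simple enough to sketch in plain Python: classify a point by majority vote among its k nearest training samples. The function name and data layout here are illustrative, not SPRLIB's C API:

```python
def knn_classify(train, point, k=3):
    """train: list of (feature_vector, label) pairs.
    Returns the majority label among the k training points nearest to `point`."""
    def dist2(a, b):
        # squared Euclidean distance; the square root never changes the ordering
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda item: dist2(item[0], point))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

train = [([0.0, 0.0], "a"), ([0.1, 0.2], "a"),
         ([0.9, 1.0], "b"), ([1.1, 0.8], "b")]
print(knn_classify(train, [0.2, 0.1]))   # -> "a"
```

A library version like SPRLIB's would add the error-estimation machinery around this core, but the classification rule itself is exactly this vote.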
TOOLDIAG
· Web site: www.inf.ufes.br/~thomas/home/soft.html
· Alt site: http://www.cs.cmu.edu/afs/cs/project/ai-repository/ai/areas/neural/systems/tooldiag/0.html
TOOLDIAG is a collection of methods for statistical pattern
recognition. The main area of application is classification. The
application area is limited to multidimensional continuous
features, without any missing values. No symbolic features
(attributes) are allowed. The program is implemented in the C
programming language and has been tested in several computing
environments.