Inanna (+ Nehep + Annalee + MagiC++)
Inanna, an artificial neural network library, and MagiClib, a common base class library.
Inanna is a relatively small ANN library that I've been developing for a few years for research purposes. Its design is very object-oriented, and it is written in C++ (g++) on Solaris and Linux. It has just gone through heavy changes, and some restructuring remains to be done, so the current release should be considered a VERY ALPHA (developmental) VERSION!!!
Downloads, bug reports (please report them) and support (please use it) are available at the SourceForge project page:
Support requests and bug reports are welcome, and developers are especially welcome. The library is LGPL-licensed, and CVS accounts are available at SourceForge.net.
The OO design is fairly open and generic, which has been my main design goal. Some main features:
The architecture is designed so that user-defined components should be very easy to implement.
You must compile and install MagiClib first, then Inanna.
Inanna uses the GNU "./configure ; make ; make install" build system, although it probably won't go quite that smoothly...
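Concretely, the intended sequence looks something like this. The directory names are assumptions based on the usual tarball layout; adjust paths (and an optional --prefix) to taste.

```shell
# MagiClib must be built and installed first
cd magiclib-*/
./configure
make
make install        # may need root; or pass --prefix=$HOME to ./configure

# then Inanna itself
cd ../inanna-*/
./configure
make
make install
```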
If you try it, PLEASE tell me whether you got it to compile or even to work, and what problems you had (I KNOW you'll have many). I really don't know how easy it is to use...
Inanna has some documentation in PostScript format. See the Files section.
Included in the package is a simple program (in the proj/data subdirectory) for training the XOR and Letters problems. If you have the Proben1 problem dataset, you may want to try the program on those problems as well.
Also included is a research project for auditing predictions. Datasets are not provided. CVS also contains KAuditor, a KDE interface for the auditing library. Although the libraries are LGPL-licensed, KAuditor is GPL-licensed (and is also restricted by the KDE and Qt licenses).
Background history
I developed the previous version for a study involving evolutionary design of neural networks. That part of the project is currently broken because of various library incompatibilities, but maybe it will work again some day. Previously Inanna used SNNS for training, but it now implements two training algorithms of its own: backpropagation and resilient backpropagation (Rprop). Not quite as fast as SNNS, but almost (I'm working on that).
Author: Marko Gronroos (firstname.lastname@example.org, email@example.com)