Using FannJ without creating an ANN first


brandstaetter

Mar 1, 2011, 8:41:17 AM
to FannJ
I would like to use FannJ and FANN to create, train, and use a neural
network. I noticed that there is a Trainer class, but it still needs a
Fann object as a parameter. Is there a way to create an empty ANN for
the trainer first, or should I simply use JNI myself and talk directly
to FANN?

Kyle Renfro

Mar 1, 2011, 9:32:41 AM
to fa...@googlegroups.com
Creation and training of an ANN from FannJ has not been implemented.  If you end up writing that code, consider contributing it.

I use FannTool (http://code.google.com/p/fanntool) to create and train the ANNs that I use with FannJ.

-Kyle

Hannes Brandstätter-Müller

Mar 1, 2011, 9:35:19 AM
to fa...@googlegroups.com
Ok, I'll try to write that code.

Do you have any pointers / hints for me so I don't start off in the
wrong direction?

I tried FannTool, but it did not start on my machine...

Daniel Thomas

Mar 1, 2011, 9:36:17 AM
to fa...@googlegroups.com
I have a patch in progress to provide some useful related functionality
and I might add that as well.

Remind me to submit it.

Daniel


Kyle Renfro

Mar 1, 2011, 9:46:13 AM
to fa...@googlegroups.com
Make sure you are using code/docs for FANN 2.1.0beta.  
If you hit a road-block let me know.
I'll get any patches submitted quickly.

-Kyle

Kyle Renfro

Mar 1, 2011, 9:47:24 AM
to fa...@googlegroups.com
This is your official reminder.

I'll get any patches applied quickly.

thanks,
Kyle

Daniel Thomas

Mar 1, 2011, 9:51:18 AM
to fa...@googlegroups.com

> I tried fanntool, but it did not start on my machine...

I think I ran into that problem as well - I believe it is distributed as
a 64-bit binary along with the source code, so you need to recompile it
using Code::Blocks.

To automatically create Fanns with FannJ, I create a Fann with a
specified layer structure and then use Trainer to train it. I have also
added a test function (in my patch) that lets you use a separate test
file to compute a more accurate MSE.
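For anyone following along, the approach Daniel describes looks roughly like this with the FannJ API of that era (the training file name and network shape are illustrative, not from this thread):

```java
import com.googlecode.fannj.ActivationFunction;
import com.googlecode.fannj.Fann;
import com.googlecode.fannj.Layer;
import com.googlecode.fannj.Trainer;

import java.util.ArrayList;
import java.util.List;

public class XorTrainingSketch {
    public static void main(String[] args) {
        // Build a 2-3-1 network; layer sizes and activation functions
        // are just example values for the XOR problem.
        List<Layer> layers = new ArrayList<Layer>();
        layers.add(Layer.create(2));
        layers.add(Layer.create(3, ActivationFunction.FANN_SIGMOID_SYMMETRIC));
        layers.add(Layer.create(1, ActivationFunction.FANN_SIGMOID_SYMMETRIC));
        Fann fann = new Fann(layers);

        // Train from a FANN-format data file (hypothetical path):
        // max epochs, epochs between reports, desired error.
        Trainer trainer = new Trainer(fann);
        float mse = trainer.train("xor.data", 500000, 1000, 0.001f);
        System.out.println("MSE: " + mse);

        // Run the trained network on one input vector.
        System.out.println(fann.run(new float[]{ -1f, 1f })[0]);
        fann.close();
    }
}
```

Note that FannJ is a JNA binding, so the native FANN shared library must be installed and findable for this to run at all.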

Daniel


Hannes Brandstätter-Müller

Mar 1, 2011, 10:25:03 AM
to fa...@googlegroups.com
So, I checked out the source via svn into my Eclipse workspace.

First errors: I added JUnit 4 as a reference, which fixed those. Then I
downloaded and installed JNA and added the .jar as a referenced library.

Now no errors are shown anymore. But when I try to run the JUnit
tests, it looks like they cannot find the resources (*.data and *.net
files) - what did I do wrong?

Thanks, JUnit noob here :D

Hannes

Daniel Thomas

Mar 1, 2011, 11:25:12 AM
to fa...@googlegroups.com
Hello,

The files it is looking for are not where it expects them to be, but
they are in the source at src/test/resources.
I suspect there is some Maven magic that makes this work neatly, but I
just did the ugly hack of copying them from src/test/resources into
the correct location in bin/.
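The "Maven magic" is just the test classpath: Maven copies everything under src/test/resources into target/test-classes, which is on the classpath when the tests run, so tests can load the files by resource name instead of a filesystem path. A minimal sketch of the mechanism (the resource name here is hypothetical):

```java
import java.io.InputStream;

public class ResourceLookup {
    public static void main(String[] args) throws Exception {
        // A leading "/" resolves the name from the classpath root, so
        // src/test/resources/xor.data becomes "/xor.data" at test time.
        // Run outside such a build, the lookup simply returns null.
        InputStream in = ResourceLookup.class.getResourceAsStream("/xor.data");
        System.out.println(in == null ? "not on classpath" : "found");
        if (in != null) {
            in.close();
        }
    }
}
```

This is why adding src/test/resources as a source folder in Eclipse (as suggested below in the thread) fixes the problem without copying files around: Eclipse then puts those files on the classpath itself.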

I hope that this helps,

Daniel


Kyle Renfro

Mar 1, 2011, 11:28:31 AM
to fa...@googlegroups.com
Make sure src/test/resources is added to the 'Source' tab of 'Java Build Path' in the project properties.

-Kyle

Kyle Renfro

Mar 1, 2011, 11:44:13 AM
to fa...@googlegroups.com
I just tried to check out and build, and the tests failed using the latest version of fann-2.1.0beta.
Fann was complaining about the xor_float.net file.

I have just committed the latest xor_float.net file from the examples directory of FANN. If you are having trouble getting the tests to pass, please check out the latest source.

-Kyle

Daniel Thomas

Mar 2, 2011, 4:48:08 AM
to fa...@googlegroups.com
Here are the patches: one to add the test function and one to close() on
finalization (this one might not be appropriate, though).

Daniel

0001-Add-test-1-function-to-Trainer.patch
0002-Add-finalize-method-to-Fann.patch

Kyle Renfro

Mar 2, 2011, 6:11:14 PM
to fa...@googlegroups.com
Patches applied. 

Is this project something you use often?

thanks!
Kyle

Hannes Brandstätter-Müller

Mar 3, 2011, 3:42:09 AM
to fa...@googlegroups.com
On Thu, Mar 3, 2011 at 00:11, Kyle Renfro <kre...@real-comp.com> wrote:
> Patches applied.
> Is this project something you use often?
> thanks!
> Kyle

Thanks for the quick patching. I'll check whether it now works the way I
need it to. I have some more ideas for fixes, but first a small typo
fix:
(you misspelled "neurons" twice in the Javadoc comments in Layer.java)

I also have no idea why the patch is so long/complicated for those two
small fixes. Stupid Eclipse. Also, the JUnit tests work now if I copy
the data files to the correct directory.

Index: src/main/java/com/googlecode/fannj/Layer.java
===================================================================
--- src/main/java/com/googlecode/fannj/Layer.java (revision 45)
+++ src/main/java/com/googlecode/fannj/Layer.java (working copy)
@@ -26,50 +26,52 @@
*/
public class Layer extends ArrayList<Neuron> {

- private static final long serialVersionUID = -6467294440860703773L;
+ private static final long serialVersionUID = -6467294440860703773L;

- /**
- * Create a Layer with the specified number of neurons with the default
- * Activation Function: {@link Neuron.DEFAULT_ACTIVATION_FUNCTION} with
- * steepness: {@link Neuron.DEFAULT_ACTIVATION_STEEPNESS}
- *
- * @param numNeurons
- * @return
- */
- public static Layer create(int numNeurons) {
- return create(numNeurons, Neuron.DEFAULT_ACTIVATION_FUNCTION,
- Neuron.DEFAULT_ACTIVATION_STEEPNESS);
- }
+ /**
+ * Create a Layer with the specified number of neurons with the default
+ * Activation Function: {@link Neuron.DEFAULT_ACTIVATION_FUNCTION} with
+ * steepness: {@link Neuron.DEFAULT_ACTIVATION_STEEPNESS}
+ *
+ * @param numNeurons
+ * @return
+ */
+ public static Layer create(int numNeurons) {
+ return create(numNeurons, Neuron.DEFAULT_ACTIVATION_FUNCTION,
+ Neuron.DEFAULT_ACTIVATION_STEEPNESS);
+ }

- /**
- * Create a Layer with the specified number of neruons and a particular
- * ActivationFunction with the steepness:
- * {@link Neuron.DEFAULT_ACTIVATION_STEEPNESS}
- *
- * @param numNeurons
- * @param activationFunction
- * @return
- */
- public static Layer create(int numNeurons, ActivationFunction
activationFunction) {
+ /**
+ * Create a Layer with the specified number of neurons and a particular
+ * ActivationFunction with the steepness:
+ * {@link Neuron.DEFAULT_ACTIVATION_STEEPNESS}
+ *
+ * @param numNeurons
+ * @param activationFunction
+ * @return
+ */
+ public static Layer create(int numNeurons,
+ ActivationFunction activationFunction) {

- return create(numNeurons, activationFunction,
Neuron.DEFAULT_ACTIVATION_STEEPNESS);
- }
+ return create(numNeurons, activationFunction,
+ Neuron.DEFAULT_ACTIVATION_STEEPNESS);
+ }

- /**
- * Create a Layer with the specified number of neruons and a particular
- * ActivationFunction with specified steepness
- *
- * @param numNeurons
- * @param activationFunction
- * @param steepness
- * @return
- */
- public static Layer create(int numNeurons, ActivationFunction
activationFunction,
- float steepness) {
+ /**
+ * Create a Layer with the specified number of neurons and a particular
+ * ActivationFunction with specified steepness
+ *
+ * @param numNeurons
+ * @param activationFunction
+ * @param steepness
+ * @return
+ */
+ public static Layer create(int numNeurons,
+ ActivationFunction activationFunction, float steepness) {

- Layer layer = new Layer();
- for (int i = 0; i < numNeurons; i++)
- layer.add(new Neuron(activationFunction, steepness));
- return layer;
- }
+ Layer layer = new Layer();
+ for (int i = 0; i < numNeurons; i++)
+ layer.add(new Neuron(activationFunction, steepness));
+ return layer;
+ }
}

Hannes Brandstätter-Müller

Mar 3, 2011, 4:04:22 AM
to fa...@googlegroups.com
On Thu, Mar 3, 2011 at 09:42, Hannes Brandstätter-Müller
<hannes....@gmail.com> wrote:

> I also have no idea why the patch is that long/complicated for those 2
> small fixes. Stupid eclipse.

Ah yes, I found out why the diffs are so big. Which code formatting do
you use? Do you have a standard?

Kyle Renfro

Mar 3, 2011, 9:24:11 AM
to fa...@googlegroups.com
Great. I use Sun's Java Coding Conventions. 

thanks,
-Kyle

Kyle Renfro

Mar 3, 2011, 9:25:59 AM
to fa...@googlegroups.com
I'm sure you are right about the "maven magic", as the tests just run for me. Did you try to set up your Eclipse project as described earlier? I don't think you should have to copy files around to get the build to work.

-Kyle

Hannes Brandstätter-Müller

Mar 3, 2011, 9:43:55 AM
to fa...@googlegroups.com
Okay, I use the built-in Java Conventions from Eclipse - I noticed
some small differences from the checked-in code, e.g. sometimes the
indents are done with tabs (see Fann.java).

-> I'll use the settings you mentioned and submit my patches, even if
some "untouched" code always ends up in the diff too.

Hannes

Kyle Renfro

Mar 3, 2011, 9:48:00 AM
to fa...@googlegroups.com
Ah! There was a tab. It was killed.

-K

Kyle Renfro

Mar 3, 2011, 9:50:15 AM
to fa...@googlegroups.com
Geez, more tabs.

Hannes Brandstätter-Müller

Mar 3, 2011, 10:33:31 AM
to fa...@googlegroups.com
Well, I added some small things in preparation for further extensions.
I'm trying to get the cascade training to work, but I have not figured
out the correct parameters yet, so I commented out the unit test
function. I'll do more next week.

Hannes

patch.txt

Kyle Renfro

Mar 3, 2011, 11:21:28 AM
to fa...@googlegroups.com
would you like commit privileges?

-Kyle

Hannes Brandstätter-Müller

Mar 3, 2011, 2:33:20 PM
to fa...@googlegroups.com
Thanks, but I'd like to "test it out" first by sending you the
patches, unless that's too much hassle for you. Perhaps in a few weeks,
once I have also used the code in our project...

Hannes

Hannes Brandstätter-Müller

Mar 8, 2011, 5:06:40 AM
to fa...@googlegroups.com
Ha, fresh week, fresh brain. I found a stupid error which prevented the
test from running. Yes, you need the proper number of input neurons
too *headdesk*

Well, I've attached the patch again; this time it should work and
support the cascade2 algorithm for creating an ANN without specifying
the hidden layers first, including a first training file.
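For anyone curious what that looks like, the cascade route is roughly as follows. The Java names here (FannShortcut, cascadeTrain) mirror the underlying C calls fann_create_shortcut and fann_cascadetrain_on_file and are assumptions based on this patch, not a documented API; the file name and parameter values are illustrative:

```java
import com.googlecode.fannj.Fann;
import com.googlecode.fannj.FannShortcut;
import com.googlecode.fannj.Trainer;

public class CascadeSketch {
    public static void main(String[] args) {
        // A shortcut network starts with only input and output neurons;
        // cascade training then adds hidden neurons one at a time,
        // so no hidden-layer structure needs to be specified up front.
        Fann fann = new FannShortcut(2, 1); // 2 inputs, 1 output
        Trainer trainer = new Trainer(fann);

        // max neurons to add, neurons between reports, desired error.
        float mse = trainer.cascadeTrain("xor.data", 30, 1, 0.001f);
        System.out.println("MSE after cascade training: " + mse);
        fann.close();
    }
}
```

As with the plain training example, this needs the native FANN 2.1 library installed, since FannJ only wraps it via JNA.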

Hannes

patch.txt