Do you have any pointers / hints for me so I don't start off in the
wrong direction?
I tried fanntool, but it did not start on my machine...
I think I ran into that problem as well - I believe it is distributed as
a 64-bit binary together with the source code, so you need to recompile
it yourself using Code::Blocks.
To automatically create FANNs with fannj, I create a Fann with a
specified layer structure and then use a Trainer to train it. I have
also added a test function (in my patch) which lets you use a separate
test file to compute a more accurate MSE.
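The workflow Daniel describes looks roughly like the sketch below, based on fannj's documented API (Layer.create, Fann, Trainer.train). It needs the fannj and JNA jars plus the native FANN library installed, and the training-file path and network sizes are just placeholder assumptions:

```java
import com.googlecode.fannj.ActivationFunction;
import com.googlecode.fannj.Fann;
import com.googlecode.fannj.Layer;
import com.googlecode.fannj.Trainer;

import java.util.ArrayList;
import java.util.List;

public class FannjSketch {
    public static void main(String[] args) {
        // Specify the layer structure up front: 2 inputs, 3 hidden, 1 output.
        List<Layer> layers = new ArrayList<>();
        layers.add(Layer.create(2));
        layers.add(Layer.create(3, ActivationFunction.FANN_SIGMOID_SYMMETRIC));
        layers.add(Layer.create(1, ActivationFunction.FANN_SIGMOID_SYMMETRIC));

        Fann fann = new Fann(layers);
        Trainer trainer = new Trainer(fann);
        // train(trainingFile, maxEpochs, epochsBetweenReports, desiredError);
        // "xor.data" is a placeholder path in FANN's training-file format.
        float mse = trainer.train("xor.data", 500000, 1000, 0.001f);
        System.out.println("MSE: " + mse);
        fann.close();
    }
}
```

The returned value is the MSE on the training data; the separate test function Daniel mentions would compute it on an independent test file instead.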
Daniel
First errors: added JUnit 4 as a reference, which fixed that. Then I
downloaded and installed JNA and added the .jar as a referenced library.
Now no errors are shown anymore. But when I try to run the JUnit
test, it looks like it does not find the resources (*.data and *.net
files) - what did I do wrong?
Thanks, JUnit noob here :D
Hannes
The files it is looking for are not where it expects them to be, but
they are in the source tree: src/test/resources
I suspect that there is some Maven magic that makes this work neatly,
but I just did the ugly hack of copying them from src/test/resources
into the corresponding location in bin/
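The "ugly hack" is just a recursive copy of the test resources next to the compiled classes. A minimal, self-contained sketch of that copy step (the src/test/resources and bin/ paths are the ones from this thread; a Maven build would normally put the resources on the test classpath for you):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class CopyTestResources {

    /** Recursively copy every file and directory under 'from' into 'to'. */
    public static void copyTree(Path from, Path to) throws IOException {
        try (var paths = Files.walk(from)) {
            for (Path src : (Iterable<Path>) paths::iterator) {
                Path dst = to.resolve(from.relativize(src).toString());
                if (Files.isDirectory(src)) {
                    Files.createDirectories(dst);
                } else {
                    Files.copy(src, dst, StandardCopyOption.REPLACE_EXISTING);
                }
            }
        }
    }

    public static void main(String[] args) throws IOException {
        Path from = Path.of("src/test/resources");
        Path to = Path.of("bin");
        // Only copy when run from a checkout that actually has the resources.
        if (Files.isDirectory(from)) {
            copyTree(from, to);
        }
    }
}
```

Running `mvn test` instead would make this unnecessary, since Maven copies src/test/resources onto the test classpath automatically.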
I hope that this helps,
Daniel
Daniel
Thanks for the quick patching. I'll check whether it now works the way
I need it to. I have some more ideas for fixes, but first a small typo
fix:
(you misspelled "neurons" twice in the Javadoc comments in Layer.java)
I also have no idea why the patch is that long/complicated for those 2
small fixes. Stupid Eclipse. Also, the JUnit tests work now if I copy
the data files to the correct directory.
Index: src/main/java/com/googlecode/fannj/Layer.java
===================================================================
--- src/main/java/com/googlecode/fannj/Layer.java (revision 45)
+++ src/main/java/com/googlecode/fannj/Layer.java (working copy)
@@ -26,50 +26,52 @@
*/
public class Layer extends ArrayList<Neuron> {
- private static final long serialVersionUID = -6467294440860703773L;
+ private static final long serialVersionUID = -6467294440860703773L;
- /**
- * Create a Layer with the specified number of neurons with the default
- * Activation Function: {@link Neuron.DEFAULT_ACTIVATION_FUNCTION} with
- * steepness: {@link Neuron.DEFAULT_ACTIVATION_STEEPNESS}
- *
- * @param numNeurons
- * @return
- */
- public static Layer create(int numNeurons) {
- return create(numNeurons, Neuron.DEFAULT_ACTIVATION_FUNCTION,
- Neuron.DEFAULT_ACTIVATION_STEEPNESS);
- }
+ /**
+ * Create a Layer with the specified number of neurons with the default
+ * Activation Function: {@link Neuron.DEFAULT_ACTIVATION_FUNCTION} with
+ * steepness: {@link Neuron.DEFAULT_ACTIVATION_STEEPNESS}
+ *
+ * @param numNeurons
+ * @return
+ */
+ public static Layer create(int numNeurons) {
+ return create(numNeurons, Neuron.DEFAULT_ACTIVATION_FUNCTION,
+ Neuron.DEFAULT_ACTIVATION_STEEPNESS);
+ }
- /**
- * Create a Layer with the specified number of neruons and a particular
- * ActivationFunction with the steepness:
- * {@link Neuron.DEFAULT_ACTIVATION_STEEPNESS}
- *
- * @param numNeurons
- * @param activationFunction
- * @return
- */
- public static Layer create(int numNeurons, ActivationFunction activationFunction) {
+ /**
+ * Create a Layer with the specified number of neurons and a particular
+ * ActivationFunction with the steepness:
+ * {@link Neuron.DEFAULT_ACTIVATION_STEEPNESS}
+ *
+ * @param numNeurons
+ * @param activationFunction
+ * @return
+ */
+ public static Layer create(int numNeurons,
+ ActivationFunction activationFunction) {
- return create(numNeurons, activationFunction, Neuron.DEFAULT_ACTIVATION_STEEPNESS);
- }
+ return create(numNeurons, activationFunction,
+ Neuron.DEFAULT_ACTIVATION_STEEPNESS);
+ }
- /**
- * Create a Layer with the specified number of neruons and a particular
- * ActivationFunction with specified steepness
- *
- * @param numNeurons
- * @param activationFunction
- * @param steepness
- * @return
- */
- public static Layer create(int numNeurons, ActivationFunction activationFunction,
- float steepness) {
+ /**
+ * Create a Layer with the specified number of neurons and a particular
+ * ActivationFunction with specified steepness
+ *
+ * @param numNeurons
+ * @param activationFunction
+ * @param steepness
+ * @return
+ */
+ public static Layer create(int numNeurons,
+ ActivationFunction activationFunction, float steepness) {
- Layer layer = new Layer();
- for (int i = 0; i < numNeurons; i++)
- layer.add(new Neuron(activationFunction, steepness));
- return layer;
- }
+ Layer layer = new Layer();
+ for (int i = 0; i < numNeurons; i++)
+ layer.add(new Neuron(activationFunction, steepness));
+ return layer;
+ }
}
> I also have no idea why the patch is that long/complicated for those 2
> small fixes. Stupid eclipse.
Ah yes, I found out why the diffs are so big. Which code-formatting
settings do you use? Do you have a standard?
-> I'll use the mentioned settings and submit my patches, even if some
"untouched" code always ends up in the diff too
Hannes
Well, I've attached the patch again; this time it should work. It adds
support for the cascade2 algorithm for creating an ANN without
specifying the hidden layers first, including a first training file.
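Cascade training in FANN (fann_cascadetrain_on_file in the C library) starts from a "shortcut" network with no hidden layers and grows hidden neurons one at a time during training. A hypothetical sketch of what using Hannes's patch might look like; the names FannShortcut and cascadeTrain are assumptions about the patched fannj API, not confirmed by this thread, and the native FANN library must be installed:

```java
import com.googlecode.fannj.Fann;
import com.googlecode.fannj.FannShortcut;
import com.googlecode.fannj.Trainer;

public class CascadeSketch {
    public static void main(String[] args) {
        // Shortcut network: only inputs and outputs are specified;
        // hidden neurons are added by the cascade algorithm itself.
        // (FannShortcut is an assumed class name for the patched API.)
        Fann fann = new FannShortcut(2, 1);

        Trainer trainer = new Trainer(fann);
        // cascadeTrain(trainingFile, maxNeurons, neuronsBetweenReports,
        //              desiredError) - an assumed method name mirroring
        //              FANN's fann_cascadetrain_on_file parameters.
        float mse = trainer.cascadeTrain("xor.data", 30, 1, 0.001f);
        System.out.println("MSE after cascade training: " + mse);
        fann.close();
    }
}
```

The practical appeal is exactly what the thread describes: you no longer have to guess a hidden-layer structure up front.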
Hannes