The ANN business with Shen: A public (simple) demo


Antti Ylikoski

May 24, 2016, 11:10:37 AM5/24/16
to Shen
I have been working on Artificial Neural Network (ANN) applications of
Shen.

The first software that will be commercial is approaching completion.

I include one simple ANN demo here: the McCulloch-Pitts neuron demo
from the Luger AI textbook, for an introduction to the ANN business.

The comments explain the usage of the file.  This ANN program uses
vectors; later ANN programs of mine apply lists instead.
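For readers new to the area: a McCulloch-Pitts unit is just a weighted sum of its inputs passed through a hard threshold. A minimal sketch in Python (illustrative only; this is not the Shen code in mcc-p.shen, and all names are mine):

```python
def mcp_neuron(inputs, weights, threshold):
    # McCulloch-Pitts unit: fire (+1) iff the weighted input sum
    # reaches the threshold; here -1 codes "not firing" (bipolar).
    activation = sum(w * x for w, x in zip(weights, inputs))
    return 1 if activation >= threshold else -1

# Logical AND over bipolar inputs: fires only when both inputs fire.
def bipolar_and(a, b):
    return mcp_neuron([a, b], [1, 1], 2)

print(bipolar_and(1, 1), bipolar_and(1, -1))   # -> 1 -1
```

By picking other weights and thresholds, the same unit computes OR, NOT, and so on, which is the sense in which such units cover the logical functions.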

All sorts of problems have occurred.  First there were the problems
with the Hopfield ANNs that I have mentioned here; then a more serious
problem arose, namely:

The question of optimizing NP-complete problems with Hopfield ANNs is
a research problem.  Can I give the customer some Hopfield ANN
software, and promise that he/she can optimize his/her NP-complete
optimization problems with it?

I came to the conclusion that most probably I cannot do so.

Anyway, if someone wants to see an intelligent application program
written in Shen, here is one (simple) example.

yours, A. J. Y.
HELSINKI
Finland, the EU

PS.  The "problems" with the Hopfield ANNs were solved, partly thanks
to the smart Shen group.  But the real problem is that optimizing
NP-complete problems with Hopfield ANNs is, in my opinion, as I said,
a research question.

PPS.  The McCulloch-Pitts program is not type-secure -- because the
defstruct library that I used is not type-secure.  Is the SP defstruct
type-secure?


mcc-p.shen

Mark Tarver

May 24, 2016, 3:52:50 PM5/24/16
to Shen
Perhaps if you put up a script showing your program in action, or even better a video screen capture, it might help people appreciate the potential.

Mark

Mark Tarver

May 24, 2016, 3:53:41 PM5/24/16
to Shen
SP does not use defstruct, though of course it can be loaded.

Mark

Antti Ylikoski

May 24, 2016, 4:33:33 PM5/24/16
to Shen
Can you recommend some other functionality that I could use instead of the non-type-secure defstruct.shen that I have used?

I could, I think, write something to replace it myself.  At least two possibilities come to mind.

A. J. Y.

Antti Ylikoski

May 24, 2016, 4:54:42 PM5/24/16
to Shen

Thank you for the advice: Below is a screen shot from a demo run:

------------------------------------------------------------

antti@antti-HP-630-Notebook-PC:~/ShenANN$ ./Shen

Shen, copyright (C) 2010-2015 Mark Tarver
running under Common Lisp, implementation: CLisp
port 1.9 ported by Mark Tarver


(0-) (load "defstruct.shen")
defstruct.type#struct-type
defstruct.char-upcase
defstruct.string-capitalize
defstruct.sym-capitalize
defstruct.slot-type
defstruct.slots-types
defstruct.slots-defs
defstruct.datatypes
defstruct.accessors
defstruct.setters
defstruct.constructor-type
defstruct.constr-init
defstruct.constructor
defstruct.struct-aux
defstruct-macro

run time: 0.7919999919831753 secs
loaded

(1-) (load "mcc-p.shen")
type#neuron1
mk-neuron1
neuron1-neuron-output->
neuron1-threshold-function->
neuron1-activation-level->
neuron1-weights-vec->
neuron1-inputs-vec->
neuron1-nr-inputs->
neuron1-neuron-output
neuron1-threshold-function
neuron1-activation-level
neuron1-weights-vec
neuron1-inputs-vec
neuron1-nr-inputs
transfer-function
activation-level-aux
activation-level-aux-h
tres-f
<3 <... ... ...> <... ... ...> 0 #<FUNCTION LAMBDA (V1616) (tres-f V1616)> 0>
<3 <0 0 1> <... ... ...> 0 #<FUNCTION LAMBDA (V1616) (tres-f V1616)> 0>
<3 <0 0 1> <-1 1 0> 0 #<FUNCTION LAMBDA (V1616) (tres-f V1616)> 0>
ann-demo

run time: 1.4439998865127563 secs
loaded

(2-) (ann-demo)

Give A: 1

Give B: 1

Output of neuron: 1
[]

(3-) (ann-demo)

Give A: 1

Give B: 0

Output of neuron: -1
[]

(4-) (ann-demo)

Give A: 0

Give B: 1

Output of neuron: 1
[]

(5-) (ann-demo)

Give A: 0

Give B: 0

Output of neuron: 1
[]

(6-) (QUIT)
antti@antti-HP-630-Notebook-PC:~/ShenANN$

------------------------------------------------------------

The ANN movement began with McCulloch and Pitts demonstrating that
their ANNs can compute all logical functions.

But compared e.g. to backpropagation ANNs, these McCulloch-Pitts
ANNs are rather simple.  Much better stuff is on the way!

A. J. Y.

Mark Tarver

May 25, 2016, 1:27:40 PM5/25/16
to qil...@googlegroups.com
Any defstruct can be replaced by a tuple, and recognisor/selector/constructor functions are straightforward.  You can either code these yourself or write a macro to do the job.
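Mark's recipe, a tagged tuple plus recognisor/selector/constructor functions in place of a defstruct, might look as follows. This is sketched in Python rather than Shen for brevity, and every name in it is illustrative:

```python
# Constructor: the "struct" is just a tagged tuple.
def mk_neuron(weights, threshold):
    return ("neuron", weights, threshold)

# Recognisor: test whether a value is one of our neuron tuples.
def neuron_p(x):
    return isinstance(x, tuple) and len(x) == 3 and x[0] == "neuron"

# Selectors: one accessor per slot.
def neuron_weights(n):
    return n[1]

def neuron_threshold(n):
    return n[2]

n = mk_neuron([1, 1], 2)
print(neuron_p(n), neuron_weights(n))   # -> True [1, 1]
```

In Shen the same shape can be expressed with vectors or @p pairs, and a macro can generate the accessor boilerplate, as Mark suggests.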

Mark

--
You received this message because you are subscribed to the Google Groups "Shen" group.
To unsubscribe from this group and stop receiving emails from it, send an email to qilang+un...@googlegroups.com.
To post to this group, send email to qil...@googlegroups.com.
Visit this group at https://groups.google.com/group/qilang.
For more options, visit https://groups.google.com/d/optout.

Antti Ylikoski

Jun 5, 2016, 9:48:30 AM6/5/16
to Shen

I decided to post in this group one more demo of doing ANNs with Shen.

This ANN has been implemented with vectors; later on I decided to
use lists.

This one is the Bidirectional Associative Memory (BAM).  The ANN
associates a list of X(i) values with a list of Y(i) values in both
directions, i.e. X(i) ==> Y(i) and Y(i) ==> X(i).

See the George F Luger textbook, from page 460 onward.
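For readers without the textbook, the arithmetic behind a BAM is compact: the weight matrix is the sum of outer products of the stored (X, Y) pairs, and recall multiplies by the matrix (or its transpose) and thresholds. A NumPy sketch, not the Shen code, using the same two exemplar pairs as the run below:

```python
import numpy as np

# The two bipolar exemplar pairs stored in the demo network.
x1, y1 = np.array([1, -1, -1, -1]), np.array([1, 1, 1])
x2, y2 = np.array([-1, -1, -1, 1]), np.array([1, -1, 1])

# Hebbian weight matrix: the sum of the outer products y_i x_i^T.
W = np.outer(y1, x1) + np.outer(y2, x2)

def sign(v):
    # Bipolar hard threshold (ties broken towards +1).
    return np.where(v >= 0, 1, -1)

def recall_y(x):            # X layer -> Y layer
    return sign(W @ x)

def recall_x(y):            # Y layer -> X layer
    return sign(W.T @ y)

print(recall_y(x1))   # -> [1 1 1]
print(recall_x(y2))   # -> [-1 -1 -1  1]
```

The matrix W computed this way matches the one printed in the transcript below, rows <0 -2 -2 0>, <2 0 0 -2>, <0 -2 -2 0>.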

Below is a sample run.  The reader really needs the textbook to fully
understand everything that is going on.  And yes, there is a little
typo of mine in the code: "treshold" instead of "threshold".

So this ANN business is going to be one of the first commercial
applications of the SP.


------------------------------------------------------------

antti@antti-HP-630-Notebook-PC:~/ShenANN$ ./Shen

Shen, copyright (C) 2010-2015 Mark Tarver
running under Common Lisp, implementation: CLisp
port 1.9 ported by Mark Tarver


(0-) (load "defstruct.shen")
defstruct.type#struct-type
defstruct.char-upcase
defstruct.string-capitalize
defstruct.sym-capitalize
defstruct.slot-type
defstruct.slots-types
defstruct.slots-defs
defstruct.datatypes
defstruct.accessors
defstruct.setters
defstruct.constructor-type
defstruct.constr-init
defstruct.constructor
defstruct.struct-aux
defstruct-macro

run time: 0.8000000007450581 secs
loaded

(1-) (load "tc_for.shen")
for-expand
for-macro

run time: 0.07599997520446777 secs
loaded

(2-) (load "array.shen")
outer-product
scalar-mult
output-array
array-sum
transpose
output-array-v

run time: 0.30400002002716064 secs
loaded

(3-) (load "BAM.shen")
type#bam
mk-bam
bam-outputs-vec->
bam-nr-outputs->
bam-neuron-output->
bam-treshold-function->
bam-activation-level->
bam-weights-vec->
bam-inputs-vec->
bam-nr-inputs->
bam-neuron-name->
bam-outputs-vec
bam-nr-outputs
bam-neuron-output
bam-treshold-function
bam-activation-level
bam-weights-vec
bam-inputs-vec
bam-nr-inputs
bam-neuron-name
[]
add-neuron-to-oblist
treshold
transfer-function
activation-level
activation-level-h
<"X1" 3 <... ... ...> <... ... ...> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 3 <... ... ...>>
<"X2" 3 <... ... ...> <... ... ...> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 3 <... ... ...>>
<"X3" 3 <... ... ...> <... ... ...> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 3 <... ... ...>>
<"X4" 3 <... ... ...> <... ... ...> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 3 <... ... ...>>
<"Y1" 4 <... ... ... ...> <... ... ... ...> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 4 <... ... ... ...>>
<"Y2" 4 <... ... ... ...> <... ... ... ...> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 4 <... ... ... ...>>
<"Y3" 4 <... ... ... ...> <... ... ... ...> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 4 <... ... ... ...>>
[]
[]
[]
[]
[]
[]
[]
<(@p <1 -1 -1 -1> <1 1 1>) (@p <-1 -1 -1 1> <1 -1 1>)>
<<1 -1 -1 -1> <1 -1 -1 -1> <1 -1 -1 -1>>
<<-1 -1 -1 1> <1 1 1 -1> <-1 -1 -1 1>>
<<0 -2 -2 0> <2 0 0 -2> <0 -2 -2 0>>
<<0 2 0> <-2 0 -2> <-2 0 -2> <0 -2 0>>

<<0 -2 -2 0>
 <2 0 0 -2>
 <0 -2 -2 0>
>
[]

<<0 2 0>
 <-2 0 -2>
 <-2 0 -2>
 <0 -2 0>
>
[]
<"Y1" 4 <... ... ... ...> <0 -2 -2 0> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 4 <... ... ... ...>>
<"Y2" 4 <... ... ... ...> <2 0 0 -2> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 4 <... ... ... ...>>
<"Y3" 4 <... ... ... ...> <0 -2 -2 0> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 4 <... ... ... ...>>
<"X1" 3 <... ... ...> <0 2 0> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 3 <... ... ...>>
<"X2" 3 <... ... ...> <-2 0 -2> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 3 <... ... ...>>
<"X3" 3 <... ... ...> <-2 0 -2> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 3 <... ... ...>>
<"X4" 3 <... ... ...> <0 -2 0> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 3 <... ... ...>>
<"X1" 3 <... ... ...> <0 2 0> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 3 <... ... ...>>
<"X1" 3 <... ... ...> <0 2 0> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 3 <(@p "Y1" 1) (@p "Y2" 1) (@p "Y3" 1)>>
<"X2" 3 <... ... ...> <-2 0 -2> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 3 <... ... ...>>
<"X2" 3 <... ... ...> <-2 0 -2> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 3 <(@p "Y1" 2) (@p "Y2" 2) (@p "Y3" 2)>>
<"X3" 3 <... ... ...> <-2 0 -2> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 3 <... ... ...>>
<"X3" 3 <... ... ...> <-2 0 -2> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 3 <(@p "Y1" 3) (@p "Y2" 3) (@p "Y3" 3)>>
<"X4" 3 <... ... ...> <0 -2 0> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 3 <... ... ...>>
<"X4" 3 <... ... ...> <0 -2 0> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 3 <(@p "Y1" 4) (@p "Y2" 4) (@p "Y3" 4)>>
<"Y1" 4 <... ... ... ...> <0 -2 -2 0> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 4 <... ... ... ...>>
<"Y1" 4 <... ... ... ...> <0 -2 -2 0> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 4 <(@p "X1" 1) (@p "X2" 1) (@p "X3" 1) (@p "X4" 1)>>
<"Y2" 4 <... ... ... ...> <2 0 0 -2> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 4 <... ... ... ...>>
<"Y2" 4 <... ... ... ...> <2 0 0 -2> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 4 <(@p "X1" 2) (@p "X2" 2) (@p "X3" 2) (@p "X4" 2)>>
<"Y3" 4 <... ... ... ...> <0 -2 -2 0> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 4 <... ... ... ...>>
<"Y3" 4 <... ... ... ...> <0 -2 -2 0> 0 #<FUNCTION LAMBDA (V1810) #'(LAMBDA (V1811) (treshold V1810 V1811))> 0 4 <(@p "X1" 3) (@p "X2" 3) (@p "X3" 3) (@p "X4" 3)>>
send-outputs
find-neuron
find-neuron-h
test-bam-x
list2vec
list2vec-h
test-bam-y

run time: 2.6360000371932983 secs
loaded

(4-) (test-bam-x)  \\ See the x1 <-> y1, x2 <-> y2 values Luger p. 460

Give a sequence of 4 Xi neuron values: 1 -1 -1 -1

Numbers: [1 -1 -1 -1]

Vector: <1 -1 -1 -1>


Output of ANN: [1 1 1]
[]

(5-) (test-bam-y)

Give a sequence of 3 Yi neuron values: 1 1 1

Numbers: [1 1 1]

Vector: <1 1 1>


Output of ANN: [1 -1 -1 -1]
[]

(6-) (test-bam-x)

Give a sequence of 4 Xi neuron values: -1 -1 -1 1

Numbers: [-1 -1 -1 1]

Vector: <-1 -1 -1 1>


Output of ANN: [1 -1 1]
[]

(7-) (test-bam-y)

Give a sequence of 3 Yi neuron values: 1 -1 1

Numbers: [1 -1 1]

Vector: <1 -1 1>


Output of ANN: [-1 -1 -1 1]
[]

(8-) (QUIT)
antti@antti-HP-630-Notebook-PC:~/ShenANN$

------------------------------------------------------------


This is included as one more example of the (here, non-proprietary)
techniques for implementing ANNs with Shen.


yours, Dr A. J. Y.
HELSINKI
Finland, the E.U.
BAM.shen
array.shen

Robert Herman

Jun 7, 2016, 9:32:06 AM6/7/16
to Shen
Antti,

I am just wondering what you will be offering commercially re: ANNs and Shen. A library? A package to solve a specific domain set?

I am currently working with TWEANNs in LFE (Lisp Flavored Erlang) due to its distribution model, but would like to contribute to Shen. I would not want to double any efforts already being made.

I am also playing with Darknet, a neural network library written in C that leverages the GPU and OpenCV. It has great practical examples of using RNNs and CNNs too.

My particular area of interest is in using ANNs and other evolutionary computation techniques to make art, music or interesting combinations of both. I dabbled with using them on time series data such as found in the finance markets or a musical score, but only for fun. No real progress other than learning in those areas.

I do not have the Luger book, but I have a bunch of other NN books from the early 90s to 2016. Could you confirm which Luger book? I can then see if I can follow what it is you are working on.

Regards,

Rob

Antti Ylikoski

Jun 7, 2016, 12:16:54 PM6/7/16
to Shen

Hello Robert:



I'm planning to offer, together with Lambda Associates, packages to
solve typical ANN problems.

One characteristic example would be an error-backpropagation ANN to
classify arbitrary objects, i.e. the ANN can be trained to classify any
numerically describable objects: credit card fraud; enemy aircraft;
anything that one can put into a numerical form.
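To make the idea concrete, a toy error-backpropagation network in NumPy is sketched below (this is purely illustrative, not the commercial code; the architecture, data, and all names are mine):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: XOR, the classic not-linearly-separable problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 4 units; small random initial weights.
W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

lr, losses = 0.5, []
for epoch in range(5000):
    # Forward pass.
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    losses.append(float(np.mean((Y - T) ** 2)))
    # Backward pass: squared-error gradients via the chain rule.
    dY = (Y - T) * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(axis=0)
    W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(axis=0)

# The training error shrinks as the weights are adapted.
print(round(losses[0], 3), "->", round(losses[-1], 3))
```

Replacing the XOR rows with feature vectors for credit-card transactions, radar returns, or any other numerically coded objects gives the kind of classifier described above.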

This Hopfield project of mine did not succeed quite as expected, which
has slowed down producing commercial products.  The final word on
optimizing functions with Hopfield ANNs seems to be that they can be
used as optimizers, but they cannot mathematically be guaranteed to
converge to the optimal solution.  (They may converge to a
suboptimal local minimum.  Will the customer want this?)
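The local-minimum caveat is visible even in a toy Hopfield net: under asynchronous updates the standard energy function never increases, so the net always settles, but only into some minimum of the energy, not necessarily the best one. A small NumPy sketch (illustrative only, not the project code):

```python
import numpy as np

# Store one bipolar pattern with the Hebbian rule (zero diagonal).
p = np.array([1, -1, 1, -1])
W = np.outer(p, p).astype(float)
np.fill_diagonal(W, 0.0)

def energy(s):
    # Standard Hopfield energy; asynchronous updates never raise it.
    return float(-0.5 * s @ W @ s)

# Start from the stored pattern with one bit flipped.
s = np.array([1, 1, 1, -1])
energies = [energy(s)]
for _ in range(3):                 # a few asynchronous sweeps
    for i in range(len(s)):
        s[i] = 1 if W[i] @ s >= 0 else -1
        energies.append(energy(s))

print(s)          # the net settles in a minimum of the energy
print(energies)   # a non-increasing sequence
```

Here the descent happens to reach the stored pattern; with several stored patterns, or with an energy encoding an NP-complete objective, the same descent can just as well stop in a spurious or suboptimal state, which is exactly the commercial worry above.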

Once I have this basic framework of ANNs with Shen working, it will,
in all probability, be rather straightforward to author recurrent
ANNs; convolutional ANNs; and in general various kinds of ANN
software.  We shall see.

The Luger book I used is:

George F Luger: ARTIFICIAL INTELLIGENCE, 4th Edition
Pearson--Addison-Wesley
ISBN 0-201-64866-0

but it is too basic to serve as a good scientific reference; my
current work is based on a better book:

Simon Haykin: NEURAL NETWORKS, A Comprehensive Foundation
I have the 2nd edition
Prentice--Hall
ISBN 0-13-273350-1

Could you point me to some good books on e.g. recurrent and
convolutional ANNs, and on the most modern research generally?


kind regards, Dr Antti J Ylikoski
HELSINKI
Finland, the EU

