On Tuesday, August 4, 2015 at 4:46:02 PM UTC-7,
sean....@gmail.com wrote:
> Thanks for the information, especially about learning anti-rewards.
> The key idea about prefix/context trees for reinforcement learning is that early on it would only have learned very short simple rules that would only boost its ability to get another reward a little bit.
I agree.
> Over time it would acquire more specific rules with higher probabilities.
I agree.
> I presume there would be a snowball effect over time.
I believe so as well.
> I'm not sure if reinforcement learning has been done in that exact way before. The machine learning literature is extensive. Anyway I'll try.
Good luck. I find those papers difficult to read. I would be very pleased to know
what you find.
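One way the prefix/context-tree idea above could look in code: each context (a suffix of recent observations) keeps a running reward estimate, so short simple rules are learned first and longer, more specific contexts accumulate evidence over time. This is only a sketch of my reading of the quoted idea; the class name, depth limit, and statistics are my own assumptions, not from the discussion.

```python
# Hypothetical sketch: a context tree where every suffix of the recent
# history stores a running average of the reward that followed it.
from collections import defaultdict

class ContextTree:
    def __init__(self, max_depth=3):
        self.max_depth = max_depth
        # maps a context tuple -> [total_reward, count]
        self.stats = defaultdict(lambda: [0.0, 0])

    def update(self, history, reward):
        # credit every suffix of the recent history, up to max_depth
        for depth in range(1, self.max_depth + 1):
            if len(history) >= depth:
                s = self.stats[tuple(history[-depth:])]
                s[0] += reward
                s[1] += 1

    def predict(self, history):
        # prefer the longest (most specific) context actually seen
        for depth in range(self.max_depth, 0, -1):
            ctx = tuple(history[-depth:])
            if ctx in self.stats and self.stats[ctx][1] > 0:
                total, n = self.stats[ctx]
                return total / n
        return 0.0

tree = ContextTree(max_depth=3)
tree.update(['a', 'b', 'c'], 1.0)
tree.update(['x', 'b', 'c'], 1.0)
tree.update(['a', 'b', 'c'], 1.0)
print(tree.predict(['a', 'b', 'c']))  # longest matching context wins
```

Early on only the depth-1 contexts have enough counts to matter; as more specific contexts fill in, predictions sharpen, which is one way to get the snowball effect mentioned above.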
For building a tree, I need a source of data: white noise, music, or
silence, coming from a radio.
To make a perceptron, let's say I have a thousand hand-crafted SVM
algorithms to choose from. I randomly generate a list of SVM algorithms
that will sample the stream of sound. Each will be a perceptron detecting
something specific at an unknown fidelity.
The list could be one line, or a few thousand lines.
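The step above, randomly generating a bank of detectors that sample the sound stream, could be sketched like this. I have replaced the hand-crafted SVMs with hypothetical random linear perceptrons over a short sample window; the window size, weight ranges, and thresholds are illustrative assumptions, not from the post.

```python
# Sketch: a randomly generated bank of detectors, each looking at a
# short window of the audio stream and firing above a random threshold.
import random

WINDOW = 16  # samples each detector looks at (assumed)

def make_random_perceptron(rng):
    weights = [rng.uniform(-1.0, 1.0) for _ in range(WINDOW)]
    threshold = rng.uniform(0.5, 2.0)
    def fires(window):
        # dot product of the sample window with the random weights
        activation = sum(w * x for w, x in zip(weights, window))
        return activation > threshold
    return fires

rng = random.Random(42)
bank = [make_random_perceptron(rng) for _ in range(1000)]

# stand-in "radio" stream: one window of white noise
stream = [rng.uniform(-1.0, 1.0) for _ in range(WINDOW)]
active = [i for i, p in enumerate(bank) if p(stream)]
print(f"{len(active)} of {len(bank)} perceptrons fired on this window")
```

Which detectors fire, and how often, is unknown in advance; that is the "unknown fidelity" part.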
There is data going in, but what is coming out?
Is the perceptron a one-shot, or does it activate on anything?
I make pattern loops out of whichever perceptrons activated.
If a randomly generated perceptron never fires, it will be deleted
sometime later; no hurry.
If one fires all the time, a different one that fires only
periodically will replace it.
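That pruning rule could be sketched as follows: track each perceptron's firing rate, drop ones that never fire, and mark always-on ones for replacement. The `always_on` threshold is my own assumption for illustration.

```python
# Sketch of the pruning rule: never-firing perceptrons are deleted,
# always-on ones are marked for replacement by a periodic one.
def prune(firing_counts, windows_seen, always_on=0.95):
    keep, replace = [], []
    for pid, count in firing_counts.items():
        rate = count / windows_seen
        if count == 0:
            continue             # never fired: delete (no hurry in practice)
        elif rate >= always_on:
            replace.append(pid)  # fires all the time: swap for a periodic one
        else:
            keep.append(pid)
    return keep, replace

counts = {'a': 0, 'b': 100, 'c': 17}
keep, replace = prune(counts, windows_seen=100)
print(keep, replace)  # ['c'] ['b']
```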
What could these perceptrons be detecting? Each could act as an
autoencoder, edge detector, feature detector, object detector,
dithered reality, or fantasy.
All life forms must comply with an energy-management scheme, or some
other survival scheme. So this is what I use to get the best
perceptrons: the reward of energy.
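One possible reading of "the reward of energy" in code: each perceptron pays a small energy cost per firing and earns energy back when its firing coincided with a reward, so useless detectors starve and useful ones survive. All numbers and the accounting rule are assumptions for illustration.

```python
# Sketch of energy-based selection: firing costs energy, firing at the
# right moment earns it back; a negative balance would mean culling.
def energy_step(balance, fired, rewarded, fire_cost=1.0, reward_gain=5.0):
    if fired:
        balance -= fire_cost
        if rewarded:
            balance += reward_gain
    return balance

balance = 10.0
events = [(True, True), (True, False), (False, False), (True, True)]
for fired, rewarded in events:
    balance = energy_step(balance, fired, rewarded)
print(balance)  # 10 + 4 - 1 + 0 + 4 = 17.0
```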
I do it this way because I can make a lot of them through an automated
process, from simple to complex.
I can make neural-network perceptrons too, which I call NN chips.
These perceptrons can be strung together to make a cascading RNN.
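Stringing perceptrons together into a cascading RNN might look like this: each stage feeds its output, combined with its own previous output (the recurrent part), into the next stage. The tanh units and weights here are stand-ins I chose for the sketch, not details from the post.

```python
# Sketch of a cascading RNN: each stage mixes its input with its own
# previous output, and stages are chained so output cascades forward.
import math

def stage(weight_in, weight_rec):
    state = {'prev': 0.0}
    def step(x):
        out = math.tanh(weight_in * x + weight_rec * state['prev'])
        state['prev'] = out  # the recurrent part: remember last output
        return out
    return step

cascade = [stage(0.8, 0.5), stage(0.8, 0.5), stage(0.8, 0.5)]

def run(signal):
    outputs = []
    for x in signal:
        for s in cascade:  # output of one stage cascades into the next
            x = s(x)
        outputs.append(x)
    return outputs

out = run([1.0, 0.5, -0.5])
print(len(out))  # one output per input sample
```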