Re: [nupic-discuss] Meetup with Jeff Hawkins on Sensorimotor Integration in the CLA


SeH

Mar 16, 2014, 12:58:41 PM
to NuPIC general mailing list, becca...@googlegroups.com
In the sensorimotor integration talk, Jeff mentions testing in a maze with a few sensors and motor controls.  Here is a maze world added to this 3D physics simulation.

The bot is a simple worm with angular motor controls for each segment.  It "sees" by raytracing a number of vision paths to find the distance to the nearest collided object (which could also provide the observed color as an R,G,B vector).  The current bot controller operates entirely randomly, but it could be replaced with NuPIC, OpenBECCA, or some other cognition engine or reinforcement learning algorithm.
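As a rough illustration of the sensing side, here is a minimal sketch of a ray-based distance sensor in a toy 2D grid world.  All names here are hypothetical; the actual bot raytraces through the Ammo.JS physics world rather than a grid.

```javascript
// Toy world: array of strings, '#' marks a solid cell.
const world = [
  "#####",
  "#...#",
  "#.#.#",
  "#...#",
  "#####",
];

// March a ray from (x, y) in unit direction (dx, dy) until it enters a
// solid (or out-of-bounds) cell; return the travelled distance.
function castRay(world, x, y, dx, dy, maxDist = 10, step = 0.05) {
  for (let d = 0; d <= maxDist; d += step) {
    const cx = Math.floor(x + dx * d);
    const cy = Math.floor(y + dy * d);
    const row = world[cy];
    if (row === undefined || row[cx] === undefined || row[cx] === "#") {
      return d; // distance to the nearest collided object
    }
  }
  return maxDist;
}

// A small "retina" of n vision paths fanned across a field of view.
function senseDistances(world, x, y, heading, fov, n) {
  const readings = [];
  for (let i = 0; i < n; i++) {
    const angle = heading - fov / 2 + (fov * i) / (n - 1);
    readings.push(castRay(world, x, y, Math.cos(angle), Math.sin(angle)));
  }
  return readings;
}
```

The same loop extended with a color lookup at the hit cell would yield the R,G,B observation mentioned above.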

Any type of world or bot geometry, motors, and sensors could be constructed from the 3D primitives and controllable constraints/joints in the physics engine API - in this case Ammo.JS, an automatic JavaScript port of the C++ Bullet 3D physics engine.
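For instance, the worm bot could be described declaratively before being handed to the physics engine.  The builder function and field names below are assumptions for illustration; in practice each entry would map to Ammo.JS bodies (e.g. btBoxShape) and hinge constraints with angular motors.

```javascript
// Hedged sketch: a declarative spec for a worm bot built from box
// primitives linked by hinge joints, one angular motor per joint.
function wormSpec(numSegments, segmentLength = 1.0) {
  const bodies = [];
  const joints = [];
  for (let i = 0; i < numSegments; i++) {
    bodies.push({
      id: `seg${i}`,
      shape: "box",
      halfExtents: [segmentLength / 2, 0.25, 0.25],
      position: [i * segmentLength, 0.5, 0],
      mass: 1.0,
    });
    if (i > 0) {
      joints.push({
        type: "hinge",
        bodyA: `seg${i - 1}`,
        bodyB: `seg${i}`,
        axis: [0, 1, 0],            // bend in the horizontal plane
        motor: { maxImpulse: 5.0 }, // one control signal per joint
      });
    }
  }
  return { bodies, joints };
}
```

A controller then only needs to emit one angular velocity per joint each tick, regardless of how the geometry was assembled.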


[Three inline images attached]



On Fri, Feb 28, 2014 at 4:36 PM, Matthew Taylor <ma...@numenta.org> wrote:
http://www.meetup.com/numenta/events/168671932/

This event is in San Jose on March 14. There are only 50 seats, so RSVP quickly!

---------
Matt Taylor
OS Community Flag-Bearer
Numenta

_______________________________________________
nupic mailing list
nu...@lists.numenta.org
http://lists.numenta.org/mailman/listinfo/nupic_lists.numenta.org


SeH

Mar 17, 2014, 9:56:10 AM
to NuPIC general mailing list, becca...@googlegroups.com
Gazebo / ROS is a more complete solution, whether in desktop or web mode.  SpacegraphJS is still just a prototype.  Its Ammo.JS + Three.JS browser-based design does have significant performance limitations, but it could be optimized (ex: with WebCL).

The browser-based front-end can be responsible for initiating and controlling the interaction with the cognition engine, probably through WebSockets for lower-latency I/O.  The attached cognition engine back-end, serving the WebSocket connection, would be entirely decoupled from the front-end, receiving sensor data and transmitting motor commands.  RLGlue follows a similar model: http://glue.rl-community.org/wiki/Main_Page
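The wire protocol for that decoupling could be as simple as two JSON message shapes, sketched below.  These message types and fields are assumptions for illustration, not an existing SpacegraphJS or RLGlue API.

```javascript
// Front-end -> back-end: one observation frame per physics tick.
function encodeSensorFrame(tick, distances, reward) {
  return JSON.stringify({ type: "sense", tick, distances, reward });
}

// Back-end -> front-end: one motor command (torque) per joint.
function decodeMotorFrame(raw) {
  const msg = JSON.parse(raw);
  if (msg.type !== "act" || !Array.isArray(msg.torques)) {
    throw new Error("malformed motor frame");
  }
  return msg.torques;
}

// On the browser side the exchange would look roughly like:
//   const ws = new WebSocket("ws://localhost:8765");
//   ws.onmessage = (ev) => applyTorques(decodeMotorFrame(ev.data));
//   setInterval(() => ws.send(encodeSensorFrame(t++, readSensors(), 0)), 50);
```

Because both sides only see these frames, either end (simulator or cognition engine) can be swapped out independently.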

Craig's AngryBotsInAiWorld is worth further development too, but possibly without a server involved.  Have you seen Quake and other FPS games ported to WebGL?  Those older FPS engines do not involve a full physics engine, so they are less computationally intensive and can more easily support larger worlds and multiplayer action.  Bots in such a world might overwhelm current versions of NuPIC if their entire "retina" of thousands of pixels were used as sensory input.  Instead, they could at first distill the perceived scene to numeric geometry (without cheating), such as the distance to nearby objects like walls and players, measured at discrete intervals (ex: every 10 degrees rotating around the bot).  This is how conventional FPS bots typically operate.
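To make the size reduction concrete, here is a toy encoder that turns such a sweep (one distance every 10 degrees, 36 values) into a compact binary vector a CLA-style learner could consume.  This is a hypothetical bucket encoder, not NuPIC's actual ScalarEncoder.

```javascript
// Map each distance reading to a one-hot bucket and concatenate:
// 36 readings * 8 buckets = 288 bits instead of thousands of pixels.
function encodeSweep(distances, maxDist = 20, buckets = 8) {
  const bits = [];
  for (const d of distances) {
    const clamped = Math.min(Math.max(d, 0), maxDist);
    const bucket = Math.min(
      buckets - 1,
      Math.floor((clamped / maxDist) * buckets)
    );
    for (let b = 0; b < buckets; b++) bits.push(b === bucket ? 1 : 0);
  }
  return bits;
}
```

Exactly one bit per reading is active, so the input stays sparse no matter how cluttered the scene is.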

I imagine that a 2D, and especially a 3D, physics engine provides interaction subtleties that would test an AI unlike any other method.  Very slight movements can result in drastic changes to the world and the agent's body.  In other words, the degrees of freedom in the control signal space can be arbitrarily large.

For high-performance simulation, a native C++ OpenGL game engine is still the best option.  SpacegraphJS is a JS port of SpacegraphC (https://github.com/automenta/spacegraphc2), which was originally inspired by Critterding (http://critterding.sf.net); any of these could be connected to a cognition engine in the ways described here.  There are also several open-source Minecraft clones.

So there are many possibilities; it's difficult to choose.  Any ideas?


On Mon, Mar 17, 2014 at 12:27 AM, Craig Quiter <cqu...@gmail.com> wrote:
Wow, very cool to see another 3D platform for testing sensorimotor algorithms, all in the browser no less! I had looked into ROS / Gazebo which can also render via three.js, but this still needs several supporting services on Linux. http://gazebosim.org/wiki/Tutorials/CloudSim/IntroductionToiPyNotebook  Awesome to see sensors, rendering, and physics all running in the browser. Great work!

On Sunday, March 16, 2014, Chetan Surpur <chet...@gmail.com> wrote:
This is awesome! Are you building this?