GSoC Project : Control of Simulations with Gesture Input


Aniket Gupta

unread,
Mar 16, 2015, 12:55:14 AM3/16/15
to cc-dev...@googlegroups.com
Hi,

I am a sophomore undergraduate in Computer Science and Engineering at the Indian Institute of Technology, Roorkee, and I'm highly interested in HCI. I'm a member of a nationally popular development group (SDSLabs) and have been building applications there for two years. I'm proficient in Ruby, PHP, Java, and C++, and I have worked with hardware (Leap, Arduino, etc.). I'm new to open source development, though, and I hope to progress quickly.

I went through your list of interesting proposed projects, and the one that fascinates me most is Control of Simulations with Gesture Input.

I saw that you intend to use a Leap or a Kinect device, and I'm familiar with both, having used them in my applications in the past. I'll dig into the pros and cons of each, scoring the pros by their relevance to the project, and then propose strategies for how they could be implemented, along with the necessary mock-ups and proofs of concept.

For now I favor the Leap, as it is easier for small lab applications; the more complicated the setup gets, the more reluctant people will be to use it. I'm following the guidelines given in the other threads on this topic. Please give feedback if I'm missing something.
Looking forward to contributing to the lab!
Cheers.
Aniket

Aniket Gupta

unread,
Mar 17, 2015, 4:11:32 AM3/17/15
to cc-dev...@googlegroups.com
Hi,

I researched our two present options, the Leap and the Kinect, a bit.

Comparing the two on both practical and technical grounds, the Leap seems the better option. (I'll also use the terms movements/actions to mean gestures, since we're discussing the Kinect in this context as well.) Here's why:
  • We need students to be able to use the simulations wherever they wish, whether at school or at home, so portability is an important issue to address.
    The Leap is a small, handy device that fits in a pocket, whereas the Kinect needs space and significant resources, which would make users reluctant and reduce its utility; students would instead prefer the keyboard input of the simulations.

  • The Leap is more accurate. Studies show that when only a limited number of actions need to be performed, the Leap maps them beautifully, though its range is only about 8 cubic feet. See the attached PDF for a reference on its accuracy; it shows a standard error of only about 0.5 mm.

  • Cost. A Kinect costs roughly three times as much as a Leap, so it has reached a smaller audience, and adding it to our simulations would therefore benefit only a few. The Leap, with its lower cost and nearly equivalent functionality (sufficient for lab simulations, at least), is found more frequently.

  • We need finger tracking more than foot or whole-hand movement, because it is more convenient.
As far as the Leap's accuracy is concerned, quoting from the website: "In just one hand you have 29 bones, 29 joints, 123 ligaments, 48 nerves and 30 arteries. The Leap Motion Controller has come really close to figuring it all out."
Hence, I feel we should go ahead with the Leap. Do tell me if I'm wrong somewhere or missing something. I'll come up with more points and a mock-up soon.
Thanks,
Aniket
leap-motion-accuracy.pdf

Nathan Kimball

unread,
Mar 18, 2015, 12:35:25 PM3/18/15
to cc-dev...@googlegroups.com
Thanks, Aniket, for your comparison of the devices.  At this point, the most useful thing would be to focus on the programming of simulations. We will be adding gesture control to some of these simulations, so they are very relevant to the project.  A post by Dan Damlin here could get you started. Other posts on this list could help too.

Best, -Nathan

Aniket Gupta

unread,
Mar 18, 2015, 3:54:56 PM3/18/15
to cc-dev...@googlegroups.com
Hi Sir,

In addition to my previous mail, I would like to address the question in the project description, i.e. whether gesture input enhances the learning experience. After further reading, I'm sure it does. My viewpoint draws on related articles on the internet, mainly this and this.

When people speak or do an activity, they spontaneously gesture. As studies in this area demonstrate, gestures help attach meaning and improve retention. It would therefore be easier for students to understand and retain concepts using gestures, which would also greatly improve the user experience.
I'd also suggest adding audio input/output to our simulations alongside gesture input. My reading made me realize that, apart from gestures, audio has also proven to be a very effective learning technique. Keyboard input is less intuitive than these methods and so takes longer to sink in. Adding audio could be a later part of the project, as it needs more research.
Now that I've analysed the merits of our approach and the direction of development, I'll start making models and writing code for Leap integration with our simulations, and simultaneously work on the issues that Dan Damlin posted. I really appreciate the guidance. Thanks. :)
I'll get back to you soon. 
Thanks,
Aniket.

Aniket Gupta

unread,
Mar 22, 2015, 2:43:35 PM3/22/15
to cc-dev...@googlegroups.com
Hi Nathan,

  I was working on possibilities for using the Leap device with our simulations. I have been reading the Leap developer release notes extensively, trying the built-in functions and basic custom gestures, and writing some joint-mapping algorithms. I can now quite confidently use the Leap's predefined functions and gestures for my own requirements, and I have started analysing how to integrate a Leap with our simulations. Do you have a particular simulation in mind that I should try integrating gestures into?

Also, while thinking about going ahead with this, I've come up with a lot of ways it could be implemented. That brings up the main point of discussion: what functionality should the gestures provide?

Note: all gesture names below are terms from the Leap documentation.

Some of the ones I was thinking of:
  •  Page scroll up or down via an up or down hand-drag gesture, respectively.

  •  A circular finger movement to increase or decrease the value of a spinner; or, for a slider, a clockwise circle to move the slider right and a counter-clockwise circle to move it left.

  • Switching between sliders using the key-tap gesture.

  • Playing a paused simulation, or pausing a playing one, using the screen-tap gesture.
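
To make the mapping above concrete, here is a rough sketch of how gestures could be dispatched to simulation controls. The gesture type names ("swipe", "circle", "keyTap", "screenTap") are from the Leap.js documentation, but the action names are placeholders I've made up, and the direction/normal checks are simplifications of what a real implementation would do:

```javascript
// Sketch: translate a Leap gesture object into a simulation action.
// Action names ("scrollUp", "sliderRight", etc.) are illustrative only.
function gestureToAction(gesture) {
  switch (gesture.type) {
    case "swipe":
      // direction[1] is the y component: positive means an upward swipe.
      return gesture.direction[1] > 0 ? "scrollUp" : "scrollDown";
    case "circle":
      // Simplified heuristic: treat a circle whose normal points into the
      // screen (negative z) as clockwise, i.e. "move slider right".
      return gesture.normal[2] < 0 ? "sliderRight" : "sliderLeft";
    case "keyTap":
      return "nextSlider";      // switch focus between sliders
    case "screenTap":
      return "togglePlayPause"; // play a paused simulation, pause a playing one
    default:
      return null;              // unrecognized gesture: ignore
  }
}
```

In the browser this function would be fed from the Leap.js frame loop, e.g. `Leap.loop({enableGestures: true}, frame => frame.gestures.forEach(g => apply(gestureToAction(g))))`, where `apply` is whatever hook the simulation exposes.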

This is one approach to gesture control. But these may not attach much "meaning", which is what the project aims for. In that case, we would define gestures separately for each simulation rather than for the common elements of a simulation. Some could be:
  •  A closed-fist gesture to increase pressure, and an open palm to decrease it.

  • The palms of two hands moving apart or together for an increase or decrease in volume.

  • Play/Pause remains the same as above.
Many more gestures could be implemented, but the point of this mail is to decide HOW they should be implemented.


Now comes the most difficult and technical approach, but it is the one users will absolutely love (me included!). :D

I call this "Train your Leap". We can give each user a personalized experience, letting them map gestures to simulation elements (such as the temperature slider, play/pause button, etc.) however they want. See the attached diagram of the process. We would save the gestures a user chooses once and reuse them every time they open the simulation on the same machine and in the same browser.

A bit more technically, here is how we would achieve this:

It would work somewhat like cookies. When the user maps the simulation for the first time, we save a simulation-gesture-map file on their system containing the mapping, and every time they change a gesture mapping, that change is written to the file as well. This way each user gets a personalized experience of our simulations. The file format and other specifications can be decided later if we move ahead with this idea.
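
As a rough sketch of this persistence idea (the storage key and the shape of the mapping are assumptions, not a final format), the browser's localStorage gives exactly the cookie-like, per-machine, per-browser behavior described above:

```javascript
// Sketch of the "simulation-gesture-map" persistence described above.
// The key name and mapping shape are illustrative, not a final spec.
const STORAGE_KEY = "simulation-gesture-map";

// Serialize a control -> gesture mapping to a JSON string for storage.
function serializeGestureMap(map) {
  return JSON.stringify(map);
}

// Restore a mapping, falling back to defaults for anything not yet saved.
function deserializeGestureMap(json, defaults) {
  if (!json) return Object.assign({}, defaults);
  return Object.assign({}, defaults, JSON.parse(json));
}

// In the browser this would persist across visits, much like a cookie:
//   localStorage.setItem(STORAGE_KEY, serializeGestureMap(userMap));
//   const map = deserializeGestureMap(localStorage.getItem(STORAGE_KEY),
//                                     defaultMap);
```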

Please share your thoughts and suggestions: how much user acceptance would you expect for each of these, and how could they be improved? I'm sure audiences will love gesture support, and looking after users this way will help our simulations reach an even larger user base.
Eagerly waiting for your reply. 
Thank You,
Aniket Gupta.
ankg.github.io
LeapPersonalization.jpg

Nathan Kimball

unread,
Mar 23, 2015, 4:19:35 PM3/23/15
to cc-dev...@googlegroups.com
Hello Aniket,

One set of models that we are interested in using with gesture input are the ones in lab.concord.org listed under Science of Atoms and Molecules: Gas Laws. One example (the most complicated) is at http://lab.concord.org/interactives.html#interactives/sam/gas-laws/7-why-did-the-can-collapse.json  However, if you want to work with these, I would suggest starting with a simpler one in this group.

You are quite right. The gestures should be meaningful regarding the phenomenon described by the simulation; the gesture input is not just an air-based mouse.  Presently, we are researching the gestures that students use to describe their understanding of phenomena such as the relationship between temperature and pressure.  Our intention is to find gestures that students find meaningful and then incorporate them into the models.  It is too early in the research to say what they may be.  However, we have some evidence that pressure is often described by holding one hand flat and vertical and having the fingers of the other hand drum on the palm of the flat hand, to show molecules bouncing on the surface of a container.  Fast drumming would increase pressure and slow drumming would lower it. This is just an idea, but it illustrates how a motion might describe a phenomenon.

At this point, just getting something to work with gesture input is the initial challenge.  I am intrigued by the idea of training the simulation, and it could be useful in research, but it certainly is a more distant step.

Best, -Nathan
 

--
--
----
post message :cc-dev...@googlegroups.com
unsubscribe: cc-developer...@googlegroups.com
more options: http://groups.google.com/group/cc-developers?hl=en
---
You received this message because you are subscribed to the Google Groups "Concord Consortium Developers" group.
To unsubscribe from this group and stop receiving emails from it, send an email to cc-developer...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Aniket Gupta

unread,
Mar 24, 2015, 8:40:17 AM3/24/15
to cc-dev...@googlegroups.com
Hi Nathan,

You are certainly right that it is a distant step and would be difficult to implement.

I am thinking more about the specifics, though, because it would be great if this could be implemented. Also, for the trial I decided to make changes to the existing code base rather than creating a test application from scratch, as that would be too much effort for too little gain.

As a reminder, Leap functionality fires as soon as you plug in the device: after installation, a leapd process that initializes the Leap device runs in the background as a daemon and starts with the boot processes. For now, I'm trying to integrate the Leap with this interactive, http://lab.concord.org/interactives.html#interactives/sam/gas-laws/6-number-volume-relationship.json, for a start.

The functionality I'll try to add, and the gestures that would trigger it:
1. Play/pause simulation: screen-tap gesture
2. Increase/decrease molecule count: up/down flat-palm gesture
3. Page scroll: up/down drag gesture
4. Reset simulation: double tap, maybe?
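
The four proposed bindings could start life as a simple lookup table; the gesture and action names below are provisional placeholders, not part of the Lab framework or the Leap API:

```javascript
// Provisional bindings for the number-volume interactive.
// Both the gesture names (keys) and action names (values) are made up
// for illustration; the real hooks depend on the simulation's API.
const gestureBindings = {
  screenTap:    "togglePlayPause",   // 1. play/pause the simulation
  flatPalmUp:   "addMolecules",      // 2. raise the molecule count
  flatPalmDown: "removeMolecules",   //    lower the molecule count
  dragUp:       "scrollPageUp",      // 3. page scroll
  dragDown:     "scrollPageDown",
  doubleTap:    "resetSimulation"    // 4. reset (gesture still undecided)
};

// Look up the action for a recognized gesture, or null if unbound.
function actionFor(gestureName) {
  return gestureBindings[gestureName] || null;
}
```

Keeping the bindings in a table like this would also make the later "Train your Leap" idea easier, since personalizing a mapping is then just replacing the table.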

I understand that making these changes in the large code base will take a little time. Please comment on this approach and point out anything that seems wrong with it.
Thanks,
Aniket.