Has anybody ever gotten the code working? Has anyone ever seen a live demo?


jay

Sep 17, 2014, 1:52:09 PM
to ss_...@googlegroups.com
Please forgive my ignorance and bluntness.  Has anyone ever gotten this technology working in real life?  If so, could you post a demo?

Despite the huge buzz it made, I have not seen a single live demo of this technology.  I suspect the videos shown in the TED talks are faked (i.e. the projector merely projects a predetermined video and the user performs the corresponding gestures to simulate real-time interaction between the human and the device), but I could be wrong...  I would love to know if anyone ever got it to work as shown in the TED videos.

Jay.

J.Prakash

Sep 17, 2014, 2:51:38 PM
to jay, SS_DEV
Hi Jay,

Not everything shown in the video works with this code base, but you can easily invoke a camera and capture a picture. You don't actually need a projector to test the code; your laptop camera can be used to control the mouse using colored tape on your fingers (gestures). I worked on this a couple of years back along with another SixthSense enthusiast, Mr. Drew, and we tried different approaches. We tried using his Android mobile camera (in the US) to control my computer (in India) and even succeeded.
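
If you want to try the webcam-only setup described above, here is a minimal sketch. It assumes Python with OpenCV (cv2) and pyautogui installed, not the original SixthSense code base, and the HSV range below is only a placeholder for bright yellow tape that you would calibrate for your own marker and lighting.

import cv2
import numpy as np
import pyautogui

# Placeholder HSV range for the tape color (roughly bright yellow);
# calibrate these numbers for your own marker and lighting.
LOWER_HSV = np.array([20, 100, 100])
UPPER_HSV = np.array([35, 255, 255])

pyautogui.FAILSAFE = False            # avoid the corner-abort while testing
screen_w, screen_h = pyautogui.size()
cap = cv2.VideoCapture(0)             # laptop webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)        # mirror so the cursor follows your hand
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        c = max(contours, key=cv2.contourArea)
        if cv2.contourArea(c) > 300:  # ignore small patches of similar color
            x, y, w, h = cv2.boundingRect(c)
            cx, cy = x + w // 2, y + h // 2
            # Map the marker position in the camera frame to screen coordinates.
            fh, fw = frame.shape[:2]
            pyautogui.moveTo(cx * screen_w / fw, cy * screen_h / fh)
    cv2.imshow("marker mask", mask)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()

With a piece of tape on one finger the cursor should follow your hand; the rest of the SixthSense gestures are built out of the same kind of marker tracking.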

Also, we should appreciate Mr. Pranav Mistry for releasing his code as open source to evolve the technology and for sharing his hard work with the community without expecting anything in return.

I hope you have heard about the Kinect for Xbox, which uses augmented reality. There are other open-source technologies such as OpenCV and ARToolKit; take a look at them.

Don't think this is fake; it's all real. Although it has some limitations, SixthSense was one of the first and biggest technology inspirations, and people are building on it and implementing parts of it in different applications. It is all about how you conceive the idea and dream big enough to take it to the next level. Remember, nothing is impossible!

--
JP 

--
Code: https://github.com/sixthsense
Discussion: https://groups.google.com/d/forum/ss_dev?hl=en

uohz ama

Sep 17, 2014, 3:41:06 PM
to J.Prakash, SS_DEV
Hi JP,

Thanks for the input.  Do you have a video showing a user interacting with your SixthSense system in real time?

Here is a list of demos shown in the TED talk given by Prof. Maes (courtesy of Wikipedia).  Could you let me know which results you have successfully reproduced?
  1. Four colored cursors are controlled by four fingers wearing different colored markers in real time. The projector displays video feedback to the user on a vertical wall.
  2. The projector displays a map on the wall, and the user controls it using zoom and pan gestures.
  3. The user can make a frame gesture to instruct the camera to take a picture. It is hinted that the photo will be automatically cropped to remove the user's hands.
  4. The system can project multiple photos on a wall, and the user can sort, re-size and organize them with gestures. This application was called Reality Window Manager (RWM) in Mann's headworn implementation of Sixth Sense.[11]
  5. A number pad is projected onto the user's palm, and the user can dial a phone number by touching his palm with a finger. It is hinted that the system is able to pinpoint the location of the palm. It is also hinted that the camera and projector are able to adjust themselves for surfaces that are not horizontal.
  6. The user can pick up a product in a supermarket (e.g. a package of paper towels), and the system can display related information (e.g. the amount of bleach used) back on the product itself.
  7. The system can recognize any book picked up by the user and display its Amazon rating on the book cover.
  8. As the user opens a book, the system can display additional information such as readers' comments.
  9. The system is able to recognize individual pages of a book and display annotations made by the user's friend. This demo also hinted at the system's ability to handle tilted surfaces.
  10. The system is able to recognize newspaper articles and project the most recent video on the news event onto a blank region of the newspaper.
  11. The system is able to recognize people by their appearance and project a word cloud of related information retrieved from the internet onto the person's body.
  12. The system is able to recognize a boarding pass and display related information such as flight delays and gate changes.
  13. The user can draw a circle on his or her wrist, and the system will project a clock onto it. Note this demo hinted at the ability to accurately detect the location of the wrist.
Thanks a lot!

Jay.

J.Prakash

Sep 17, 2014, 4:51:38 PM
to uohz ama, SS_DEV
Hi Jay,

Sorry, I don't have any video to show you; it was a couple of years back that I tried this. I appreciate that you have clearly documented all the points from the video. Here are my experiences and suggestions:

1. I have succeeded in using the four colored markers on the fingers; try electrical insulation tape, one of the cheapest options available in the market. Instead of a projector you can use your monitor and it will behave the same way, though the action may be mirrored. (See the sketch after this list for one way the markers can be detected.)

The basic problem with using the markers in the real world is recognizing the marker color on your fingers. There will be color interference from other objects. If you wear the camera on your chest, it captures everything in front of you; if an object's color blends with your marker color, the system has trouble recognizing the marker and your gestures may not be applied reliably. If you have a white wall in front of you, there will be no problem at all. So using this in practice may not be possible.

A guy named Rohildev tweaked the approach and succeeded in building a device named Fin; you can check http://www.wearfin.com/ and https://www.indiegogo.com/projects/fin-wearable-ring-make-your-palm-as-numeric-keypad-and-gesture-interface

Here is his achievement: http://indianexpress.com/article/technology/gadgets/at-23-he-is-the-youngest-to-address-world-mobile-congress/ We came to know each other through SixthSense. On a similar note, there have been other applications and devices built on augmented reality.

It is up to us how we overcome the obstacles.

2. This is the pinching action you are talking about, and it is possible (the sketch after this list also shows how the pinch distance can be computed).

3. I could achieve this.

4. This was not working.

5. This is not available in the code. It comes down to recognizing the marker; you could try making an app for this.

10. This is something the Times of India Alive app does now; check out https://play.google.com/store/apps/details?id=com.adstuck.times.iar&hl=en

13. This is basically gesture recognition plus invoking the relevant app; you would have to build it yourself.
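
For points 1 and 2, here is a minimal sketch of how two colored tape markers might be detected with separate HSV ranges and how the distance between them could drive a pinch/zoom value. Again this is Python with OpenCV rather than the original code base; the HSV ranges are placeholders you would calibrate for your own tape and lighting, and the MIN_AREA check is only a crude guard against the color-interference problem described under point 1.

import cv2
import numpy as np

# Placeholder HSV ranges for two pieces of insulation tape; calibrate them.
MARKERS = {
    "red":  (np.array([0, 120, 100]),   np.array([10, 255, 255])),
    "blue": (np.array([100, 120, 100]), np.array([130, 255, 255])),
}
MIN_AREA = 300   # crude guard against small patches of interfering color

def find_marker(hsv, lower, upper):
    # Return the centroid of the largest blob in the given HSV range, or None.
    mask = cv2.inRange(hsv, lower, upper)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    c = max(contours, key=cv2.contourArea)
    if cv2.contourArea(c) < MIN_AREA:
        return None
    x, y, w, h = cv2.boundingRect(c)
    return (x + w // 2, y + h // 2)

cap = cv2.VideoCapture(0)
prev_dist = None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    points = [find_marker(hsv, lo, hi) for lo, hi in MARKERS.values()]
    if all(points):
        (x1, y1), (x2, y2) = points
        dist = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
        if prev_dist:
            # Markers moving apart -> zoom in; moving together -> zoom out.
            print("zoom factor: %.2f" % (dist / prev_dist))
        prev_dist = dist
    cv2.imshow("camera", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()

In practice you would smooth the distance over a few frames before mapping it onto a zoom level, otherwise jitter in the marker detection makes the zoom twitchy.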

The rest are all achievable, provided you have enough back-end server capacity to handle the requests from your device. Imagination is the power of creativity.

Hope this will help you.

--
JP

Joshua Belke

Sep 22, 2014, 3:30:46 PM
to ss_...@googlegroups.com
On the whole, people have been slow to develop this technology.
Besides Fin, the Myo armband is out there -- and some of us (myself included) are integrating these technologies into digital products to move away from computers and promote human interaction and natural movement.
Fin has a couple of sample videos; Thalmic Labs has a few out there too.
I have no affiliation with either, but I can tell you that more companies are racing to produce gesture-based devices and are dealing with issues around SDKs, false positives, and power.

As far as the device, camera, etc. go -- Google Glass is very notable, along with a couple of other pre-assembled devices on the market that would give you the skeleton to start the object-interaction process.

All in all, we're comparatively at the stage when cell phones first came out -- big, bulky, inefficient and expensive (300-1650) for anything you would want to use.

Between modular devices in the works (http://www.projectara.com/) and the Internet of Things -- there's plenty of possibility coming soon.

Nilesh Payghan

Aug 24, 2017, 2:28:02 AM
to Joshua Belke, ss_...@googlegroups.com
Now get all robotics project spare parts at one shop. Steps to purchase:

Call the following number
Give your project details
List your robot equipment
Ask if you need any help (it's free of cost)
Participate in our Robo event and start your career in robotics
Give your product delivery address.

Contact
Nilesh Payghan
9689416867

Thank you

Jeevanprasath .k

Nov 1, 2017, 10:20:54 AM
to SixthSense Developers
Hi sir,
I need help learning about SixthSense technology and how it is built.

Jeevanprasath .k

Nov 1, 2017, 10:35:59 AM
to SixthSense Developers
I am from Tamil Nadu, India.
Hi sir, I need to talk with you because my knowledge of software coding is poor, so please help me learn it.
I am very interested in making my own SixthSense technology.
