> What would be the possibility (or does it already exist) of making a
> custom Hit Detection Patch. By this I mean being able to take any spot
> in the XYZ grid and making a object to hit - definitely possible. Now
> instead of the hand or the head, or the leg interacting with this what
> about anything?
There is the multiple objects hit test patch in the latest version. With this you can generate an object that is hittable by one skeleton element. If you need it to react to multiple skeleton elements/users, you'll need to copy the patch for each skeleton element (hand/foot etc.)/user and connect the object's coordinates each time.
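To give an idea of what such a patch boils down to internally: a hit test of this kind is essentially a distance check between a tracked joint and a target point in the XYZ grid. A minimal sketch (the function names and the radius are made up for illustration, not the actual patch internals):

```python
import math

def hit_test(joint, target, radius=0.15):
    """Return True when a tracked joint (x, y, z) lies within
    `radius` of the target point -- i.e. a sphere hit test."""
    return math.dist(joint, target) <= radius

# One call per skeleton element per user, mirroring how the patch
# has to be duplicated for each hand/foot/user.
joints = {"left_hand": (0.1, 1.2, 2.0), "right_foot": (0.5, 0.0, 2.1)}
target = (0.12, 1.18, 2.02)
hits = {name: hit_test(j, target) for name, j in joints.items()}
```

Duplicating the check per joint is exactly the copy-the-patch step described above.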
> The values from the Kinect seem to give properties to the data
> (attributing legs, hands, and such) but what about if anything could
> satisfy a hit. Say a soccer ball, or any hand of any person, without
> need for calibration?
>
> I could see many possibilities where what you need to interact with is
> not a Person, or better yet, is just a part (say a hand). So without
> the need for full calibration wouldn't we get better performance?
>
> What if we could make a patch for hit detection dealing with data
> points instead of translated values a.k.a. a skeleton. Then be able to
> use this with a skeleton possibly.
>
> Has anybody else come to this?
It could be possible, but without calibration you'll have to rely on some sort of blob detection.
This might help:
http://www.patriciogonzalezvivo.com/blog/?cat=118
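To sketch what blob detection involves here: threshold the depth image to a foreground mask, then group neighbouring foreground pixels into connected components and take their centroids as hit candidates, no skeleton or calibration needed. A rough, library-free sketch (the near/far thresholds are arbitrary example values):

```python
def find_blobs(depth, near=400, far=1200):
    """Group depth pixels inside [near, far] (mm) into 4-connected
    blobs and return each blob's centroid as (row, col)."""
    rows, cols = len(depth), len(depth[0])
    mask = [[near <= d <= far for d in row] for row in depth]
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:                      # flood fill one blob
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((cy, cx))
    return centroids
```

Each centroid could then feed the same distance-based hit test a skeleton joint would.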
I am downloading the Synapse project to take a look at it. It seems very intuitive, and my roommate, a DJ, is going to help me with the Ableton side of things. Should be really cool. I totally see how this could save a lot of hassle, so we will see where it goes. I think the hardest part of using the tryplex toolkit is the installation. After that it is smooth sailing in my book, ha. As for an app off of this, it would be interesting to see how to bundle it all together into a clickable app.
As far as the QC community goes, I think we should really try to pull together and collaborate to make this tool really great. I could imagine that the WWDC conference did not talk too much about it, but I feel it is really the way to go and should be progressed as well. I will start working on those letters and videos today, so that will help it along.
It will also be interesting to see what updates were put into Lion.
Also, I just got myself an iPad to use with TouchOSC, and I think I saw your name on the forums. Have you been using that software? I just used it last night at a show and it rocked!!!
Until then,
Casey
> I am downloading the Synapse project to take a look at it. It seems very intuitive, and my roommate, a DJ, is going to help me with the Ableton side of things. Should be really cool. I totally see how this could save a lot of hassle, so we will see where it goes. I think the hardest part of using the tryplex toolkit is the installation. After that it is smooth sailing in my book, ha. As for an app off of this, it would be interesting to see how to bundle it all together into a clickable app.
True, it's definitely a great step forward in usability. I already contacted Ryan from Synapse, and he's willing to implement full skeleton tracking and maybe, in the future, some more stuff like multiple skeletons, so I'm hoping to put together a tryplex update soon.
> As far as the QC community goes, I think we should really try to pull together and collaborate to make this tool really great. I could imagine that the WWDC conference did not talk too much about it, but I feel it is really the way to go and should be progressed as well. I will start working on those letters and videos today, so that will help it along.
There are quite a few discussions over at Kineme right now about the position of Quartz Composer (you started one of those discussions); those are interesting reads. I think QC needs more exposure, its use for visualists and fast prototyping needs to be propagated, and basic education is lacking. If the threshold to get started with QC is lowered, it'll probably increase the user base. I've got time around the end of July, so it might be good to make some sort of plan together of what we're going to cover, how, etc.
> It will also be interesting to see what updates were put into Lion.
>
> Also, I just got myself an iPad to use with TouchOSC, and I think I saw your name on the forums. Have you been using that software? I just used it last night at a show and it rocked!!!
I did, and I've also used it for some live performances already. It's definitely really handy for a lot of stuff, although MIDI/homemade controllers with real buttons are still the best for slamming and rocking during live performances. TouchOSC is a great companion, though!
Best,
Sebastian.
I would suggest you read the paper published by Microsoft
(http://research.microsoft.com/pubs/145347/BodyPartRecognition.pdf)
about the method they used to "track" full body parts (if you read the
paper you'll find that it's more like detection than tracking) and
hence avoid pose calibration.
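The core of that paper is a per-pixel depth comparison feature fed into a randomized decision forest: each feature is just the difference between two depth probes whose pixel offsets are divided by the depth at the pixel, which makes the feature depth-invariant. A sketch of that single feature (the background constant and test values are illustrative, not from the paper):

```python
def depth_feature(depth, x, y, u, v, background=10000.0):
    """Depth comparison feature from Shotton et al.:
    f = d(p + u/d(p)) - d(p + v/d(p)).
    Dividing the offsets u, v by the depth at p normalizes for how
    near or far the body is. Probes falling outside the image read
    as a large background depth."""
    def probe(px, py):
        if 0 <= py < len(depth) and 0 <= px < len(depth[0]):
            return depth[py][px]
        return background
    d = depth[y][x]
    x1, y1 = x + int(u[0] / d), y + int(u[1] / d)
    x2, y2 = x + int(v[0] / d), y + int(v[1] / d)
    return probe(x1, y1) - probe(x2, y2)
```

Thousands of such cheap features, thresholded at the forest's split nodes, classify every pixel into a body part, which is why no pose calibration is needed.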
I am glad to hear you were able to contact Ryan. It seems he is doing this for Burning Man, and sharing the knowledge with others along the way, so it is bound for good. If this could be used in the tryplex toolkit, it would really bring everything together.
For getting a stronger QC base together I think we should go for it. I am still waiting on a copy of Lion but we need not wait just for that. I am out of school for the summer, and if you are going to have some free time soon then why not pool everyone's skills?
And yes, the OSC app is awesome once again. I see using it with a tryplex composition to change what is happening on the fly, eliminating going back to the computer every time I want to change a slider. Also, I had a question about getting custom layouts to send the correct data types to QC. I posted it on the Hexler site, and I was wondering if/how you encountered this problem.
http://hexler.net/forum/viewthread/294/
Cheers,
Casey
Furthermore, do you know of a good blob tracker that could be loaded into Quartz Composer? I did not see a link or any further QC-related resources beyond the videos on his blog.
Thanks,
Casey
http://www.patriciogonzalezvivo.com/blog/?p=364
It can send out TUIO messages, which QC supports via this: http://www.tuio.org/?software
Dust's sample gives a good representation of what is possible within the limits of QC, and there are also the Kineme CVTools. There are some basic samples in point tracking, and I think those are the basics that openCCV also uses. However, I believe QC can't beat the speed that any C-based external program can give you.
Also check out this thread:
http://kineme.net/forum/Discussion/DevelopingCompositions/OpenTSPSOSCdataintoQC
OpenTSPS is something like openCCV, and here they actually got a workaround for getting blob outlines into QC with a Processing reroute.
On KinectCoreVision with finger tracking, I got stuck receiving TUIO/OSC messages into QC. I was able to get the qcOSC patch to recognize incoming data on port 3333, but I wasn't sure how to interpret "tuio2Dcur", which is what appeared when I started tracking my fingers.
I tried using a structure key but wasn't sure which key to use; 0 and 1 didn't seem to get me too far.
Maybe there is a quick fix for this, but thanks for the info regardless.
Casey
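For what it's worth, the reason plain structure keys 0 and 1 don't get far with "tuio2Dcur" is that each TUIO 1.1 cursor message carries a command string as its first argument, followed by the payload. A rough sketch of how such messages decode outside QC, just to show the layout (the helper function and its dict are my own, not part of any QC patch):

```python
def parse_2dcur(args, cursors):
    """Interpret one /tuio/2Dcur OSC message (TUIO 1.1 profile).
    The first argument is a command string:
      set   <session_id> <x> <y> <vel_x> <vel_y> <accel>
      alive <session_id> ...   (ids currently on the surface)
      fseq  <frame_number>
    `cursors` maps session id -> (x, y) and is updated in place."""
    cmd = args[0]
    if cmd == "set":
        sid, x, y = args[1], args[2], args[3]
        cursors[sid] = (x, y)   # normalized coords in [0, 1]
    elif cmd == "alive":
        alive = set(args[1:])
        for sid in list(cursors):
            if sid not in alive:
                del cursors[sid]   # finger lifted
    # "fseq" only carries the frame number; nothing to store here
    return cursors
```

So in QC you would key into the message's argument structure past that leading command string, rather than treating index 0 as a coordinate.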
Synapse only outputs head & hands data, no full skeleton. I also ran into a problem with Synapse: it freezes after running for 30 minutes. Did you experience this?
Sebastian.
Great! MaxMSP is expired on my system, so I couldn't edit the patches.
If you could mail or upload it, that would be very helpful.
I also had the 1.1 version running for a full day without any problems.
Thanks!
Sebastian.
Hello Victor.
I would be interested in the updated Max file. Would you mind sharing it?
Casey