We have one hand tracker and one head tracker, both connected via VRPN. Is there any way to differentiate them? Also, is there a demo for testing the tracking, e.g. a program that displays spheres representing the trackable markers?
Are there any examples for head tracking?
Based on my understanding, the wand service consists of two services: xinput and vrpn. I guess vrpn handles the trackable markers and xinput handles the controller. How can I connect the controller to the PC? I know of a piece of software called MotioninJoy; are there any other programs or libraries I should install? Are there any examples of using the wand?
Thanks a lot!
As for the scale of movement: if you mean the head-tracked movement, I would strongly discourage scaling it to anything other than real-world coordinates, since things will start looking incorrect for various reasons. Your best option is to rescale the entire scene instead (i.e. by calling getScene().setScale(scalex, scaley, scalez)).
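To illustrate the idea behind getScene().setScale(), here is a self-contained sketch (it does not use the actual OmegaLib API; the SceneNode/Scene classes are stand-ins for illustration only): one uniform scale is applied to every object in the scene, while the head-tracked camera keeps reporting real-world coordinates, so proportions between objects stay consistent.

```python
# Stand-in classes illustrating "scale the scene, not the tracking".
# These are NOT the OmegaLib API; they just show the transform.

class SceneNode:
    def __init__(self, position):
        self.position = position  # (x, y, z) in real-world meters
        self.scale = (1.0, 1.0, 1.0)

class Scene:
    def __init__(self):
        self.objects = []

    def setScale(self, sx, sy, sz):
        # Scale object positions and sizes, but leave the
        # (head-tracked) camera in real-world coordinates.
        for obj in self.objects:
            x, y, z = obj.position
            obj.position = (x * sx, y * sy, z * sz)
            obj.scale = (sx, sy, sz)

scene = Scene()
box = SceneNode((2.0, 0.0, -4.0))
scene.objects.append(box)

# Shrink the whole world to half size instead of doubling head movement.
scene.setScale(0.5, 0.5, 0.5)
print(box.position)  # -> (1.0, 0.0, -2.0)
```

Because every object is scaled by the same factor, relative geometry is preserved; scaling only the head movement would instead break the correspondence between the viewer's physical motion and the rendered parallax.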
How are different rigid bodies differentiated by OmegaLib? I ask because the hand and head trackers share the same name but have different sensor IDs in VRPN, so in the OmegaLib configuration file I have to give them the same name. As a result I have trouble detecting them individually: for example, when I move the wand, the system treats that as my head and starts head tracking. Could you give me some suggestions?
VRPNService:
{
    serverIP = "localhost"; // Default IP for all tracked objects below, unless overridden
    trackedObjects:
    {
        Head_Tracker:
        {
            name = "DTrack";
            trackableID = 1;
            sensorID = 0;
            serverIP = "localhost";
        };
        Wand_Tracker:
        {
            name = "DTrack";
            trackableID = 1;
            sensorID = 1;
            serverIP = "localhost";
        };
    };
};