Brekel Kinect Pro Body


Towanda Tuning

Jul 10, 2024, 9:16:38 AM
to nogebatu

First, I bought two used old Xbox Kinects, connected them to a PC, and recorded mocap using Brekel Body. I was able to get passable results. However, when part of the body was occluded, tracking failed, and since Microsoft has discontinued Kinect support, long-term use is a problem.


I then tried video-based motion capture tools such as DeepMotion and Plask. They were heavily affected by the brightness of the surrounding environment and the color of the clothes worn by the actor. I also tried TDPT, an iOS-based motion capture app that supports real-time motion streaming and export, and got pretty decent results. However, just like the Kinect, tracking breaks down whenever part of the body is occluded.

The ideal setup would be to buy a Lighthouse-based base station and Vive Trackers, but since that is expensive, I first built and tested a DIY SlimeVR tracker that works with SteamVR. SlimeVR is an IMU-based VR tracker released as open source, and the parts are very cheap (almost $100 in total).

The VR headset I have is a Quest 2, which can be connected to a PC using Virtual Desktop. The SlimeVR trackers are worn on the chest, waist, hips, knees, ankles, and feet; the head and both hands are covered by the Quest 2 HMD and its left and right controllers. The SteamVR mocap tool I used for testing is Mocap Fusion, and I was able to get pretty good results. Because the trackers are IMU-based, tracking was unaffected by parts of the body being obscured by obstacles, and joint rotations could be tracked precisely.

However, since SlimeVR works by forward kinematics (FK), computing the position of the rest of the body from the HMD position plus bone rotations and lengths, the feet slide when moving. Entering my body measurements as accurately as possible minimized the sliding. Commercial IMU-based mocap suits (Rokoko, Xsens) appear to prevent sliding by detecting in software when a foot is in contact with the floor, and it should be possible to filter motion in a similar way in Blender.
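To make the FK and foot-sliding points concrete, here is a minimal sketch in plain Python. Everything in it (function names, the 2D two-bone leg, the bone lengths, the floor and speed thresholds) is my own illustration of the general idea, not SlimeVR's or Rokoko's actual implementation:

```python
import math

# FK: a 2-bone leg chain in 2D (hip -> knee -> ankle). The ankle position
# is derived purely from the hip position plus per-bone rotations and
# lengths, so any error in the measured bone lengths accumulates at the
# foot -- which is why entering accurate body measurements reduces sliding.
def fk_leg(hip_pos, thigh_len, shin_len, hip_angle, knee_angle):
    knee = (hip_pos[0] + thigh_len * math.sin(hip_angle),
            hip_pos[1] - thigh_len * math.cos(hip_angle))
    ankle = (knee[0] + shin_len * math.sin(hip_angle + knee_angle),
             knee[1] - shin_len * math.cos(hip_angle + knee_angle))
    return ankle

# Foot-contact lock: when the foot is near the floor and barely moving,
# pin it to where it first touched down instead of letting it slide.
def foot_lock(frames, floor_y=0.02, max_speed=0.01):
    locked = None
    out = []
    for (x, y) in frames:
        speed = abs(x - out[-1][0]) if out else 0.0
        if y <= floor_y and speed <= max_speed:
            if locked is None:
                locked = (x, y)  # foot just planted: remember this spot
            out.append(locked)
        else:
            locked = None  # foot lifted: pass the raw position through
            out.append((x, y))
    return out
```

A real filter would work on 3D positions and velocities per frame, but the logic is the same: contact detection gates whether the raw FK position or the pinned position is output.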

Encouraged by the results of the mocap test with SlimeVR and SteamVR, I am now considering buying Vive Trackers, but I am hesitating because of the relatively high price.
If anyone knows of other inexpensive mocap methods or has ideas for improvement, please let me know!

There's always Brekel. The hardest part, afaik, was parsing the data into TouchDesigner. It was also pretty power-hungry, I think, but that's more about the Azure Kinect in general:
-body-v3/
Pinging @MXZEHN, who has some in-depth experience.

Hi,
I want to share a music video that I made in Grasshopper. I used MIDI data from the recorded instruments (I play digital drums and piano) to trigger the events in Grasshopper. The skeleton tracking was done with a Kinect and recorded in Brekel Body.
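The MIDI-to-event idea can be sketched in plain Python. The function, the event format, and the note-to-trigger mapping below are my own illustration of the general approach, not the actual Grasshopper definition; the note numbers are the standard General MIDI percussion assignments:

```python
KICK, SNARE = 36, 38  # General MIDI percussion note numbers

def triggers_from_midi(events, mapping):
    """events: list of (time_sec, note, velocity) note-on tuples.
    mapping: dict of note number -> trigger name.
    Returns (time, trigger) pairs; unmapped notes and
    zero-velocity note-ons (i.e. note-offs) are ignored."""
    out = []
    for t, note, vel in events:
        if vel > 0 and note in mapping:
            out.append((t, mapping[note]))
    return out
```

For example, mapping the kick to a camera cut and the snare to a flash: `triggers_from_midi([(0.0, KICK, 100), (0.5, SNARE, 90), (1.0, 40, 80)], {KICK: "camera_cut", SNARE: "flash"})` yields the two timed triggers and drops the unmapped note.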

This is beyond cool! Wow!
Thanks for sharing, that is an awesome inspiration to start the summer with.
Great camera movements too, coherent with the mood of the music. A fun and great piece of artwork, I am truly impressed!
