PointCloud v3 can't get calibration to work at all


Andrew Hogue

Mar 24, 2021, 10:49:59 AM
to Brekel
Hi,

For the life of me I cannot get the calibration working to align even 2 of my 4 cameras, let alone all 4.

I have a project starting in May that requires us to make volumetric captures of performances, and I have yet to acquire any usable data since starting in September (progress has been slow due to COVID).

I have uploaded recordings of my attempts to get the calibration set up, trying all settings both with markers and with the point cloud, to no avail.

Here are the links to the videos:

Can someone please help me get this set up and working?

Setup Info:
Brekel Version: 0.73 Beta.
4x Azure Kinect: SDK version 1.4.1, Body Tracking version 1.0.1
4 PCs, each dedicated to one Azure Kinect.
Good network (university campus lab, local, all on Ethernet)

Thanks!

-- Andrew Hogue, Professor, OntarioTech University

Andrew Hogue

Mar 24, 2021, 1:21:38 PM
to Brekel
I moved my cameras so they are all at similar heights, all in the same orientation, all pointing towards the center of a small volume. I finally got the marker calibration to produce something reasonable, but it certainly isn't tremendously accurate.

I performed a short recording to test the configuration, but it seems the clips are not synchronized at all. I've tried to align the clips using Timestamps (an option in the timeline editor) but they are way off. Here is a link to a recording of it:
https://drive.google.com/file/d/1SUrZ1z259vITQnLBhythxjKADa2zkOL5/view?usp=sharing

I'm so close I can taste it.  

I'm worried, though, that if the calibration method is this fickle I won't be able to align a full 360° volume; any tips are appreciated. I have 2 more Azures to set up and more coming...

Andrew Hogue

Mar 24, 2021, 1:34:41 PM
to Brekel
If it helps, here is a picture of my messy setup. The cameras are so close that there is almost no point in having so many of them... but I can't get the calibration to work if they are in other orientations or further apart.

Any suggestions on camera placement? I want to be able to acquire a complete 360° volume in my small recording space...



[Attachments: IMG_5942.jpg, IMG_5943.jpg]

Brekel

Mar 25, 2021, 11:28:33 AM
to Brekel
Hi Andrew,

Indeed very weird that the timestamps are so far off; this could definitely affect calibration quality as well, since the solver may reject samples that are too far apart (or interpolate them incorrectly).

When looking at the sensor table in the GUI, do those timestamps align? (It should automatically synchronize system clocks within the first few seconds after connecting.)
Is there any indication that one (or more) of your machines is slowing down due to high CPU/GPU usage?
Are you recording to local disks, so there is no bottleneck in saving data?
What does the message log output look like after an alignment? (Please attach a full log, as it contains a lot of info that helps debugging.)

Note that running multiple machines with networked sensors is definitely more challenging than connecting all sensors to a single computer (which Azure Kinect supports); that should also be beneficial for hardware sync of the sensors, afaik.
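For reference, the automatic clock synchronization mentioned above is conceptually an NTP-style exchange: the client timestamps a request, the server timestamps receipt and reply, and the client timestamps the response. A minimal sketch of the textbook math (this is not Brekel's actual implementation, just the standard formula):

```python
def estimate_offset(send_local, recv_remote, reply_remote, recv_local):
    """Classic NTP-style clock offset / roundtrip estimate from one exchange.

    All arguments are timestamps in seconds:
      send_local   - client clock when the request was sent
      recv_remote  - server clock when the request arrived
      reply_remote - server clock when the reply was sent
      recv_local   - client clock when the reply arrived
    """
    roundtrip = (recv_local - send_local) - (reply_remote - recv_remote)
    offset = ((recv_remote - send_local) + (reply_remote - recv_local)) / 2.0
    return offset, roundtrip

# Simulated exchange: server clock 0.8 s ahead, 10 ms travel each way,
# 2 ms server processing time.
offset, rtt = estimate_offset(
    send_local=100.000,
    recv_remote=100.810,
    reply_remote=100.812,
    recv_local=100.022,
)
# offset ≈ 0.8 s, rtt ≈ 0.020 s
```

If the estimated offset between machines really ended up around a second, that would match the gap seen in the timeline.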


On Wednesday, March 24, 2021 at 18:34:41 UTC+1, Andrew Hogue wrote:

Andrew Hogue

Mar 25, 2021, 1:47:42 PM
to Brekel
Yeah, I'll have to check the timestamps in the sensor table. What else should I be checking? I can only get into the lab on Wednesdays these days for a few hours (COVID restrictions), so I'd like to make sure I collect all of the info required.

Recording to local disks and then transferring the data afterwards.

I will try different machines, but I doubt it's the machines; there are no indications of slowdown on any of them.

OK, I'll grab the log next Wednesday when I try again.

The problem with a single machine, though, is USB bandwidth... I'll try to put 2 on each machine next week, so only 2-3 machines instead of 4-6. I don't think I have any PC with 2 USB 3 buses, so I doubt I could use more than 2 per machine.
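Back-of-the-envelope arithmetic on the bandwidth (all figures below are my assumptions for illustration, not measured values):

```python
def stream_mbps(width, height, bits_per_px, fps):
    """Raw (uncompressed) stream bitrate in megabits per second."""
    return width * height * bits_per_px * fps / 1e6

# Assumed NFOV unbinned mode: 640x576 depth + IR, 16-bit, 30 fps
depth = stream_mbps(640, 576, 16, 30)   # ~177 Mbps
ir = stream_mbps(640, 576, 16, 30)      # IR stream, same size
color = 200.0                           # assumed average for MJPEG 1080p30

per_sensor = depth + ir + color         # ~554 Mbps
usb3_practical = 3200.0                 # ~3.2 Gbps usable of USB 3.0's 5 Gbps
sensors_per_bus = int(usb3_practical // per_sensor)  # 5 by raw bitrate alone
```

Raw bitrate alone would seem to allow several sensors per bus, so afaik the practical limit of roughly 1-2 devices per controller comes from how USB 3 reserves isochronous bandwidth per device, not from the raw numbers.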

Anything else I should be looking to test next week?

Brekel

Mar 25, 2021, 5:10:56 PM
to Brekel
Well, collecting a message log from all your machines will help, as it gives me a reference on your machines and setup.

If you're using sync cables, unhooking those is probably also a good idea, as they will not work properly across networked machines (the main machine cannot know up front which sensors are primary/secondary, and can't start them in sequence as required by the drivers).
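For context, the Azure Kinect wired-sync rule is that subordinate devices must have their cameras started before the master, since the master's sync pulse triggers capture on the whole chain. That's exactly the ordering a networked setup can't guarantee. A sketch of the ordering logic a hypothetical coordinator would need (`Sensor` and `start_order` are made-up names, not Brekel or SDK API):

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    serial: str
    sync_mode: str  # "master", "subordinate", or "standalone"

def start_order(sensors):
    """Return sensors in a safe start order: subordinates first, master last."""
    rank = {"subordinate": 0, "standalone": 1, "master": 2}
    return sorted(sensors, key=lambda s: rank[s.sync_mode])

chain = [Sensor("A", "master"), Sensor("B", "subordinate"), Sensor("C", "subordinate")]
ordered = [s.serial for s in start_order(chain)]  # ["B", "C", "A"]
```

With networked clients the GUI simply never has this list up front, which is the crux of the problem.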

Depending on your machines' power, starting with a low color resolution (720p or 1080p) is probably also a good idea.

Using manual exposure and making sure the marker board has good contrast across views helps, as does making sure it's stuck to a rigid surface and can't bend.

On Thursday, March 25, 2021 at 18:47:42 UTC+1, Andrew Hogue wrote:

Andrew Hogue

Mar 25, 2021, 6:57:41 PM
to Brekel
I'll grab the logs for sure on wednesday.
Remove the sync cables... interesting... maybe that's the issue, I wonder.

I'll try removing the sync cables, but if that changes the behaviour then I would suggest adding some options to the camera setup configuration where you can specify the master/slave configuration (e.g. make the first camera on the system the master, or have a dropdown to select one) and then define the ordering in the daisy chain. This would also allow you to initialize/re-initialize the cameras with the appropriate timing offsets for the pattern projection.

Or some option to not initialize the cameras until the master server responds to the client, I think that would be a great option to include in the interface.

Brekel

Mar 25, 2021, 7:08:05 PM
to Brekel
The GUI has no knowledge of clients before it connects with them, and it has no knowledge about client sensors (and, in the case of Azure Kinect, if/how sync cables are connected to them) before they are connected (which happens at client start). So it has no control over the order in which sensors are started (which is what the Azure Kinect drivers require for sync).

So, in short, networked sensors and Azure Kinect sync cables currently don't work together; sync cables only work when all sensors are connected to a single machine (which is possible with one or more PCI Express cards, for example).

On Thursday, March 25, 2021 at 23:57:41 UTC+1, Andrew Hogue wrote:

Andrew Hogue

Mar 25, 2021, 7:16:06 PM
to Brekel
Yeah, I see that. I'm just saying I think it would be a good option to have the headless client wait until it is connected before initializing the cameras; then an option in the main interface would let you specify the order in which to tell the clients to init. It would be a good advanced option for those who need control over the sync.

Is there a way, then, to use an external sync signal? If I have a separate timecode/sync generator and pump the appropriate signal into the Azures, is there a way to support this?

Brekel

Mar 25, 2021, 7:19:48 PM
to Brekel
Maybe someday.

On Friday, March 26, 2021 at 00:16:06 UTC+1, Andrew Hogue wrote:

Andrew Hogue

Mar 31, 2021, 2:42:43 PM
to Brekel
Tried out a few things today; slightly better, but still not usable.
The calibration is not working well, and the timecode is still not synchronized (way off).

I have uploaded the logs in this shared folder as well as some videos and the recorded .bpc clips so you can see what's going on here:
https://drive.google.com/drive/folders/1OZdz6BVmt8nfvTUTMFhhwFLLq4uyi0ml?usp=sharing

I'm disheartened by this, as there is so much potential here. However, I've been fighting with this for months and still have not been able to capture any usable data.

This is what I have tried today:
- mounted my calibration target on cardboard so it is rigid, and measured it (330 mm square)
- the timecode in the sensor view looks "sync'd", but it does look like there is still at least a 1 s delay between some of them
- calibration is "more reliable" when done from recordings rather than live
- calibration gives completely different results with the different modes (static, moving single pass, moving multi-pass)
- removed sync cables from the Azures
- multiple Azures per PC (still the same issues)
- multiple Azures per PC at lower framerates (15 fps)
- single Azure per PC at 30 fps
- single Azure per PC at 15 fps
- went around the lab and found 4 PCs with the same specs and set them up for this; each PC has a Core i7-9700K CPU @ 3 GHz (8 cores), RTX 2080 GPU, 32 GB RAM
- they are all connected via Ethernet on a local network; I did a speed test and consistently get >600 Mbps download, >930 Mbps upload
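For what it's worth, the offset between two clips can be quantified directly from their per-frame timestamps; something like this (hypothetical timestamps, not values from my actual .bpc clips):

```python
def sync_report(ts_a, ts_b):
    """Offset of clip B relative to clip A, plus drift over the overlap.

    ts_a, ts_b: per-frame timestamps in seconds for each clip.
    """
    start_offset = ts_b[0] - ts_a[0]
    n = min(len(ts_a), len(ts_b))
    end_offset = ts_b[n - 1] - ts_a[n - 1]
    return start_offset, end_offset - start_offset  # (offset, drift)

clip_a = [i / 30.0 for i in range(300)]        # ideal 30 fps clock
clip_b = [1.0 + i / 30.0 for i in range(300)]  # same rate, starts 1 s late
offset, drift = sync_report(clip_a, clip_b)    # offset ≈ 1.0 s, drift ≈ 0
```

A large start offset with near-zero drift would point at clock sync at connection time rather than frame-rate problems during recording.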

I am willing to help get this to work, but if there is no way to improve on the current state of things then I will need to find a different solution.

Today I also tried out an open-source solution (LiveScan3D); while the interface is nothing like yours, it took about 20 minutes to set up on my PCs, calibration worked immediately across the networked PCs, and it supports the sync cables to ensure synchronization.

Please let me know what I can do to help get this working.

Frustratingly yours,

-- Andrew Hogue

Brekel

Apr 1, 2021, 5:29:23 AM
to Brekel
Hi Andrew,

Thanks for the additional testing and files, I'll go through them to see if I can find some clues why things aren't working on your setup.

The 1 second delay sounds like it could be the culprit. When doing offline calibration it will try to interpolate a marker between frames (since it knows past and future frames), which can help, but a difference that large is not good for interpolation in any case.
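The interpolation in question is essentially a linear blend between the two neighboring timestamped frames; a sketch of the idea (not the actual calibration code):

```python
def lerp_marker(t, t0, p0, t1, p1):
    """Linearly interpolate a 3D marker position at time t between two frames.

    t0, p0: timestamp and (x, y, z) position of the earlier frame
    t1, p1: timestamp and (x, y, z) position of the later frame
    """
    if not t0 <= t <= t1:
        raise ValueError("t must lie between the two frame timestamps")
    a = (t - t0) / (t1 - t0)
    return tuple(c0 + a * (c1 - c0) for c0, c1 in zip(p0, p1))

# Adjacent 30 fps frames, 33 ms apart: interpolation is reasonable here
p = lerp_marker(0.0165, 0.0, (0.0, 0.0, 1.0), 0.033, (0.01, 0.0, 1.0))
```

Over 33 ms a moving marker board is approximately linear; over a full second it is not, which is why a large timestamp gap breaks this.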


I understand your frustration; trust me, it's just as frustrating being on the other end.
Hopefully my efforts of completely rewriting the calibration module and redesigning some of the algorithms will soon help.

On Wednesday, March 31, 2021 at 20:42:43 UTC+2, Andrew Hogue wrote:

Andrew Hogue

Apr 6, 2021, 2:03:35 PM
to Brekel
Hi again!

I'll be in the lab tomorrow and able to test out new configurations.  Is there any data you think would be helpful that I can capture for you in order to track down the issues?
Let me know!
Thanks,

Brekel

Apr 7, 2021, 4:28:21 AM
to Brekel

The latest Beta v0.74 should log details about the first few timestamp synchronization packets on the console/headless client machines, which should help diagnose whether UDP packets are being received and whether there are large variations in network roundtrip times between the server/client machines. I still suspect this is the main reason the aligner is rejecting marker frames (too large a timestamp gap).
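For anyone reading along, the relevant computation on those logged roundtrip times is straightforward: large jitter means offset estimates can't be trusted, and a common trick is to keep only the exchanges close to the minimum roundtrip (hypothetical numbers below, not from Andrew's logs):

```python
import statistics

def roundtrip_stats(rtts_ms):
    """Summarize a list of measured UDP roundtrip times in milliseconds."""
    return {
        "min": min(rtts_ms),
        "median": statistics.median(rtts_ms),
        "jitter": statistics.pstdev(rtts_ms),
    }

rtts = [1.2, 1.4, 1.3, 45.0, 1.2, 1.5, 60.2, 1.3]  # two congested outliers
stats = roundtrip_stats(rtts)
# Keep only clean samples: roundtrips within 2x the best one seen
best = [r for r in rtts if r <= stats["min"] * 2]
```

On a healthy wired LAN the min and median should sit close together at a millisecond or two; big gaps between them suggest congestion or a busy machine.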
On Tuesday, April 6, 2021 at 20:03:35 UTC+2, Andrew Hogue wrote: