It seems temporal calibration between an optical tracker's poses and timestamped robot-arm joint angles (or Cartesian end-effector pose) could be done in a manner similar to the camera-IMU calibration approach in Kalibr. I already have a good hand-eye calibration, but do you have any recommendations for performing the temporal calibration, and for adapting the Jacobians fed to the Levenberg-Marquardt (LM) optimizer provided by Kalibr and described in the paper "Unified Temporal and Spatial Calibration for Multi-Sensor Systems"?
Since the algorithm appears designed to be run online at startup, how would clock drift be accounted for in this case? Drift was listed as future work in the paper, so I'm interested to hear whether there have been any updates.
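In case it's useful for discussion: one pragmatic idea I've considered (my own sketch, not from the paper or Kalibr) is to re-estimate the constant offset over sliding windows during a run and fit an affine clock model d(t) = d0 + d1*t, then re-stamp the slower stream with it:

```python
import numpy as np

def fit_clock_model(window_centers, window_offsets):
    """Least-squares fit of an affine clock model d(t) = d0 + d1*t to
    per-window constant-offset estimates. d1 captures drift in seconds
    of offset per second of run time."""
    A = np.column_stack([np.ones(len(window_centers)), window_centers])
    d0, d1 = np.linalg.lstsq(A, np.asarray(window_offsets), rcond=None)[0]
    return d0, d1

def correct_stamps(t_b, d0, d1):
    """Re-stamp stream B's timestamps into stream A's clock."""
    t_b = np.asarray(t_b)
    return t_b - (d0 + d1 * t_b)
```

This obviously assumes the drift is well-approximated as linear over the session, which may or may not hold for the clocks involved.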
Hardware details
----------------
Optical tracker: ~330 Hz update rate, 4 ms latency
Robot arm: update rate configurable between 100 and 1000 Hz, latency 1-10 ms
For those not familiar, an optical tracker outputs the pose of a marker, much like AR tags or AprilTags. With these devices you do not have access to the camera images, so it is not possible to use the calibration targets detailed in the paper. Here is a detailed explanation: http://www.iotracker.com/indexdaed.html?q=optical_tracking