IMU Coordinate Conversion Question


Ross Lunan

Sep 14, 2025, 8:13:05 PM
to HomeBrew Robotics Club
Regarding REP-103, which defines the ROS axis orientation as X forward, Y left, Z up (FLU): I have an OAK-D-Lite camera with an included IMU, but it publishes its data using the standard camera coordinate system, X right, Y down, Z forward (RDF), including the axis rotation in the quaternion. So I figure that to use this IMU on a robot, there needs to be a running node that subscribes to the RDF data and publishes FLU quaternion data. I made a stab at creating a ROS 2 package, using Google AI, that handles the axes only (untested!). I ran out of time to handle the rest of the quaternion data, but it is presumably doable. Has anybody encountered this requirement and/or developed the code? Ross (You could ask why not just use a separate $10 IMU? Certainly; I have an MPU9250 working great.)

James H Phelan

Sep 15, 2025, 7:25:07 AM
to hbrob...@googlegroups.com

The .urdf/.xacro files for cameras usually have a separate "optical" (or similarly named) link and joint to rotate the RDF optical axes so they line up with the physical FLU axes. Perhaps one can add a camera-IMU link/joint to accomplish the same for the camera's IMU.

James H Phelan
"Nihil est sine ratione cur potius sit quam non sit"
Leibniz

Ross Lunan

Sep 15, 2025, 8:04:16 PM
to HomeBrew Robotics Club
Firing up the camera w/IMU on the robot, a topic echo of /imu displays the captured frame below. So it seems I just need to write a Python script that subscribes, assigns the angular_velocity and linear_acceleration values into two in_msg Vector3s, converts them into out_msg Vector3s, and then assembles the published message with these two Vector3s plus zero orientation and covariances. Is this too simple?
Conversion:
x_out = z_in
y_out = x_in
z_out = -y_in
 
header:
  stamp:
    sec: 1757970940
    nanosec: 599164685
  frame_id: oak_imu_frame
orientation:
  x: 0.0
  y: 0.0
  z: 0.0
  w: 1.0
orientation_covariance:
All 0.0 
angular_velocity:
  x: -0.003195793367922306
  y: 0.011717909015715122
  z: 0.003195793367922306
angular_velocity_covariance:
All 0.0
linear_acceleration:
  x: 0.15322890877723694
  y: 9.818621635437012
  z: 0.2992752194404602
linear_acceleration_covariance
All 0.0
...
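
(For illustration, a minimal, untested sketch of the republisher node described above, using rclpy. The topic names, the output frame_id, and the exact signs in the axis remap are assumptions; verify the signs against the gravity vector in the echoed frame before trusting them.)

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Imu


class ImuRepublisher(Node):
    """Subscribe to the camera IMU topic and republish it with remapped axes."""

    def __init__(self):
        super().__init__('imu_republisher')
        # Topic names are assumptions; adjust to match the depthai_ros driver.
        self.sub = self.create_subscription(Imu, '/oak/imu/data', self.callback, 10)
        self.pub = self.create_publisher(Imu, '/imu/data_flu', 10)

    def callback(self, msg_in: Imu) -> None:
        msg_out = Imu()
        msg_out.header.stamp = msg_in.header.stamp
        msg_out.header.frame_id = 'imu_link'  # REP-103 compliant frame (assumption)

        # Axis remap following the mapping in the post above:
        # x_out = z_in, y_out = x_in, z_out = -y_in.
        msg_out.angular_velocity.x = msg_in.angular_velocity.z
        msg_out.angular_velocity.y = msg_in.angular_velocity.x
        msg_out.angular_velocity.z = -msg_in.angular_velocity.y

        msg_out.linear_acceleration.x = msg_in.linear_acceleration.z
        msg_out.linear_acceleration.y = msg_in.linear_acceleration.x
        msg_out.linear_acceleration.z = -msg_in.linear_acceleration.y

        # No orientation estimate in the echoed frame, so leave identity and
        # flag it per the sensor_msgs/Imu convention (covariance[0] = -1).
        msg_out.orientation.w = 1.0
        msg_out.orientation_covariance[0] = -1.0

        self.pub.publish(msg_out)


def main():
    rclpy.init()
    node = ImuRepublisher()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()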

Sergei Grichine

Sep 15, 2025, 9:20:30 PM
to hbrob...@googlegroups.com
Ross,

I don't think we have all the details here to guess what the solution to your problem could be.

James has mentioned that the optical frame and camera frame don't have to be aligned, but that relates to the visual data and probably doesn't address the IMU frame (or data) orientation. It deserves a second look though, as the Luxonis driver might be dealing with it.

I played with a Kickstarter "first batch" OAK-D Lite, which doesn't have an IMU - my notes are here: https://github.com/slgrobotics/robots_bringup/blob/main/Docs/Sensors/OAK-D_Lite.md#ros2-operation

I asked the Iron Man: "converting OAK-D camera IMU data to ros2" and got this:
"To convert OAK-D camera IMU data to ROS 2, you must use the official depthai_ros driver developed by Luxonis. 
The driver integrates the IMU chip data directly into the ROS 2 ecosystem, publishing it as standard sensor_msgs/msg/Imu messages on a specific topic."
(well, I'd guess you are doing exactly that)
Step 4: Address coordinate frame differences
The IMU's coordinate frame might not align with the standard ROS convention (Z-up).
    Luxonis forum posts indicate that for some models, you may need to flip the sign of a particular axis or apply a transform to align the data.
(I'd guess this is the problem you are facing)"

I asked it to elaborate:
 =============================================================
elaborate on "Address potential coordinate frame issues" - what would be a solution?
----------------------

"A primary solution to an incorrect coordinate frame is to apply a static transform using ROS 2's
static_transform_publisher. This tool publishes a fixed transform between two coordinate frames, which corrects the IMU's native frame to align with standard ROS conventions (Right-Handed, Z-up) and your robot's base frame. 
Why coordinate frames matter
ROS follows a standard convention called REP-103, which defines a right-handed coordinate system with X forward, Y left, and Z up. Many IMU sensors, including some in OAK-D cameras, may report data in a different, sometimes left-handed, frame. 
  • If you ignore the frame misalignment, all subsequent calculations will be wrong. An acceleration vector meant to be along the Z-axis might appear on the X-axis, or a positive rotation might be interpreted as negative.
  • This is especially critical for state estimation nodes like robot_localization, which expect specific coordinate frame conventions for proper fusion with other sensor data. 
How to apply a static transform in ROS 2
The goal is to define a transform from the IMU's native frame (e.g., oak_imu_frame) to a standard frame (imu_link or base_link) that follows REP-103.
Step 1: Characterize the IMU's frame
To determine the necessary transform, you need to understand how the OAK-D's IMU defines its axes.
  1. Test the axes: Launch the depthai_ros_driver and echo the IMU topic: ros2 topic echo /oak/imu/data.
  2. Move the camera: With the OAK-D stationary on a flat surface, note the gravity vector (linear_acceleration). It should be -9.8 m/s^2 along the axis that points towards the ground. Then, rotate the camera around each axis and observe how the angular velocity changes.
  3. Define the transform: Based on your observations, you can derive the rotation and translation needed to align the IMU frame with your desired frame (e.g., base_link for your robot). For example, if the IMU's native X-axis is pointed up, you'll need to apply a rotation to align it with the Z-axis. 
Step 2: Publish the static transform
You can create and publish the static transform using a launch file. This is preferable to the command line as it is repeatable and part of your system configuration. 
Create a launch file:
<launch>
  <node pkg="tf2_ros" exec="static_transform_publisher" name="imu_transform_publisher" args="0 0 0 1.5708 0 1.5708 oak_imu_frame imu_link" />
</launch>
  • args: The arguments define the transform. The standard format is x y z yaw pitch roll parent_frame child_frame.
  • Example values: The rotation values (1.5708 rad = 90 degrees) are illustrative. The correct values will depend on your specific camera model and mounting orientation. You must substitute the correct translation and rotation values based on your findings in Step 1.
  • Frames: Replace oak_imu_frame with the frame ID that the depthai_ros_driver uses for its IMU data, and imu_link with the REP-103 compliant frame you want to use. 
Step 3: Integrate with state estimation
Once your static transform is in place, you can use the corrected IMU data with a state estimation package like robot_localization. 
Example robot_localization setup:
The ekf_node in robot_localization can be configured via a YAML file.
  1. Subscribe to the transformed topic: Ensure the ekf_node is configured to listen to the correct IMU topic, which might be a remapped topic if you also use an imu_transformer node.
  2. Enable fusion: In the YAML configuration, enable the IMU for fusion:
    # Example for `ekf_filter_node.yaml`
    imu0: /oak/imu/data                 # topic name
    imu0_config: [false, false, false,  # x, y, z position
                  true,  true,  true,   # roll, pitch, yaw orientation
                  false, false, false,  # x, y, z linear velocity
                  true,  true,  true,   # roll, pitch, yaw angular velocity
                  false, false, false]  # x, y, z linear acceleration
    
Note that while orientation and angular velocity are fused, linear acceleration is often not used directly due to noise and sensitivity to gravity. The orientation is corrected relative to gravity, so a stationary IMU on a flat surface should read near zero roll and pitch. 

By following these steps, you can ensure that the IMU data from your OAK-D camera is properly transformed and ready for use in complex robotics applications in ROS 2.

  • Restart the launch file after saving your changes."
----------------------
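
The same static transform could also be expressed as a Python launch file; the rotation values below are the same illustrative placeholders as in the XML snippet above, and the frame ids should be replaced with the real ones.

from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    # Static transform with the same illustrative x y z yaw pitch roll
    # arguments as the XML example; replace with measured values.
    return LaunchDescription([
        Node(
            package='tf2_ros',
            executable='static_transform_publisher',
            name='imu_transform_publisher',
            arguments=['0', '0', '0', '1.5708', '0', '1.5708',
                       'oak_imu_frame', 'imu_link'],
        ),
    ])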


Here is a bunch of links it provided to confuse us, humans, even more:




--
Best Regards,
-- Sergei

Ross Lunan

Sep 15, 2025, 9:52:57 PM
to HomeBrew Robotics Club
Thank you very much for your analysis, suggestions, and references. Your assumptions about my problem statement are right on. Yes, I have a newer version of the Lite that has an angular-velocity (gyro) & accelerometer sensor and correctly publishes the Imu msg data, but on xyz coordinates not compliant with REP-103. I'll give your advice a try and write a node to republish that data correctly so the Nav2 fusion package can use it. I'm surprised this issue hasn't cropped up before now. Ross

Nima Rahnemoon

Sep 16, 2025, 12:20:30 PM
to HomeBrew Robotics Club
I think you have the right idea. Alternatively you can define a static tf transform and let the tf tree do the work.

flu: front left up
rdf: right down front

rotation_matrix_flu = rotation_matrix_rdf * [[0 0 1], [-1 0 0], [0 -1 0]]
^-- simply convert your rdf orientation to a rotation matrix and right-multiply by this matrix, and you'll have your front-left-up rotation matrix as the result.
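
(A quick numpy check of that matrix acting on vector components, treating it as the change of basis from RDF to FLU. The gravity value is illustrative only; the frame echoed earlier in the thread suggests the actual OAK IMU axes should still be verified empirically.)

import numpy as np

# Rows express the FLU axes in terms of RDF axes:
#   forward = z_rdf, left = -x_rdf, up = -y_rdf
C = np.array([[ 0,  0, 1],
              [-1,  0, 0],
              [ 0, -1, 0]], dtype=float)

# A stationary, ideal RDF sensor would read roughly +9.8 along "up",
# which is the -y direction in RDF, so the y component is -9.8.
accel_rdf = np.array([0.0, -9.8, 0.0])

accel_flu = C @ accel_rdf
print(accel_flu)  # [0. 0. 9.8] -> gravity shows up on +z (up), per REP-103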

Michael Wimble

Sep 16, 2025, 4:52:51 PM
to hbrob...@googlegroups.com
I don’t think you need to create any code. Any sensor data in ROS is tagged with a frame id, since sensor data is usually relative to the position of the sensor. Any code that wants the sensor data must use one of the tf2 transform calls to translate the data into a more useful frame, often the base_link frame. All you have to do is update the rpy value in the joint for the camera in the URDF and all should be right in the world.


Michael Wimble

Sep 16, 2025, 5:15:18 PM
to hbrob...@googlegroups.com
See my previous reply. In my repo at


where you will see, using the URDF macro from Luxonis (which is in the same directory):

<!-- <xacro:oakd_link mount_point="elevator_top" /> -->
  <xacro:oakd_joint mount_point="elevator_top">
    <origin xyz="-0.12 -0.02 1.47" rpy="0 ${(pi / 4)+13*(2.0*pi/360.0)} ${-0*(2.0*pi/360.0)}" />
  </xacro:oakd_joint>

You can see how I use the Oakd. Especially:

<!-- <xacro:oakd_link mount_point="elevator_top" /> -->
  <xacro:oakd_joint mount_point="elevator_top">
    <origin xyz="-0.12 -0.02 1.47" rpy="0 ${(pi / 4)+13*(2.0*pi/360.0)} ${-0*(2.0*pi/360.0)}" />
  </xacro:oakd_joint>

Which says my camera is slightly off center, 1.47 meters up a pole and tilted downward. 


Ross Lunan

Sep 16, 2025, 6:48:32 PM
to HomeBrew Robotics Club
Mike, yes, I looked at your OakD.urdf.xacro and can see the 1.57 rad axis rotations. But don't the ekf fusion and Nav2 expect data per REP-103 orientation? What's the connection between the URDF and the necessary corrected /imu topic coordinates? Ross

Michael Wimble

Sep 17, 2025, 3:10:32 AM
to hbrob...@googlegroups.com
Well, yes, but you might be thinking that the data produced by the sensor must be oriented to REP-103. That’s not true. Sensor data can be in any orientation it wants, as long as it is tagged with a frame id. And there needs to be someone generating a ’tf’ message that connects that frame id to some link in the tf tree. The easiest way to do all of this is to create a URDF entry that declares a link with the same name as the frame id in the sensor message, and then a joint that connects that link to some other link in the tree of frames. If you do it in the URDF, then when robot_state_publisher reads the URDF (usually after it has been preprocessed by xacro), it turns all the static joints into a set of static transforms, so you don’t have to create a static transform yourself. For all the non-static links (like wheels), robot_state_publisher creates a listener for joint_state messages and, with that data, creates a dynamic transform for the link.

Which is a long way of saying what I talked about before. Anyone that wants to read the IMU or video from the OAK-D doesn’t just read the topic data and use it directly. It is required that you ALWAYS use the tf2 system to transform the data before you can use it. The IMU could be attached to a spinning ball and robot_state_publisher or some other code will be responsible for generating a transform from the frame_id in the topic containing the sensor data to some other link in the frame tree—which ultimately has to connect to the base_frame in the robot (not technically true, but it’s rare that it is not true).

As an example of this, some of us create wheels in the URDF that aren’t just cylinders turned on their side. My wheels, for instance, have a black outer ring representing the rubber tire, a thinner grey disk in the center representing the wheel holding the tire, and then a radially oriented rectangle in a different color: a stripe along the radius of the wheel. When the wheel turns, the motor driver publishes an odom and/or joint_state message which robot_state_publisher uses to generate a transform describing how the wheel has turned. And since the rectangle is attached to the rotating wheel, rviz2 will show the stripe rotating around the wheel as it turns. All because of the tf2 system.

So, ekf will not just read the IMU data and use it; it will read the data and use a tf2 operation to transform it to the REP-103 coordinate system. The output after the tf2 transform will make the odom data’s x-axis point along the same x-axis as the base_link frame, even if the IMU was spinning around. All you need to do is make sure that the URDF, or some custom code you write (custom code is the hard way to do it), ultimately publishes the transform from the sensor’s frame id to some frame already in the robot's frame tree. The same goes for the camera itself: you can’t know where a pixel is in the universe until you first transform it.

Does that make sense to you?
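
(For illustration, a minimal, untested sketch of the consumer-side pattern described above: look up the transform from the IMU's frame_id to base_link via tf2 and rotate the vectors before using them. The topic and frame names are assumptions.)

import numpy as np
import rclpy
from rclpy.node import Node
from rclpy.time import Time
from sensor_msgs.msg import Imu
from tf2_ros import Buffer, TransformListener


def quat_rotate(q, v):
    """Rotate vector v by quaternion q = (x, y, z, w)."""
    x, y, z, w = q
    rot = np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])
    return rot @ np.asarray(v)


class ImuConsumer(Node):
    """Read raw IMU data and express it in base_link using the tf tree."""

    def __init__(self):
        super().__init__('imu_consumer')
        self.tf_buffer = Buffer()
        self.tf_listener = TransformListener(self.tf_buffer, self)
        self.sub = self.create_subscription(Imu, '/oak/imu/data', self.callback, 10)

    def callback(self, msg: Imu) -> None:
        try:
            # Static transform typically published by robot_state_publisher
            # from the URDF joint declared for the IMU link.
            tf = self.tf_buffer.lookup_transform(
                'base_link', msg.header.frame_id, Time())
        except Exception as exc:  # TF may not be available yet
            self.get_logger().warn(f'TF not available yet: {exc}')
            return
        q = tf.transform.rotation
        accel_base = quat_rotate(
            (q.x, q.y, q.z, q.w),
            (msg.linear_acceleration.x,
             msg.linear_acceleration.y,
             msg.linear_acceleration.z))
        self.get_logger().info(f'acceleration in base_link: {accel_base}')


def main():
    rclpy.init()
    rclpy.spin(ImuConsumer())
    rclpy.shutdown()


if __name__ == '__main__':
    main()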

Ross Lunan

unread,
Sep 17, 2025, 12:43:52 PM (5 days ago) Sep 17
to hbrob...@googlegroups.com
Mike
Yes, that makes sense, and thanks. I regret missing last night's Tuesday Forum with its relevant subjects :(

My MockBot Roomba/Create project does have a working URDF in its project description package with tf transforms, but unfortunately it does not (yet) have a camera and IMU URDF xacro - now on my ToDo list :). For now, to simplify this project work, I'm using an MPU9250 where the RPY coordinates line up with the chip.
Sent from Ross Lunan’s iPad
