
RealSense T265 Wheel Odometry configuration

victorbl
Beginner

I have integrated the T265 tracking camera into my robot stack and am seeing great results in general odometry tracking of the robot. I developed a small C++ package to stream the camera poses and can create the appropriate transforms and topics. In an attempt to wrap up the package features, I have spent the last week trying to integrate the wheel odometry input that the T265 expects, and I have a few questions to make sure I get this right.

 

For some brief details on my robot configuration: I am using the T265 on a differential drive robot, two motorized wheels centered on a circular base (the Parallax Arlo robot platform). I have mounted the T265 on the upper deck, at the very front of the robot, roughly 0.31 meters above the floor and 0.2 meters forward of the center of the robot and its wheels.

 

My questions

1. With the robot being differential drive (two motorized wheels), do I provide the T265 a single 'velocimeter' block in the JSON file, with the extrinsics transform between my robot's 'base_link' and the camera? The odometry velocity I would then send via 'send_wheel_odometry' would be the overall linear velocity of the robot at 'base_link'.

 

-or-

 

Do I provide the T265 two 'velocimeter' definitions, each defining the extrinsics transform for one individual wheel? Then presumably the odometry velocity sent for each velocimeter definition would be the linear velocity of that specific wheel. (A sketch of the calibration file I am referring to is just below.)
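For reference, this is the shape of calibration JSON I have been working from. It is modeled on the example wheel odometry calibration that ships with the RealSense ROS wrapper, so the key names reflect my best understanding rather than a confirmed spec, and the values are placeholders rather than my real extrinsics. The open question is whether the 'velocimeters' array should hold one entry (for 'base_link') or two (one per wheel):

{
    "velocimeters": [
        {
            "scale_and_alignment": [1.0, 0.0, 0.0,
                                    0.0, 1.0, 0.0,
                                    0.0, 0.0, 1.0],
            "noise_variance": 0.004,
            "extrinsics": {
                "T": [0.0, 0.0, 0.0],
                "T_variance": [9.999999e-10, 9.999999e-10, 9.999999e-10],
                "W": [0.0, 0.0, 0.0],
                "W_variance": [9.999999e-07, 9.999999e-07, 9.999999e-07]
            }
        }
    ]
}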

 

2. Is there a way to verify that the wheel odometry configuration and velocity stream provided to the T265 are behaving as expected? I tried covering the lenses of the camera, hoping it would fall back on the wheel odometry configuration and data stream for localization, but saw no significant pose changes coming from the camera based on the wheel odometry. It's possible I had this misconfigured, so I wanted to ask if there is a good way to verify that the provided wheel configuration and velocities match the camera's expectations and are oriented correctly with respect to my robot's transforms. (My current call sequence is sketched below.)
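For context, this is roughly the call sequence I am using through librealsense2's wheel_odometer sensor extension. It is a trimmed sketch: in the real package the velocity comes from the motor controller rather than being hard-coded, and the boolean return values are the only success signal I have found so far:

#include <librealsense2/rs.hpp>
#include <cstdint>
#include <fstream>
#include <iterator>
#include <vector>

int main()
{
    rs2::context ctx;
    rs2::device_list devices = ctx.query_devices();
    if (devices.size() == 0) return 1;

    // The T265 pose sensor exposes the wheel_odometer extension.
    rs2::device dev = devices.front();
    rs2::wheel_odometer wo_sensor = dev.first<rs2::wheel_odometer>();

    // Load the velocimeter calibration JSON as a raw byte blob.
    // (The 'odometery' spelling below matches the SDK header, as far as I can tell.)
    std::ifstream calib_file("calibration_odometry.json", std::ios::binary);
    std::vector<uint8_t> calib_buf((std::istreambuf_iterator<char>(calib_file)),
                                   std::istreambuf_iterator<char>());
    bool loaded = wo_sensor.load_wheel_odometery_calibration(calib_buf);
    // Returns false if the device rejects the configuration.

    // Start the 6-DOF pose stream.
    rs2::pipeline pipe;
    rs2::config cfg;
    cfg.enable_stream(RS2_STREAM_POSE, RS2_FORMAT_6DOF);
    pipe.start(cfg);

    uint32_t frame_num = 0;
    while (loaded)
    {
        // Linear velocity in m/s -- the robot velocity at base_link, or a per-wheel
        // velocity, depending on the answer to question 1. Placeholder value here.
        rs2_vector velocity{ 0.0f, 0.0f, 0.1f };
        bool sent = wo_sensor.send_wheel_odometry(0 /* wo_sensor_id */, frame_num++, velocity);
        (void)sent; // false means the sample was rejected; otherwise there is no feedback

        rs2::frameset frames = pipe.wait_for_frames();
        rs2::pose_frame pf = frames.first_or_default(RS2_STREAM_POSE);
        if (pf)
        {
            rs2_pose pose = pf.get_pose_data();
            (void)pose; // published as transforms/odometry topics in the real package
        }
    }
    return 0;
}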

 

Thank you in advance for any insight into these two questions. I've done some fairly deep digging around the internet and have had no luck getting clarity on these open items in my implementation.

 

MartyG
Honored Contributor III

The RealSense ROS forum is the best place to post this message to get expert advice on this topic. Thanks!

 

https://github.com/IntelRealSense/realsense-ros/issues

victorbl
Beginner

Thanks Marty. I wouldn't really classify this as an issue to file in the RealSense ROS repository; I'm more seeking clarification on how to use the APIs exposed in the T265 SDK. I cleaned up my question to make it not specific to ROS and more focused on the appropriate use of the T265's wheel odometry configuration and API.

MartyG
Honored Contributor III

Thanks Victor. I am the only person available on this particular forum and my T265 knowledge is limited. The guys with the really deep knowledge of it are on the Librealsense and ROS GitHub forums.

Having said that, Intel's guide to getting IMU data from the T265 is well worth reading, and it has an API sample script for getting pose data at the end of it.

 

https://www.intelrealsense.com/how-to-getting-imu-data-from-d435i-and-t265/
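If it helps while you wait for a reply there, the pose-retrieval part of that sample boils down to something roughly like the snippet below. This is a simplified sketch rather than the exact script from the article:

#include <librealsense2/rs.hpp>
#include <cstdio>

int main()
{
    // Enable only the T265's 6-DOF pose stream.
    rs2::pipeline pipe;
    rs2::config cfg;
    cfg.enable_stream(RS2_STREAM_POSE, RS2_FORMAT_6DOF);
    pipe.start(cfg);

    for (int i = 0; i < 100; ++i)
    {
        rs2::frameset frames = pipe.wait_for_frames();
        rs2::pose_frame pf = frames.first_or_default(RS2_STREAM_POSE);
        if (!pf) continue;

        rs2_pose pose = pf.get_pose_data();
        std::printf("translation: %.3f %.3f %.3f  tracker confidence: %u\n",
                    pose.translation.x, pose.translation.y, pose.translation.z,
                    pose.tracker_confidence);
    }
    return 0;
}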

victorbl
Beginner

OK, thanks for the additional context, Marty. I will take this to the RealSense ROS GitHub issues forum.
