To view camera data from an Intel RealSense camera, you need to install the ROS Wrapper for Intel RealSense cameras. This allows you to use Intel RealSense cameras with ROS1 or ROS2. Once you have installed the wrapper, you can start the camera node and stream camera sensors, which will be published on the appropriate ROS topics.
To start the camera node in ROS2, plug in the camera, then enter the following command:
source /opt/robot_devkit/robot_devkit_setup.bash
# To launch with "ros2 run"
ros2 run realsense_node realsense_node
# Or use "ros2 launch"
ros2 launch realsense_examples rs_camera.launch.py
Launching RVIZ2 afterwards will display the five streams: color, depth, infra1, infra2, and pointcloud.
| Characteristics | Values |
| --- | --- |
| Command to start camera node in ROS2 | `source /opt/robot_devkit/robot_devkit_setup.bash`, then `ros2 run realsense_node realsense_node` or `ros2 launch realsense_examples rs_camera.launch.py` |
| Command to start camera node in ROS1 | `roslaunch realsense_camera r200_nodelet_default.launch` |
| Command to start camera node in ROS2 (alternative) | `ros2 run realsense2_camera realsense2_camera_node --ros-args -p enable_color:=false -p spatial_filter.enable:=true -p temporal_filter.enable:=true` |
| Command to start camera node in ROS1 (alternative) | `roslaunch realsense2_camera rs_camera.launch initial_reset:=true` |
| Command to check if depth-topic data is being published | `rostopic echo /camera/depth/image_rect_raw` |
| Command to check the live FPS of the depth stream | `rostopic hz /camera/depth/image_rect_raw` |
| Command to set Kimera's online mode to True | `roslaunch kimera_vio vio.launch online:=True` |
| Command to launch RVIZ2 and display the five streams (color, depth, infra1, infra2, pointcloud) | `source /opt/robot_devkit/robot_devkit_setup.bash`, then `rviz2` |
| Command to set camera name and camera namespace (ros2 launch) | `ros2 launch realsense2_camera rs_launch.py camera_namespace:=robot1 camera_name:=D455_1` |
| Command to set camera name and camera namespace (ros2 run) | `ros2 run realsense2_camera realsense2_camera_node --ros-args -r __node:=D455_1 -r __ns:=robot1` |
| Command to set the depth profile and enable the pointcloud | `ros2 launch realsense2_camera rs_launch.py depth_module.depth_profile:=1280x720x30 pointcloud.enable:=true` |
Starting the camera node
To start the camera node in ROS2, plug in the camera, then type the following command:
source /opt/robot_devkit/robot_devkit_setup.bash
To launch with "ros2 run" type:
ros2 run realsense_node realsense_node
Or use "ros2 launch" and type:
ros2 launch realsense_examples rs_camera.launch.py
This will stream all camera sensors and publish on the appropriate ROS2 topics. PointCloud2 is enabled by default.
To launch RVIZ2 and display the five streams (color, depth, infra1, infra2, pointcloud), type the following in Terminal #2:
source /opt/robot_devkit/robot_devkit_setup.bash
rviz2
If you are using ROS1, the following launch files are available:
- rs_rgbd.launch
- rs_d435_camera_with_model.launch
- rs_aligned_depth.launch
- rs_t265.launch
- demo_t265.launch
- rs_multiple_devices.launch
- rs_d400_and_t265.launch
If you are using ROS2, the following launch files are available:
- rs_launch.py
- rs_intra_process_demo_launch.py
To start the camera node, you can also use the following command:
ros2 run realsense2_camera realsense2_camera_node
Or, with parameters:
ros2 run realsense2_camera realsense2_camera_node --ros-args -p enable_color:=false -p spatial_filter.enable:=true -p temporal_filter.enable:=true
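Each parameter above is set with a separate -p flag. The same parameters can also be collected in a YAML file and loaded with the standard `--ros-args --params-file` mechanism (the file name below is hypothetical, and the node path assumes the default `camera` name and namespace):

```yaml
# realsense_params.yaml (hypothetical file name)
# Mirrors the command-line parameters above for the default /camera/camera node.
/camera/camera:
  ros__parameters:
    enable_color: false
    spatial_filter.enable: true
    temporal_filter.enable: true
```

The file would then be loaded with `ros2 run realsense2_camera realsense2_camera_node --ros-args --params-file realsense_params.yaml`.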
Camera name and namespace
The user can set the camera name and camera namespace to distinguish between cameras and platforms, which helps identify the right nodes and topics to work with. This is especially useful if you have multiple cameras (which may be of the same model) and multiple robots. For the first robot and first camera, the following parameters can be used:
With ros2 launch (via command line or by editing these two parameters in the launch file):
ros2 launch realsense2_camera rs_launch.py camera_namespace:=robot1 camera_name:=D455_1
With ros2 run (using remapping mechanism):
ros2 run realsense2_camera realsense2_camera_node --ros-args -r __node:=D455_1 -r __ns:=robot1
This will result in the following nodes and topics:
/robot1/D455_1
/robot1/D455_1/color/camera_info
/robot1/D455_1/color/image_raw
/robot1/D455_1/color/metadata
/robot1/D455_1/depth/camera_info
/robot1/D455_1/depth/image_rect_raw
/robot1/D455_1/depth/metadata
/robot1/D455_1/extrinsics/depth_to_color
/robot1/D455_1/imu
/robot1/D455_1/device_info
If no parameters are given, the default values will be `camera_namespace:=camera` and `camera_name:=camera`, resulting in nodes and topics such as:
/camera/camera
/camera/camera/color/camera_info
/camera/camera/color/image_raw
/camera/camera/color/metadata
/camera/camera/depth/camera_info
/camera/camera/depth/image_rect_raw
/camera/camera/depth/metadata
/camera/camera/extrinsics/depth_to_color
/camera/camera/imu
/camera/camera/device_info
ROS2 vs Optical Coordinate Systems
ROS2 and camera optical frames use different coordinate conventions. The ROS2 coordinate system is defined as (X: Forward, Y: Left, Z: Up), whereas the camera optical coordinate system is (X: Right, Y: Down, Z: Forward). This distinction matters when working with camera data and performing tasks such as path planning and path tracking for robots.
The Intel RealSense ROS2 Wrapper provides static and dynamic TF topics that allow users to convert between these two coordinate systems. The TF message expresses a transformation from the source coordinate frame ("header.frame_id") to the destination coordinate frame ("child_frame_id"). In RealSense cameras, the origin point (0,0,0) is taken from the left IR (infra1) position and named as the "camera_link" frame.
When working with camera data, it is essential to understand the relationship between the ROS2 and Optical Coordinate Systems. The Intel RealSense ROS2 Wrapper facilitates this conversion by providing the necessary TF topics and static TFs between each sensor coordinate and the camera base ("camera_link"). Additionally, it provides TFs from each sensor's ROS coordinates to its corresponding optical coordinates.
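As a rough sketch of the axis mapping described above (not a substitute for the wrapper's TF machinery), a point in the camera optical frame (X: Right, Y: Down, Z: Forward) can be re-expressed in the ROS2 frame (X: Forward, Y: Left, Z: Up) like this:

```python
# Convert a point from the camera optical frame (X right, Y down, Z forward)
# to the ROS2 frame (X forward, Y left, Z up):
#   ros_x = optical_z, ros_y = -optical_x, ros_z = -optical_y

def optical_to_ros(x: float, y: float, z: float) -> tuple:
    """Map optical-frame coordinates to ROS2-frame coordinates."""
    return (z, -x, -y)

# A point 1 m to the right, 2 m below, and 3 m in front of the camera
# is 3 m forward, 1 m to the right (negative left), 2 m down (negative up).
print(optical_to_ros(1.0, 2.0, 3.0))  # (3.0, -1.0, -2.0)
```

In practice the TF topics published by the wrapper handle this conversion, including the per-sensor offsets; the snippet only illustrates the pure axis permutation.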
The Intel RealSense ROS2 Wrapper supports cameras such as the D400 series and T265, and it can be installed on Ubuntu and Windows. It is worth noting that the ROS2 Wrapper for Intel RealSense cameras is specifically designed for ROS2, as the developers are focusing on the ROS2 distribution. While ROS1 support is still available, it is no longer the main focus.
In conclusion, when working with camera data from Intel RealSense cameras, understanding the difference between the ROS2 and optical coordinate systems is crucial. The Intel RealSense ROS2 Wrapper provides the tools and topics needed to convert between these coordinate systems, ensuring accurate data processing and enabling advanced robotics applications.
TF from coordinate A to coordinate B
The Intel RealSense ROS Wrapper allows users to access camera data from Intel RealSense cameras with ROS1 and ROS2. The wrapper provides static TFs (Transformations) between each sensor coordinate and the camera base, as well as TFs from each sensor's ROS coordinates to its corresponding optical coordinates.
The TF message expresses a transformation from the source coordinate frame ("header.frame_id") to the destination coordinate frame ("child_frame_id"). In Intel RealSense cameras, the origin point (0,0,0) is the left IR (infra1) position, named the "camera_link" frame. The depth, left IR, and "camera_link" coordinate frames therefore coincide.
For example, in the D435i module, the origin of the depth coordinate system is the centerline of the left infrared sensor. The origin of the RGB coordinate system is the centerline of the RGB sensor. When depth is aligned with colour, the origin of the depth coordinate system becomes the centerline of the RGB sensor.
Extrinsics from sensor A to sensor B:
Extrinsics refer to the position and orientation of sensor A relative to sensor B. If sensor B is the origin (0,0,0), then the extrinsics describe the location of sensor A relative to sensor B. For instance, in the D435i, the depth sensor is 0.0148m to the right of the RGB sensor when viewed from behind the camera.
To convert 2D image coordinates to 3D coordinates, the rs2_deproject_pixel_to_point function provided by Intel RealSense can be used. It relies on the camera intrinsics, which are published in sensor_msgs/CameraInfo; the pyrealsense2 library in Python can be used to obtain all the values the function requires.
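The idea behind rs2_deproject_pixel_to_point can be sketched with the simple pinhole model below. This ignores lens distortion (which the real librealsense function also corrects for), and the intrinsics values are made up for illustration:

```python
def deproject_pixel_to_point(u, v, depth, fx, fy, ppx, ppy):
    """Pinhole-model deprojection: pixel (u, v) plus a depth value (metres)
    to a 3D point in the camera optical frame.
    fx, fy: focal lengths in pixels; ppx, ppy: principal point."""
    x = (u - ppx) / fx * depth
    y = (v - ppy) / fy * depth
    return (x, y, depth)

# Illustrative intrinsics (not from a real calibration):
fx = fy = 600.0
ppx, ppy = 320.0, 240.0

# The principal point deprojects straight down the optical axis.
print(deproject_pixel_to_point(320, 240, 1.0, fx, fy, ppx, ppy))  # (0.0, 0.0, 1.0)
```

With real data, fx, fy, ppx, and ppy come from the K matrix of the sensor_msgs/CameraInfo message for the stream in question.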
Extrinsics from sensor A to sensor B
For example, in the case of the D435i, the extrinsic from depth to colour means the position of the depth sensor relative to the colour sensor. If we look at the X coordinates from behind the camera and assume that the RGB sensor is at (0,0,0), we can say that the depth sensor is 0.0148m (1.48cm) to the right of the RGB sensor.
The extrinsic message is made up of two parts:
- float64[9] rotation (column-major 3x3 rotation matrix)
- float64[3] translation (three-element translation vector, in metres)
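Applying such an extrinsic to a point is a single rotation plus translation (p_B = R * p_A + t). A minimal sketch, using an identity rotation and the 1.48 cm depth-to-colour translation from the D435i example above (the sign of the translation is assumed here for illustration):

```python
def apply_extrinsics(rotation, translation, point):
    """Transform a 3D point from sensor A's frame to sensor B's frame.
    rotation: column-major 3x3 matrix as a flat list of 9 floats.
    translation: 3 floats, in metres."""
    r, t, p = rotation, translation, point
    # Column-major layout: element (row i, col j) is rotation[i + 3*j].
    return tuple(
        r[i] * p[0] + r[i + 3] * p[1] + r[i + 6] * p[2] + t[i]
        for i in range(3)
    )

identity = [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
depth_to_color_t = [0.0148, 0.0, 0.0]  # ~1.48 cm, per the D435i example

# A point 1 m straight ahead of the depth sensor, expressed in the colour frame:
print(apply_extrinsics(identity, depth_to_color_t, (0.0, 0.0, 1.0)))
# (0.0148, 0.0, 1.0)
```

The column-major indexing matches the flat float64[9] layout of the extrinsic message described above.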
Frequently asked questions
To start the camera node in ROS2, plug in the camera, then type the following command:
> source /opt/robot_devkit/robot_devkit_setup.bash
Then launch with either "ros2 run" or "ros2 launch":
> ros2 run realsense_node realsense_node
> ros2 launch realsense_examples rs_camera.launch.py
To see the camera data, launch RVIZ2. This will display the five streams: color, depth, infra1, infra2, pointcloud.
The camera node examples are:
- PointCloud visualization
- Open From RosBag File Example
- Multiple D400 cameras
- Launch Parameters From Yaml File
- Launch Params From File
The ROS2 supported distributions are:
- Rolling Ridley (Ubuntu 24.04 Noble Numbat) - in Development phase
- Jazzy Jalisco (Ubuntu 24.04 Noble Numbat) - in Development phase - LTS
- Iron Irwini (Ubuntu 22.04 Jammy Jellyfish)
- Humble Hawksbill (Ubuntu 22.04 Jammy Jellyfish) - LTS