
Simple PointCloud2 program

GSR
Beginner

Hi there!

I'm in the process of learning ROS and I'm trying to write a simple program. I'd like to use my Euclid's depth camera to set a threshold distance: when an object comes closer than that, I want to print a warning message. In theory it's quite simple, but I couldn't find any documentation about it.

I know I could use the pointcloud_to_laserscan node, but I'd also like to visualize it in RViz.

Do you guys have any idea?

Thanks!

Meitav_K_Intel
Employee

Hi Georgge,

If calibrated correctly, the depth image gives you 16-bit values (0-65535) in millimeters, so each pixel's intensity corresponds to the distance it measures.

That means that if you want to raise an alert when an object comes closer than 4 meters, you should filter for pixels with values under 4000 in the depth image.

Note that objects closer than the ZR300's minimum Z range might not show up properly.
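
Something along these lines should do it - a minimal sketch, assuming the depth topic is /camera/depth/image_raw and a 16UC1 depth image in millimeters; adjust the topic name to whatever your driver actually publishes:

    #!/usr/bin/env python
    # Minimal sketch: warn whenever anything in the depth image is closer than 4 m.
    # Assumes a 16UC1 depth image in millimeters on /camera/depth/image_raw.
    import rospy
    from sensor_msgs.msg import Image
    from cv_bridge import CvBridge

    THRESHOLD_MM = 4000  # 4 meters
    bridge = CvBridge()

    def depth_callback(msg):
        # Convert the ROS Image (16UC1, millimeters) into a numpy array
        depth = bridge.imgmsg_to_cv2(msg, desired_encoding='passthrough')
        # Zero pixels mean "no depth data", so ignore them before thresholding
        valid = depth[depth > 0]
        if valid.size and valid.min() < THRESHOLD_MM:
            rospy.logwarn("Object closer than %.1f m: nearest point at %d mm",
                          THRESHOLD_MM / 1000.0, valid.min())

    rospy.init_node('depth_threshold_warning')
    rospy.Subscriber('/camera/depth/image_raw', Image, depth_callback)
    rospy.spin()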

Let me know if that helps and if you need any further assistance.

Regards,

Meitav

Intel Euclid Development Team

GSR
Beginner

I'm having trouble accessing the "intensity" value of the depth node. Also, I don't know how to get that intensity into RViz so I can visualize the positions of the objects in front of the sensor in real time.

Help would be appreciated.

GSR
Beginner

OK, so after some research I've managed to display the PointCloud in RViz. It doesn't give me any option for mapping (i.e. adding the already-scanned points to form a map of the room), and it certainly doesn't give any information about the distance in mm to each wall/object. Any ideas, MeitavKleinfeld?

Meitav_K_Intel
Employee

I might not have explained myself well.

The intensity is simply the value of that pixel. For example, if the pixel located at (36, 201) corresponds to an object 3 meters away, its value (= its gray level, in normal images) would be around 3000, and it would probably show as dark gray in the image.

As for your other question - RViz can't map the points from the point cloud into a static map, since that usually requires a lot of estimation that RViz doesn't do on its own.

What I'd recommend is to use RTAB-Map, which estimates and builds a 3D map in real time and publishes it to a topic that you can then open in RViz. We have a tutorial for it right here: http://www.euclidcommunity.intel.com/static/tutorials/pdf/rtabmap.pdf

GSR
Beginner

I finally understood what you meant by the pixel intensity, thank you.

But now I face two problems. When I launch RViz and add a PointCloud2 topic, the intensity range is 0-4096; I tried to change it to 0-65535 but it doesn't seem to make a difference.

The other thing is that I already tried to follow the RTAB-Map tutorial, but since I don't have a TurtleBot available, I couldn't get it to work. I searched this forum and found that the roslaunch command should be modified to:

    roslaunch rtabmap_ros rtabmap.launch rtabmap_args:="--delete_db_on_start" depth_topic:=/camera/depth/image_raw rgb_topic:=/camera/color/image_raw camera_info_topic:=/camera/color/camera_info

but no matter how many combinations I tried (changing visual_odometry from true to false, removing it, adding cloud_floor_culling_height and changing it to different values...), I couldn't see any map because the image freezes almost instantly. Do you have any idea of what could be wrong?

Thanks.

Meitav_K_Intel
Employee

Hi Georgge,

That depends on the depth resolution you are measuring at. If you want to use the camera at longer range, you can modify the depth gain on your Euclid as described by Yermi in this topic:

As for the RTAB-Map tutorial, I'll try to put together a guide for getting it to work without a TurtleBot. The main issue is that RTAB-Map can very easily lose tracking without good odometry (such as from the TurtleBot); from my experience it becomes nearly impractical to use. I think that might be the problem you're having. There is a chance that making a good calibration file to use with RTAB-Map will give you better results.

You can also try ORBSLAM. It creates a sparse map (which isn't 100% what you're looking for, but it works much better when the camera is held by hand). I used it once to map my friend's house from outside, and it works great. I also have a tutorial for it here: http://www.euclidcommunity.intel.com/static/tutorials/pdf/orbslam/orbslam2_tutorial.pdf

Let me know if any of this works for you, and feel free to ask anything if you run into any trouble.

Regards,

Meitav

Intel Euclid Development Team

GSR
Beginner

So I've decided to take a step back and focus on getting the raw data of a single frame of the PointCloud. I'm trying to use PCL, and I've already looked at what kind of data a PointCloud2 message contains.

I'm facing several problems:

- import pcl (in Python) gives me an error; it says it can't find the module.

- If I use ros::spin(), everything seems to run OK, but of course a continuous data stream isn't what I need; I'm interested in a single frame.

- I don't know how to save it to a .pcd file so I can recreate it in RViz (for example); see the sketch below for what I've been attempting.
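
Here is roughly what I've been attempting for the single-frame and .pcd part (just a sketch; I'm assuming the cloud topic is /camera/depth/points, which may not match my actual driver, and I'm hand-writing the ASCII .pcd header since python-pcl won't import):

    #!/usr/bin/env python
    # Sketch: grab one PointCloud2 frame and dump it to an ASCII .pcd file.
    # Assumes the cloud topic is /camera/depth/points (adjust for the actual driver topic).
    import rospy
    from sensor_msgs.msg import PointCloud2
    import sensor_msgs.point_cloud2 as pc2

    rospy.init_node('save_one_frame')

    # Block until a single message arrives; no callbacks or ros::spin() needed
    cloud = rospy.wait_for_message('/camera/depth/points', PointCloud2, timeout=10.0)

    # Pull the finite x, y, z values out of the message
    points = list(pc2.read_points(cloud, field_names=('x', 'y', 'z'), skip_nans=True))

    # Write a minimal ASCII .pcd (x y z as float32) that pcl_viewer can open
    with open('frame.pcd', 'w') as f:
        f.write('# .PCD v0.7 - Point Cloud Data file format\n')
        f.write('VERSION 0.7\nFIELDS x y z\nSIZE 4 4 4\nTYPE F F F\nCOUNT 1 1 1\n')
        f.write('WIDTH %d\nHEIGHT 1\nVIEWPOINT 0 0 0 1 0 0 0\n' % len(points))
        f.write('POINTS %d\nDATA ascii\n' % len(points))
        for x, y, z in points:
            f.write('%f %f %f\n' % (x, y, z))

    rospy.loginfo('Saved %d points to frame.pcd', len(points))

I've also seen that pcl_ros has a pointcloud_to_pcd node (rosrun pcl_ros pointcloud_to_pcd input:=/camera/depth/points) that dumps incoming clouds to .pcd files, and a pcd_to_pointcloud node that can republish a saved file as a topic for RViz - is that the right track?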

MeitavKleinfeld
