I've been experimenting with both RTAB-MAP plus visual odometry and the Intel Realsense SLAM (both with the Euclid).
It seems like RTAB-MAP has all the nice hooks and features, while using the Intel SLAM I have to roll in a lot of my own items (loading, saving, handling parameters, etc.).
The benefit of using the Intel SLAM is that (in theory) the visual odometry would be better since it incorporates IMU data.
I've also experimented with using the Intel SLAM for its visual odometry, and RTAB-MAP for mapping/SLAM.
My "feeling" (unscientific) so far is that RTAB-MAP with its built in visual odometry performs the best.
What are others using?
--Jay
Hi,
I have been trying to use RTAB-MAP on Euclid for now without success. I use Euclid alone (without a robot), hence I wanted to test RTAB-MAP's visual odometry.
Would you mind explaining how you proceeded to use RTAB-MAP with visual odometry on Euclid?
I am quite new at using ROS, Euclid, and RTAB-MAP, and I would really appreciate your help.
Best,
Sophie
Have you seen this PDF manual on using RTAB-MAP with Euclid and ROS?
Thanks for your reply. Yes, I saw it.
As I do not have a turtlebot, I tried to adapt this tutorial, but I was not able to produce a map.
I followed it until those lines, and changed them like this:
roslaunch rtabmap_ros rtabmap.launch rtabmap_args:="--delete_db_on_start" depth_topic:=/camera/depth/image_raw camera_info:=/camera/color/camera_info visual_odometry:=true odom_topic:=/odom rtabmapviz:=false cloud_floor_culling_height:=0.5
I also launched the realsense_camera nodelet (lr200m_nodelet_default.launch), which outputs depth and rgb related topics (amongst others).
But when I try to display the map with RViz, I get a warning 'no map received'.
It seems to me that RTAB-MAP needs to subscribe to some topics but cannot find them, because I get warnings from rtab-map:
[ WARN] [1491045751.097035646]: /rtabmap/rgbd_odometry: Did not receive data since 5 seconds! Make sure the input topics are published ("$ rostopic hz my_topic") and the timestamps in their header are set.
/rtabmap/rgbd_odometry subscribed to (approx sync):
/camera/rgb/image_rect_color,
/camera/depth/points/image_raw,
/camera/rgb/camera_info
[ WARN] [1491045753.847811637]: /rtabmap/rtabmapviz: Did not receive data since 5 seconds! Make sure the input topics are published ("$ rostopic hz my_topic") and the timestamps in their header are set. If topics are coming from different computers, make sure the clocks of the computers are synchronized ("ntpdate"). If topics are not published at the same rate, you could increase "queue_size" parameter (current=10).
/rtabmap/rtabmapviz subscribed to (approx sync):
/rtabmap/odom,
/camera/rgb/image_rect_color,
/camera/depth/points/image_raw,
/camera/rgb/camera_info,
/rtabmap/odom_info
I guess there is some remapping to do between what RTAB-Map expects and what the Euclid camera node outputs, but I could not figure out how to do it. Maybe by changing something in the launch file, but I don't know exactly what to do.
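I imagine the remapping would go in a launch file with something like this (just my guess; the "to" topic names are taken from what the Euclid nodelet publishes, and may not match exactly):

```xml
<!-- Guessed remaps: point rtabmap_ros's expected input topics
     at the topics the Euclid camera nodelet actually publishes -->
<node pkg="rtabmap_ros" type="rgbd_odometry" name="rgbd_odometry">
  <remap from="rgb/image"       to="/camera/color/image_raw"/>
  <remap from="depth/image"     to="/camera/depth/image_raw"/>
  <remap from="rgb/camera_info" to="/camera/color/camera_info"/>
</node>
```

But I am not sure these are the names RTAB-Map expects.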
I also tried to start RTAB-Map's GUI by typing 'rtabmap' in the terminal.
This way, I can get it working (a map starts to appear) for about a minute, but then I get the error "the camera has reached the end of stream".
Sorry if I missed something obvious.
This topic is unfortunately outside of my experience, so I'll link the author of that PDF manual, Tsahi_Mizrahi, into this discussion in the hope that they can help you with your problem.
I think you're missing the rgb_topic:
roslaunch rtabmap_ros rtabmap.launch rtabmap_args:="--delete_db_on_start" depth_topic:=/camera/depth/image_raw rgb_topic:=/camera/color/image_raw camera_info_topic:=/camera/color/camera_info
==
I followed the PDF (https://communities.intel.com/docs/DOC-112089) and built RTAB-Map from source (https://github.com/introlab/rtabmap_ros, latest source, not the Kinetic branch).
When building from source, I started at "Build from Source" step #2.
Then I did this to run RTABMAPVIZ on the Euclid (which was pretty slow):
roslaunch realsense_camera lr200m_nodelet_default.launch
roslaunch rtabmap_ros rtabmap.launch rtabmap_args:="--delete_db_on_start" depth_topic:=/camera/depth/image_raw rgb_topic:=/camera/color/image_raw camera_info_topic:=/camera/color/camera_info
When I use rviz remotely, I start RTABMAP like this:
roslaunch rtabmap_ros rtabmap.launch rtabmap_args:="--delete_db_on_start" depth_topic:=/camera/depth/image_raw rgb_topic:=/camera/color/image_raw camera_info_topic:=/camera/color/camera_info rtabmapviz:=false
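If RTAB-Map still complains about not receiving data, it's worth first confirming that the camera topics are actually being published (standard ROS rostopic commands; topic names as above):

```
# List the topics the camera nodelet is publishing
rostopic list | grep /camera

# Check that the RGB and depth streams are actually flowing
rostopic hz /camera/color/image_raw
rostopic hz /camera/depth/image_raw
```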
Hope some of this helps!
--Jay
Yes indeed, I was missing this topic.
Thanks a lot!
In the end, I just added visual_odometry:=true to the line you posted, and it worked:
roslaunch rtabmap_ros rtabmap.launch rtabmap_args:="--delete_db_on_start" depth_topic:=/camera/depth/image_raw rgb_topic:=/camera/color/image_raw camera_info_topic:=/camera/color/camera_info visual_odometry:=true
(I don't know why the "camera has reached the end of stream" error does not happen when I launch RTAB-Map this way, but I won't complain =) )
I cannot answer the question you asked in the first place, as I only tested rtab-map with its own visual odometry (and only briefly).
Using your earlier post, I might also try your solution with Intel SLAM, but that would be nothing new to you.
Did you (or others) also try to use any SLAM algorithm in combination with the ROS navigation stack, or something similar?
Sophie
Hi Jay,
So far we've been experimenting with different SLAM methods and have already uploaded guides for Gmapping, RTAB Map and ORB SLAM for Euclid, which you've probably already seen.
Our next candidate is going to be SVO 2.0.
What I didn't really like about the RTAB results was that I wasn't really able to reuse them afterwards (in terms of navigation), and that relocalization was poor: a few times I would lose localization and the map would stop updating. I do believe its dense results give the most comprehensive map, at least for humans, out of those algorithms.
As for the IMU data, I assume you mean when using it handheld? When mounted on a TurtleBot, the noise you get from the movement can overshadow the IMU results, which is why we're also working on incorporating wheel odometry into RealSense SLAM as well.
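For anyone who wants to experiment with this before we publish a guide: in ROS, one common way to fuse wheel odometry with IMU data is the robot_localization EKF. A rough sketch of a configuration (topic names and enabled fields are illustrative, not a recommendation for a specific robot):

```yaml
# Illustrative config for robot_localization's ekf_localization_node.
# Each *_config list is 15 booleans:
# x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az
frequency: 30
two_d_mode: true

odom0: /odom                          # wheel odometry (assumed topic name)
odom0_config: [false, false, false,
               false, false, false,
               true,  true,  false,   # fuse linear velocities
               false, false, true,    # fuse yaw rate
               false, false, false]

imu0: /imu/data                       # IMU (assumed topic name)
imu0_config: [false, false, false,
              true,  true,  true,     # fuse orientation
              false, false, false,
              true,  true,  true,     # fuse angular velocities
              true,  false, false]    # fuse x acceleration
```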
Regards,
Meitav
Intel Euclid Development Team
Hi Meitav,
Since my original plan was to use a robot with known poor mechanical odometry (skid steer with tracks), I was really hoping the visual odometry and SLAM (along with IMU corrections) on the Euclid would work well enough. The Intel demo videos of strapping the Euclid on a robot with cheap, non-encoder equipped motors was super encouraging.
As far as I can tell now though, that must have been a very unique experiment. I haven't been able to reliably reproduce the results in an unconstrained (and visually plain) space. I'm looking forward to the "Mapping and Navigation" project by Majd Srour. Hopefully that will set me on a better path.
I've since moved the Euclid to a platform with "reasonable" mechanical odometry and made much more progress using Gmapping, but I have some issues there too. I'll post those as new questions, since they're not quite related to this thread.
Thanks,
Jay
Hi Jay,
From my experiments, most wheeled robots produce a lot of inertial noise that overshadows measurements from our IMU.
In a handheld setup, the Intel Realsense SLAM (which is preinstalled in Euclid) should work very well.
We're still working on finding a best-known solution. Perhaps something along the lines of ORB or SVO would work well, since they do not depend on inertial data.
As far as Majd's project is concerned, he is no longer with us unfortunately, and I'm taking over that field with the SLAM projects. The project's preview shouldn't be up anymore, and I apologize for that.
SLAM is still a pretty complex problem, and algorithms such as ORB SLAM and SVO 2.0 still can't handle many unconstrained environments, but if you have any specific algorithms and setups you'd be interested in seeing on our community website, I'd be happy to work on getting them running.
Have you tried RTAB Map? I've had much more success with it than with Gmapping.
Regards,
Meitav
Intel Euclid Development Team