Use the Neato Botvac with ROS

Neato’s Botvacs are not only great at keeping your home clean, they are also an affordable robotics platform compatible with the Robot Operating System (ROS). To use your Neato with ROS, first install ROS, then download and install my fork of the neato_robot package:
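
For example (a sketch, assuming a catkin workspace at ~/catkin_ws; adjust the path to your setup):

cd ~/catkin_ws/src
git clone https://github.com/jmtatsch/neato_robot.git
cd ~/catkin_ws
catkin_make
source devel/setup.bash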

To use my Neato wifi hack with the ROS package neato_robot, the telnet access has to be bridged back to a serial device with socat:
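
For example (a sketch; the router’s IP address, the telnet port, and the pty path /dev/neato are assumptions and have to match your wifi hack setup):

sudo socat PTY,link=/dev/neato,raw,echo=0 TCP:192.168.1.1:23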

Now you can launch the necessary ROS nodes with roslaunch:
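
For example (a sketch; the exact launch file name may differ in my fork, and the serial port configured in it must match the pty created by socat above):

roslaunch neato_node bringup.launch

For keyboard control, additionally start a teleop node in a second terminal, e.g. the standard teleop_twist_keyboard:

rosrun teleop_twist_keyboard teleop_twist_keyboard.py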

While the console window is active, you should be able to move your robot around with the following keys:
u i o
j k l
m , .

q/z : increase/decrease max speeds by 10%
w/x : increase/decrease only linear speed by 10%
e/c : increase/decrease only angular speed by 10%
anything else : stop

CTRL-C to quit
Watch out, it can be surprisingly fast 😉

Neato ROS

Alternatively, you can command your Neato via rviz to go to a nav goal…
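
A nav goal can also be published from the command line (a sketch, assuming the navigation stack with move_base is running and a map frame exists):

rostopic pub -1 /move_base_simple/goal geometry_msgs/PoseStamped '{header: {frame_id: "map"}, pose: {position: {x: 1.0, y: 0.5, z: 0.0}, orientation: {w: 1.0}}}'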

  • Krst _

    Hi, I am trying to clone your neato robot node from GitHub, but I get a message that I don't have permission to access the repository when trying to clone it. Have you moved it by any chance? Thanks for the nice tutorial!

    /Krst

    • jmtatsch

      For cloning over SSH you need a GitHub account. You should be able to clone without an account by cloning over HTTPS: git clone https://github.com/jmtatsch/neato_robot.git

      • Krst _

        That solved it, thank you!

        I assume you run it on a Mac? Running it on 14.04, I found a small "issue" with your code after downloading it from GitHub: in the folder neato_robot/neato_node/msg/ the Button.msg file name starts with a lowercase b, which causes problems when compiling with catkin. I had to change it to Button.msg… but perhaps this is a simple setting? I am just a beginner! 🙂

        If I may ask, how does one activate the navigation mode in rviz after creating a map with ROS, and which of the nodes does one use with gmapping? I have only played around with Roomba TurtleBot tutorials some years ago, so I am not too used to ROS, to be honest! Scanning and showing maps in rviz already works very well with the bringup all launch, so I can get it up and running.

        In general, how do you find the odometry on the Botvac? Does it need calibration, or have you not changed anything? I had issues with my Roomba in the past, which is why I ask…

        I am running ROS on a Botvac 75; the next step is to use a Raspberry Pi 2 to connect to it…

        Many thanks again for publishing this guide and updating the nodes to work with the Botvacs…

        • jmtatsch

          Yes, I run ROS mostly on a Mac. I fixed the Button.msg issue; it was already fixed locally, but git doesn't seem to recognize case changes in file names as a change. What do you mean by navigation mode in rviz? If you want to command your robot in rviz, set an initial 2D pose estimate and a 2D nav goal in the top menu bar. There is an excellent tutorial on how to use the navigation stack here: http://wiki.ros.org/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack
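
          For reference, a two-step rename makes git record the case change even on a case-insensitive filesystem such as the macOS default (a sketch, using the file names from the repository):

          git mv button.msg tmp.msg
          git mv tmp.msg Button.msg
          git commit -m "Rename button.msg to Button.msg"
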
          Concerning the odometry, I have not had the time to calibrate it yet, so it is really off. If you make progress, let me know.
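
          Regarding which node to use with gmapping: once the driver is publishing laser scans, a map can be built with the slam_gmapping node (a sketch; the scan topic name is an assumption and has to match what the driver publishes):

          rosrun gmapping slam_gmapping scan:=base_scan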

          • Krst _

            OK, thanks. I am a beginner, but I will try to see if I can use the TurtleBot calibration method for the Neato as well. When trying to calibrate the "manual" way, I realised that the TF of the base_link is rotated by 180 deg (3.14 rad) in neato.py. As far as I can see, this is related to the scan angles from the lidar being rotated by 180 deg, from -3.26 to +3.26. In the "original" neato.py file from Ferguson for the XV-11, the scan angles are 0 to 6.26, which is equivalent to 360 deg. When I try to set up the scan angles in the same way now in Indigo with your drivers, I get the error message that scan.angle_min must be equal to scan.angle_max, which is why I can't render my map in rviz, although I can see the movements; so this seems to be what syncs the Neato lidar with the odometry. Is there a special explanation for why you have done it this way?

            scan.angle_min = -3.26
            scan.angle_max = +3.26

            Also, I have seen that in the urdf file 90 deg was used instead of 1.57 rad, which throws the Neato model off in rviz.
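
            For illustration: URDF interprets rpy values as radians, so a 90 deg rotation has to be written roughly like this (the values are only an example):

            <origin xyz="0 0 0" rpy="0 0 1.5708"/>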

            /Krst

          • jmtatsch

            How did you fix the rotated base_link? I changed scan.angle_min/max to get it running somehow. Not sure if it was the right thing to do.

          • Krst _

            I never got far enough to try the transform of the base_link. It took me a while to understand why the robot was totally off in the odometry until I realised the offset in the angle 🙂 Reading the "manuals", it is recommended to do it as you did: having the negative and positive angle be the same is the preferable way for gmapping, rather than starting at zero (although I see that this has been done in the past); one has to rotate the base_link to make it align with the odom instead.

            Did you manage to solve this? I notice that there are updates to the hydro branch on GitHub.

            When I look at your indigo launch-all file, would it not be possible to change the TF transform from laser to base in order to rotate it? Have you tried that?

            static_transform_publisher x y z yaw pitch roll frame_id child_frame_id period_in_ms
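
            In a launch file, that could look roughly like this (a sketch; the frame names are assumptions and must match the frames the driver publishes):

            <node pkg="tf" type="static_transform_publisher" name="base_to_laser" args="0 0 0 3.14159 0 0 base_link base_laser_link 100"/>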

            /kst

          • Krst _

            Hi again

            Rotating the TF transform in the bringup launch file appears to solve the issue with the laser rotation relative to the odometry: rotate the base-to-laser link by the same amount one had to rotate the laser for the gmapping node to make it "work", i.e. pi rad.

            I have rotated the yaw angle in the laser link to 3.14, and it seems to solve the problem, from what I have seen during some basic testing…

            /krst

          • jmtatsch

            I finally had the time to test your findings. As soon as I rotate the yaw angle by 180° via a static transform, gmapping won't integrate the laser scans anymore; no idea why. Otherwise, it tries to integrate them but fails badly, probably because the odometry is completely broken. Anyway, I changed some things to comply with the coordinate system conventions from http://www.ros.org/reps/rep-0103.html. Let me know if you make further progress…

          • Krst _

            OK, strange! When I rotate the transform to compensate for the rotation of the laser node, it works "perfectly". I am just learning git, but I managed to make a fork of your repository: https://github.com/kcneato/neato_robot.git. There are no significant changes to your code, but I am running it with an RPi 2 onboard the Botvac, which is why I configured the launch files a little differently.

  • Torben

    Hi.

    I also just got a new Neato Botvac 85, read your tutorial, and directly ordered a wifi module (hoping the MPR-A2 will do the job too…). ^^

    Now I would like to know what your Botvac, including the wifi adapter, looks like; I cannot really picture the final setup…
    Are you running the robot without the dustbin? Or is the router mounted externally? Could you perhaps upload an image of your setup?
    Is it possible to run the robot in "auto" mode with ROS as a 'listener' to visualize the internal Botvac algorithms (I hope you understand what I mean here)?
    Have you tried using Gazebo for visualization?

    Btw, thanks for your nice tutorial!

    Best regards,

    Torben


    Lubuntu 14.04 32bit
    ROS Indigo
    Hame MPR-A2

    • jmtatsch

      Hi Torben, I am currently running the robot without the dustbin, but that is not a final solution. Lurch at robotreviews (http://www.robotreviews.com/chat/viewtopic.php?f=20&t=18374#p129934) has managed to modify a micro-USB connector so it fits with the dustbin installed. I have sacrificed several micro-USB cables but couldn't get one to fit, so I guess I will just drill out a little part of the dustbin. Visualizing a cleaning run in ROS could work, but I haven't tried it so far. I also haven't visualized the robot in Gazebo, but a 3D model is on GitHub. Drop me a line if you make progress 🙂