- The controlling laptop publishes steering data (this is what you see running in the terminal at the beginning)
- The robot in turn listens to this steering topic and writes the appropriate signal to the onboard Arduino, which drives the actual motors.
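To make the steering step concrete, here's a minimal sketch of how a steering command could be turned into a byte pair for the Arduino. This is a hypothetical illustration, not the actual code on the robot: the function name, the differential-drive mixing, and the "128 = stop" serial protocol are all assumptions. In the real pipeline a rospy subscriber would call something like this and write the result to the Arduino's serial port.

```python
# Hypothetical encoding of a (linear, angular) steering command into a
# two-byte wheel-speed packet for the Arduino. The protocol here is an
# assumption for illustration: 0 = full reverse, 128 = stop, 255 = full
# forward, one byte per wheel. Kept as plain Python (no rospy) so it
# runs standalone.

def steering_to_wheel_bytes(linear, angular, max_speed=1.0):
    """Mix linear and angular velocity into left/right wheel speeds,
    then scale each into the 0-255 byte range."""
    left = linear - angular
    right = linear + angular
    # Clamp both wheels to the robot's speed range.
    left = max(-max_speed, min(max_speed, left))
    right = max(-max_speed, min(max_speed, right))

    def to_byte(v):
        return int(round(128 + 127 * v / max_speed))

    return bytes([to_byte(left), to_byte(right)])

# Stopped, full forward, and spin-in-place commands:
print(list(steering_to_wheel_bytes(0.0, 0.0)))   # [128, 128]
print(list(steering_to_wheel_bytes(1.0, 0.0)))   # [255, 255]
print(list(steering_to_wheel_bytes(0.0, 1.0)))   # [1, 255]
```

On the robot, the subscriber callback would simply write these two bytes to the serial port the Arduino listens on.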
- The robot also publishes the Kinect data over ROS topics. However, the images don't seem to get through: my network stats show the outgoing data constantly peaking at around 230KiB/s, but no actual images get displayed on the host computer. I believe this is because of the poor wifi card built into the laptop on top of the robot; I'm currently working on getting this to work.
- The controlling laptop listens to this data and, once it's working, will run it through the ROS RGB-D SLAM algorithm.
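A quick back-of-the-envelope calculation makes the wifi bottleneck plausible: an uncompressed Kinect RGB stream needs far more bandwidth than the ~230KiB/s I see leaving the robot. The resolution and frame rate below are the Kinect's standard specs, not measurements from this setup.

```python
# Rough bandwidth estimate: raw Kinect RGB stream vs. the ~230 KiB/s
# observed leaving the robot. 640x480 @ 30 fps are the Kinect's
# standard RGB specs, not values measured on this robot.

width, height = 640, 480       # Kinect RGB resolution
bytes_per_pixel = 3            # 8-bit R, G, B
fps = 30                       # Kinect frame rate

raw_rate = width * height * bytes_per_pixel * fps  # bytes per second
observed = 230 * 1024                              # ~230 KiB/s measured

print("raw stream: %.1f MiB/s" % (raw_rate / 1024.0 / 1024.0))
print("observed:   %.2f MiB/s" % (observed / 1024.0 / 1024.0))
print("shortfall:  %.0fx" % (raw_rate / float(observed)))
```

The raw stream works out to roughly 26 MiB/s, over a hundred times what the link is actually carrying, so it's no surprise whole frames never arrive; ROS's compressed image transport (or downsampling the stream) would be the obvious mitigation.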
Here's the video: