After over a year in the corner, I brought the project back out, and I think I found the solution. I had been looking at using RGBD-SLAM for publishing odom, but according to this answer on ROS Answers, that is overkill; instead, the author suggested using laser_scan_matcher. I'm going to give that a try.
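A minimal launch snippet for laser_scan_matcher might look like the following. The frame names and parameter choices are my assumptions and will need tuning against the actual robot:

```xml
<launch>
  <!-- laser_scan_matcher (from the scan_tools stack) estimates planar
       motion from consecutive laser scans and can publish the
       odom -> base_link transform. -->
  <node pkg="laser_scan_matcher" type="laser_scan_matcher_node"
        name="laser_scan_matcher" output="screen">
    <!-- Frame names are assumptions; match them to your TF tree. -->
    <param name="fixed_frame" value="odom"/>
    <param name="base_frame" value="base_link"/>
    <param name="publish_tf" value="true"/>
    <!-- No wheel odometry or IMU yet, so scan matching alone. -->
    <param name="use_odom" value="false"/>
    <param name="use_imu" value="false"/>
  </node>
</launch>
```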
I have slightly refined my hardware and software setup, as the available items have improved. This is not set in stone, but I'm thinking of using an Odroid XU4 as the on-board computer, with an Arduino Mega interfacing with the motors. I'm still using the Sabertooth motor controller.
Let me outline where I think this project will be headed:
- Write a Python node that communicates with the Arduino, and incorporate sufficient failsafes.
- Map the values the Python node sends to the Arduino to the velocities used in ROS (specifically cmd_vel).
- Test the Kinect to make sure that it works on the Odroid.
- Set up the transforms and set up pointcloud_to_laserscan.
- Make a package with a launch file to run all the needed nodes.
- Test everything up to here.
- Make a map.
- Set up the navigation stack.
- Have fun!
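For the transforms and pointcloud_to_laserscan step, I'm expecting something along these lines in the launch file. The Kinect mounting offsets, topic names, and height limits are placeholders, not measured values:

```xml
<launch>
  <!-- Static transform from the robot base to the Kinect. The
       x y z yaw pitch roll values are placeholders until I measure
       the actual mounting position. -->
  <node pkg="tf" type="static_transform_publisher" name="base_to_kinect"
        args="0.1 0 0.3 0 0 0 base_link camera_link 100"/>

  <!-- Flatten the Kinect point cloud into a 2D laser scan. -->
  <node pkg="pointcloud_to_laserscan" type="pointcloud_to_laserscan_node"
        name="pointcloud_to_laserscan">
    <remap from="cloud_in" to="/camera/depth/points"/>
    <param name="target_frame" value="base_link"/>
    <param name="min_height" value="0.0"/>
    <param name="max_height" value="1.0"/>
  </node>
</launch>
```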
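As a first sketch of the cmd_vel mapping step above, here is the differential-drive math I expect the Python node to do. The track width, the maximum speed, and the Sabertooth simplified-serial byte ranges (1-64-127 for motor 1, 128-192-255 for motor 2, with 64/192 as stop) are assumptions I still need to verify against my hardware and the Sabertooth manual:

```python
# Sketch: map a ROS cmd_vel twist (linear.x, angular.z) to Sabertooth
# simplified-serial motor bytes. All constants below are assumptions.

TRACK_WIDTH = 0.40   # metres between wheel centres (assumed)
MAX_SPEED = 1.0      # wheel speed (m/s) that maps to full throttle (assumed)

def cmd_vel_to_sabertooth(linear_x, angular_z):
    """Return (motor1_byte, motor2_byte) for the Sabertooth.

    Simplified serial (assumed): motor 1 uses 1/64/127 for
    full-reverse/stop/full-forward, motor 2 uses 128/192/255.
    """
    # Differential-drive kinematics: wheel speeds from the body twist.
    left = linear_x - angular_z * TRACK_WIDTH / 2.0
    right = linear_x + angular_z * TRACK_WIDTH / 2.0

    def to_byte(speed, centre):
        # Clamp to [-MAX_SPEED, MAX_SPEED], then scale to +/-63
        # around the motor's stop byte.
        speed = max(-MAX_SPEED, min(MAX_SPEED, speed))
        return int(round(centre + 63 * speed / MAX_SPEED))

    return to_byte(left, 64), to_byte(right, 192)
```

In the real node this function would sit inside a subscriber callback on cmd_vel, with the bytes written to the Arduino over serial, plus a watchdog failsafe that sends the stop bytes (64, 192) if no command arrives for a few hundred milliseconds.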
I need your help! Your interest as a reader is what keeps me motivated to continue. If you would like to help with the software, feel free to comment and I'll get back to you. Thank you for your continued support, and (I know, it's cheesy) remember to follow and comment.
Edit: I forgot to include the first and most basic step: setting up the Arduino to run the robot base. That will be the topic of discussion for my first post.