Revision as of 03:23, 27 April 2020
I have tested this running Ubuntu 10.10 both as a VMware virtual machine and installed on a system as the booted OS. Ubuntu is a very easy-to-use Linux release you can download from [http://www.ubuntu.com/ here]. Now let's get started! Open a terminal by clicking on '''Applications -> Accessories -> Terminal''', then enter the following commands by copying and pasting them into the terminal window. Use '''CTRL-C''' to copy and '''CTRL-SHIFT-V''' to paste into the terminal window.
<br />
sudo sh -c 'echo "deb <nowiki>http://code.ros.org/packages/ros/ubuntu</nowiki> maverick main" > /etc/apt/sources.list.d/ros-latest.list'
wget <nowiki>http://code.ros.org/packages/ros.key</nowiki> -O - | sudo apt-key add -
sudo apt-get update
sudo apt-get install ros-cturtle-base
<br />
cd; mkdir ros; cd ros
svn co <nowiki>https://brown-ros-pkg.googlecode.com/svn/tags/brown-ros-pkg/teleop_twist_keyboard</nowiki>
svn co <nowiki>http://albany-ros-pkg.googlecode.com/svn/trunk/slam_coreslam/coreslam</nowiki>
svn co <nowiki>http://albany-ros-pkg.googlecode.com/svn/trunk/neato_robot</nowiki>
echo '. /opt/ros/cturtle/setup.sh' >> ~/.bashrc
echo 'export ROS_PACKAGE_PATH=~/ros:${ROS_PACKAGE_PATH}' >> ~/.bashrc
From this point you need to create your own map using gmapping and save it, so that a map of your area is loaded when launching 2dnav_neato. gmapping must first be modified from the base install on your system to correct the reversed laser scan data; this wiki will be updated shortly with that procedure. gmapping is what allows you to create a map of your surroundings, which you will then use to navigate.
[[#top]]
==Disassembly and Reassembly==
Pictorial instructions for disassembling and reassembling an XV-11 can be found at the [http://www.hbrobotics.org/ Homebrew Robotics Club] wiki, linked off of the page [http://hbrobotics.org/wiki/index.php?title=Dave%27s_XV-11_notes Dave's XV-11 Notes]. There are a few other XV-11 notes there as well, such as how to tap into the battery supply, and I'll be adding more as I go along. I'm too lazy to update two wikis, so I've decided to make the HBRC wiki my 'home' and cross-link from here to there and there to here. Anyway, the disassembly/reassembly instructions were written because I kept forgetting where all the screws go.
Runtime V2.6.15295
OK
<nowiki> #</nowiki>testmode off 
This is after the upgrade
<br />
Runtime V2.6.15295
OK
<nowiki> #</nowiki>testmode off
==Interfacing with LIDAR Sensor==
I created a board to interface the LIDAR sensor to a PC without the rest of the XV-11. The PCB consists of a PIC18F2221, an FTDI-232R, a 3.3V regulator, and a FET for controlling the motor. The PIC watches the data from the LIDAR and uses the speed information to control the PWM pin attached to the FET. In this way the correct speed can be maintained. The firmware for the PIC currently only supports the old LIDAR firmware (that is what I have). Hopefully someone else can modify it to work with the other firmware as well. There are jumpers for configuring who talks to what between the PC, PIC, and LIDAR. There are also options for supplying your own 5V instead of getting it from USB, and an option for using an XBee for wireless communication. Schematics are posted below, as well as the Eagle .brd files. Source code for the PIC will be posted soon. I have spare boards for sale if anyone is interested: Ringo (dot) Davis (at) gmail (dot) com.
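The control loop described above (watch the speed reported in the LIDAR data stream, then trim the PWM duty cycle feeding the motor FET) can be sketched as a simple proportional controller. This is an illustration only, not the actual PIC firmware; the target RPM and gain are assumed values.

```python
TARGET_RPM = 300.0   # nominal XV-11 LIDAR spin rate (assumed)
KP = 0.0005          # proportional gain, purely illustrative

def adjust_pwm(duty: float, measured_rpm: float) -> float:
    """Return an updated PWM duty cycle (0.0-1.0) from one RPM reading.

    Each time a speed report arrives from the LIDAR stream, nudge the
    duty cycle toward the target and clamp to the valid range.
    """
    error = TARGET_RPM - measured_rpm
    duty += KP * error                 # simple proportional correction
    return min(1.0, max(0.0, duty))
```

A real firmware loop would call this once per received speed report; an integral term would likely be needed to hold the speed exactly, but the proportional form shows the idea.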
[[File:LIDAR mounted on PCB.jpg|none|thumb|533x533px|LIDAR mounter on interface board|alt=]][[File:LIDAR plugged in to Interface board.jpg|none|thumb|533x533px|LIDAR plugged in to Interface board|alt=]]
<br />
TestEncoder
Commands are case-insensitive and can be entered incompletely: getversion, getvers, getv, and GeTvErSiOn will all work alike. Be aware that at least one of these commands can brick your LDS... Here are the details of what is currently known about these commands: (to be completed!)
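The matching behavior described above (case-insensitive, abbreviations accepted) can be sketched as prefix matching against the known command names. The command list below is a partial sample, and the exact rule the firmware uses for ambiguous abbreviations is a guess.

```python
# Partial sample of LDS console commands; not the complete set.
COMMANDS = ["GetVersion", "GetTime", "TestMode", "SetMotor", "PlaySound"]

def match_command(entry: str):
    """Resolve user input to a full command name, or None.

    Lower-cases the input and accepts it if it is a prefix of exactly
    one known command (an assumption about how ambiguity is handled).
    """
    prefix = entry.strip().lower()
    hits = [c for c in COMMANDS if c.lower().startswith(prefix)]
    return hits[0] if len(hits) == 1 else None
```

Under this sketch, "getv" resolves to GetVersion, while "get" would be ambiguous between GetVersion and GetTime.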
GetVersion
Draws a squirrel in a heart, in ASCII art...
Example:
[[File:Wanderer Screen Shot.png|none|thumb|400x400px|alt=]]
<br />
**You can get free samples of these parts from [https://tycoelectronics.com/ TE’s website]
2D CAD files of the XV-11's LIDAR unit, in mm. Thanks go to '''''chenglung''''' from the [https://www.trossenrobotics.com/ Trossen Robotics] forums.
3D CAD model of the LIDAR unit. Note that I used the 2D CAD files mentioned above along with my own measurements, so be warned that the following is not completely accurate. Also, I'm no CAD professional, so you won't find much detail in the model - just enough to account for any significant design properties that may be useful to know when building the module into your own application.
*Data 0 to Data 3 are the 4 readings. Each one is 4 bytes long, organized as follows:
<br />
byte 0 : Distance 7:0
byte 1 : "invalid data" flag : "strength warning" flag : Distance 13:8
byte 2 : Signal Strength 7:0
byte 3 : Signal Strength 15:8
As [https://sites.google.com/site/chenglung/home/xv-11-open-lidar-project-matlab-script chenglung] points out, the distance information is in mm, and coded on 14 bits. This puts the tests made by Sparkfun in a room of around 3.3m x 3.9m (11ft x 13ft?), which seems reasonable.
The minimum distance is around 15cm, and the maximum distance is around 6m.
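A decoder for one 4-byte reading in the layout above can be sketched as follows. The flag bit positions (bit 7 for "invalid data", bit 6 for "strength warning") are a best-effort interpretation of the description, not a verified spec.

```python
def parse_reading(b: bytes):
    """Decode one 4-byte LIDAR reading (old-firmware layout sketch).

    Returns (distance_mm, strength, invalid, warning). The distance is
    14 bits: low 8 in byte 0, high 6 in the low bits of byte 1; the
    signal strength is 16 bits across bytes 2 and 3 (little-endian).
    """
    assert len(b) == 4
    invalid = bool(b[1] & 0x80)                # "invalid data" flag (assumed bit 7)
    warning = bool(b[1] & 0x40)                # "strength warning" flag (assumed bit 6)
    distance_mm = ((b[1] & 0x3F) << 8) | b[0]  # 14-bit distance in mm
    strength = (b[3] << 8) | b[2]              # 16-bit signal strength
    return distance_mm, strength, invalid, warning
```

For example, the bytes 2C 01 10 00 would decode to a distance of 0x12C = 300 mm with strength 16 and both flags clear.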
It is organized as follows:
<br />
5A A5 00 C0 XX XX ''data''
''data'' is composed of 360 groups of 4 bytes, organized like this:
<br />
byte 0 : Distance 7:0
byte 1 : "invalid data" flag : "quality warning" flag : Distance 13:8
byte 2 : Quality 7:0
byte 3 : Quality 15:8
As [https://sites.google.com/site/chenglung/home/xv-11-open-lidar-project-matlab-script chenglung] points out, the distance information is in mm, and coded on 13 or 14 bits. This would put the tests made by Sparkfun in a room of around 3.3m x 3.9m (11ft x 13ft?), which seems reasonable to me. 13 bits should be enough if the sensor is designed to work up to 6m. This needs some tests...
Bit 7 of byte 1 seems to indicate that the distance could not be calculated.
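Putting the frame description together, a full frame (5A A5 00 C0 header, two unknown bytes, then 360 readings of 4 bytes) could be parsed as sketched below. The bit layout follows the description above and should be treated as an interpretation to be tested, not a verified spec.

```python
def parse_frame(frame: bytes):
    """Parse one full-scan frame (new-firmware layout sketch).

    Expects the 4-byte header 5A A5 00 C0, 2 bytes of unknown meaning,
    then 360 groups of 4 bytes. Returns a list of 360 tuples
    (distance_mm, quality, invalid).
    """
    assert frame[:4] == bytes([0x5A, 0xA5, 0x00, 0xC0]), "bad header"
    assert len(frame) >= 6 + 360 * 4, "truncated frame"
    readings = []
    for i in range(360):
        off = 6 + 4 * i
        b0, b1, b2, b3 = frame[off:off + 4]
        invalid = bool(b1 & 0x80)                  # assumed flag position
        distance_mm = ((b1 & 0x3F) << 8) | b0      # 14-bit distance in mm
        quality = (b3 << 8) | b2                   # 16-bit quality
        readings.append((distance_mm, quality, invalid))
    return readings
```

One reading per degree of rotation would explain the count of 360 groups.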
That was the basic idea which led to realization of the full wireless remote control.
As you may know, you can manually control the Neato right out of the box by connecting it to any computer via USB and sending commands to the robot through any terminal program (for some reason the article describing the command list on the official Neato site is unreachable at this moment, but you can get the list by typing 'help'). The way you can control the Neato's movements is described here: [[#XV-11 API Commands]]
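Instead of typing into a terminal program, the same console commands can be built and sent from a script. The sketch below only formats a command line; the newline terminator and the space-separated argument form are assumptions about the general shape of the console protocol, and the bytes would then be written to the robot's USB serial port with any serial library.

```python
def format_command(name: str, *args) -> bytes:
    """Build one line for the Neato's USB serial console.

    Commands such as 'TestMode On' (seen above) are plain ASCII lines;
    this joins the command name and any arguments with spaces and
    terminates with a newline. Argument handling here is illustrative.
    """
    line = " ".join([name, *map(str, args)])
    return (line + "\n").encode("ascii")
```

For example, format_command("TestMode", "On") yields the bytes for the "TestMode On" line, ready to write to the serial port.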
This is really great, but how can you use it for the robot's intended purpose if you are limited by the length of the wire? Of course, you could take a laptop! But for me that was not the answer. Luckily, a friend of mine had recently brought over a compact (very compact) WiFi router (commonly available from eBay as a "2g/3g/4g wifi router", also known as the HAME MPR-A5 and MIFI-F5; the MPR-A1 and its clones are likely to work as well if you manage to fit them in, and some additional material is available at http://my-embedded.blogspot.com/2013/12/mini-4g-router-rt5350f.html) and suggested that we should try to embed it into my Neato.
