This page is used as a log for the [[drexel_duct_navigator|Ventilation Ducting Navigator]] project.

===== Log =====

**August 27, 2013**\\
Successfully got Hector SLAM to work without odometry on the laptop. Installed the Hokuyo node on the Raspberry Pi.

**August 26, 2013**\\
Attempting to get broadcasters to work. Found sources [[https://github.com/ros/geometry/tree/fuerte_devel|here]]. The broadcaster tutorial would not link the header files; the solution was to add the dependency to the package manifest. The broadcaster tutorial was then successfully completed.

**August 25, 2013**\\
Successfully installed and ran Hector SLAM on ROS Fuerte. However, I discovered that I would need to create transform broadcasters to link the lidar reference frame to the vehicle reference frame. Hector SLAM can operate without an IMU for two-dimensional mapping, so I plan to do that initially. This means that all I need to do is add a static transform broadcaster linking the laser frame to the "base_link" frame. I plan to do it similarly to this [[http://dl.dropboxusercontent.com/u/32191086/Public%20Media/Selection_001.png|tf diagram]] (a minimal broadcaster sketch is included at the bottom of this page).

**August 16, 2013**\\
Successfully interfaced the Hokuyo with a serial terminal on a Mac. Response time was much better, and the output was accurate as far as I could tell without being able to plot it. Discovered C libraries from Hokuyo with Linux support for transmitting and parsing the data. Installed Linux on a laptop so I could begin working with the aforementioned libraries and find a good Linux compiler.

**August 15, 2013**\\
Problems had been encountered before with Arduino slowdowns when there was too much draw on the Arduino's power. So, one DC power supply was used to power the Hokuyo and another was used to power the Arduino and host shield. This did not seem to have any effect. Attempted isolating the Arduino USB host shield's power from the Hokuyo using a powered USB hub, in an attempt to reduce current draw on the host shield's USB port. This had no effect. Attempted to see whether the slowdown was in software by using Serial.print() statements. It seems like the entire Arduino loop() function slows down after the USB host shield is activated; it is not known why. I am considering using a Raspberry Pi to operate the Hokuyo and process the SLAM data. It has built-in USB ports, significantly more processing power, and more memory for data storage. In addition, it runs a desktop version of Linux. Dr. Zhang was able to successfully run the Hokuyo on a Linux platform already.

**August 14, 2013**\\
Attempted to solve the slow processing by changing settings on the Hokuyo, such as the serial baud rate. This had no effect. Looked through the Arduino code for possible causes of the slowdown; nothing was found. Contacted Paresh and Dr. Zhang to ask for assistance. Neither had any advice.

**August 13, 2013**\\
Attempted to read data from the lidar using an Arduino Mega and a USB host shield. [[http://www.madajimmy.com/artikel/tutorial/59-interfacing-lidar-hokuyo-urg-04lx-ug01-to-arduino-with-usb-host-shield.html|This page]] proved useful in setting it up. The lidar was powered externally and its USB cord was plugged into the USB host shield. The ACM terminal demo sketch from [[https://github.com/felis/USB_Host_Shield_2.0|the USB host shield library]] was then uploaded to the Arduino, and the Arduino IDE serial terminal was used to send commands to the lidar. The command ''SCIP2.0'' was sent to switch the lidar to SCIP 2.0 mode, and ''VV'' was then sent to acquire version information. Both commands produced the proper responses. However, the responses took a long time (~30 sec) to arrive.
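For reference, below is a minimal sketch of the same SCIP 2.0 exchange performed from a Linux (or Mac) machine rather than through the Arduino host shield, which is the route explored in the August 15-16 entries above. It assumes the Hokuyo enumerates as a CDC-ACM serial device at ''/dev/ttyACM0''; the device path, baud rate, and timeouts are assumptions, not measured values.

<code cpp>
// Minimal SCIP 2.0 exchange with the Hokuyo URG-04LX over its USB CDC-ACM port.
// Assumes the sensor shows up as /dev/ttyACM0 (adjust for the actual system).
#include <cstdio>
#include <cstring>
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

static void send_cmd(int fd, const char *cmd) {
    write(fd, cmd, strlen(cmd));      // SCIP commands are terminated with a line feed
    usleep(200000);                   // give the sensor time to reply
    char buf[512];
    ssize_t n = read(fd, buf, sizeof(buf) - 1);
    if (n > 0) {
        buf[n] = '\0';
        printf("%s", buf);            // echo the raw SCIP response
    }
}

int main() {
    int fd = open("/dev/ttyACM0", O_RDWR | O_NOCTTY);
    if (fd < 0) { perror("open"); return 1; }

    termios tio{};
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);                  // raw 8N1, no echo or line editing
    cfsetispeed(&tio, B115200);       // baud is nominal on the USB link,
    cfsetospeed(&tio, B115200);       // but termios still wants a value
    tio.c_cc[VMIN] = 0;
    tio.c_cc[VTIME] = 10;             // 1 second read timeout
    tcsetattr(fd, TCSANOW, &tio);

    send_cmd(fd, "SCIP2.0\n");        // switch the lidar into SCIP 2.0 mode
    send_cmd(fd, "VV\n");             // request version information
    close(fd);
    return 0;
}
</code>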
**August 12, 2013**\\
Acquired the Hokuyo lidar. Installed the Hokuyo Windows data viewing application from [[http://www.hokuyo-aut.jp/02sensor/07scanner/download/products/urg-04lx/data/UrgBenri.htm|www.hokuyo-aut.jp]]. Initially I was unsuccessful at acquiring data from the lidar over USB. Drivers were found at [[http://www.hokuyo-aut.jp/02sensor/07scanner/download/products/urg-04lx/|www.hokuyo-aut.jp]].

**August 11, 2013**\\
Tested and debugged 3D movement on the ductwork assembly. Captured video to use in the write-up.

**August 9, 2013**\\
Added the ability to measure z-direction movement and did some preliminary testing. Movement is computed by taking the mouse's x and y displacement in its own reference frame and rotating it into the global reference frame.

**August 8, 2013**\\
IMU installation and configuration: an MPU-6050 was ordered from amazon.com and wired on a breadboard. A [[http://www.i2cdevlib.com/devices/mpu6050#source|library]] was found to accompany it, and was used to read out orientation data in the form of yaw, pitch, and roll. Attempted to upload the sketch to the board used to process motion; however, that board's bootloader seems to be corrupt, so a replacement Arduino Mega was found. The IMU code has now been integrated with the optical mouse encoder code. At the moment only 2D mapping is implemented, to test the concept. The vehicle was driven around the Bossone atrium and back to its initial starting point. When it returned, the readings were within 50 cm of the initial reading, even with this rudimentary test.

{{youtube>sQ9ik5Nwy3o?large}}\\

**August 7, 2013**\\
Began exploring the viability of using an optical mouse as an encoder. The mouse was attached to an Arduino Mega using a SainSmart USB host shield, and a library was downloaded from [[https://github.com/felis/USB_Host_Shield_2.0|the USB host shield GitHub page]]. Several mice were then tested: an unknown-brand gaming mouse, a Corsair gaming mouse, and an Apple Mighty Mouse. Though the gaming mice had high precision, their output was non-linear with velocity. All three mice worked on a desktop, tile floor, carpet, and sheet metal. The Apple mouse was selected because of its linear output and its ability to function on sheet metal.

{{dylanw:optical mouse encoder setup.jpg}}\\

The mouse output was then calibrated to meters and summed to show absolute position. The mouse was moved around a flat surface in various ways, without rotation, and then moved back to the initial position. The error in the recovered position was consistently within +/- 1 cm in the x and y directions over approximately a 1 meter movement. The optical mouse also has the advantage of being able to measure odometry even when the vehicle is slipping. The plan is to use two mice at a fixed distance and orientation relative to each other in order to measure rotation. Alternatively, an IMU could be used to measure rotation and cross-check the odometry data. The mouse was then mounted on an adjustable-height bracket so that it always maintains contact with the ground.

{{dylanw:mouse attached to chassis.jpg}}{{dylanw:mouse under chassis.jpg}}\\

A dedicated Arduino was used to integrate the position data. This Arduino sends its data to the main Arduino using serial communication. Preliminary tests seemed promising as long as the vehicle moves at low speed on tile, carpet, or sheet metal.

{{youtube>lSVsTbBPnQQ?large}}\\

**August 6, 2013**\\
Began exploring the viability of using the internal counter registers on the Arduino Mega. It is hoped that this will allow odometry to be measured completely in hardware on the Arduino, freeing up processing power. A minimal sketch of this approach is shown below.
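A minimal sketch of the hardware-counter idea, assuming the encoder's pulse output is wired to the ATmega2560's T5 external clock input (digital pin 47 on the Mega); Timer/Counter 5 then counts the pulses entirely in hardware, and the main loop only has to read the count register. The pin choice, sampling interval, and calibration constant below are placeholders, not the final values.

<code cpp>
// Count encoder pulses in hardware with Timer/Counter 5 on the Arduino Mega 2560.
// The encoder signal is assumed to be wired to the T5 external clock input
// (PL2, i.e. digital pin 47 on the Mega).

const float METERS_PER_COUNT = 0.001f;            // placeholder calibration constant

void setup() {
  Serial.begin(115200);
  pinMode(47, INPUT);                             // T5 external clock input

  TCCR5A = 0;                                     // normal counting mode
  TCCR5B = _BV(CS52) | _BV(CS51) | _BV(CS50);     // clock Timer5 from T5, rising edge
  TCNT5 = 0;                                      // start the count at zero
}

void loop() {
  static unsigned long last = 0;
  unsigned long now = millis();

  if (now - last >= 100) {                        // sample every 100 ms
    uint16_t counts = TCNT5;                      // read the hardware counter
    TCNT5 = 0;                                    // reset for the next interval

    float distance = counts * METERS_PER_COUNT;             // distance this interval
    float velocity = distance / ((now - last) / 1000.0f);   // m/s

    Serial.print("counts: ");    Serial.print(counts);
    Serial.print("  v (m/s): "); Serial.println(velocity);
    last = now;
  }
}
</code>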
Encoder output:

{{dylanw:encoder scope.jpg}}\\

Calibrated the velocity and distance measurements from the encoder to meters. A velocity scale of 0.26961 seems to yield measurements well within the uncertainty:

^ Velocity scale ^ Encoder distance (m) ^ Actual distance (m) ^ Difference (m) ^
| 0.258   | 4.74 | 4.82 |       |
| 0.258   | 6.10 | 6.49 |       |
| 0.258   | 5.78 | 6.04 |       |
| 0.26961 | 6.15 | 6.10 | 0.05  |
| 0.26961 | 6.04 | 6.10 | -0.06 |
| 0.26961 | 6.18 | 6.19 | -0.01 |
| 0.26961 | 5.93 | 5.94 | -0.01 |

This yields a standard deviation of +/- 5 cm over an approximately 6 meter linear course.

===== Previous videos =====

Vehicle operating in ductwork\\
{{youtube>tU2zyrExpes?large}}\\

Onboard footage in ductwork\\
{{youtube>Y_MsZHUyhZg?large}}\\
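===== Transform broadcaster sketch =====

Referring back to the August 25 entry: below is a minimal sketch of the planned static transform broadcaster linking the laser frame to the "base_link" frame, written against the ROS Fuerte tf API. The node name, frame names, and the zero translation/rotation are placeholders to be replaced with the lidar's actual mounting offset.

<code cpp>
// Minimal broadcaster that repeatedly publishes a fixed base_link -> laser transform.
// Node and frame names, and the (zero) offset, are placeholders.
#include <ros/ros.h>
#include <tf/transform_broadcaster.h>

int main(int argc, char** argv) {
  ros::init(argc, argv, "laser_tf_broadcaster");
  ros::NodeHandle node;

  tf::TransformBroadcaster broadcaster;
  ros::Rate rate(20.0);                                        // re-send at 20 Hz

  while (node.ok()) {
    tf::Transform transform;
    transform.setOrigin(tf::Vector3(0.0, 0.0, 0.0));           // laser offset from base_link
    transform.setRotation(tf::Quaternion(0.0, 0.0, 0.0, 1.0)); // identity rotation

    broadcaster.sendTransform(
        tf::StampedTransform(transform, ros::Time::now(), "base_link", "laser"));

    rate.sleep();
  }
  return 0;
}
</code>

The same transform can also be published without writing any code, e.g. ''rosrun tf static_transform_publisher 0 0 0 0 0 0 base_link laser 100'' from the command line or a launch file (offsets again placeholders).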