
Making the AR.Drone do Line Following

Introduction

In this tutorial we will scratch the surface of computer vision and robotics. In the previous tutorials you controlled the AR.Drone with the keyboard and the joystick, but autonomy did not come into play. This tutorial opens the door to machine learning, control systems, and autonomous aerial vehicles.

Goal

The purpose of this tutorial is to implement basic computer vision algorithms that give the AR.Drone some autonomy. The bottom camera of the AR.Drone is used to track a line and keep the drone on course. I used a proportional controller to keep things simple; you can build on it from here to do more complex things. All of the code developed in this tutorial is for basic autonomy of the quad-rotor AR.Drone.
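The control idea is a proportional (P) controller: the vision node reports how far the line's midpoint is from the center of the image, and the drone sideslips toward the line at a speed proportional to that offset. Here is a minimal sketch in Python; the gain 0.0011 matches the one used in linefollow.py later in this tutorial, and the function name is only illustrative:

def lateral_command(offset_px, kp=0.0011):
    # Proportional control: sideways velocity proportional to the line's
    # horizontal offset (in pixels) from the image center.
    return kp * offset_px

# Example: line 30 px left of center -> gentle correction of 0.033
print lateral_command(30)

If the drone oscillates around the line instead of settling onto it, lowering the gain is the usual first fix.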

Add the line following module

In order to continue with this tutorial, make sure you have finished the previous one.

Step 1: Open up a new terminal and enter:

roscd drone_teleop

Step 2: Go to the folder named “bin”, create a file called “linefollow.py”, and add the following code to it:

linefollow.py
#!/usr/bin/env python
import roslib; roslib.load_manifest('drone_teleop')
import rospy
from geometry_msgs.msg import Twist
from std_msgs.msg import Empty
from turtlesim.msg import Velocity
from turtlesim.msg import Pose
from sensor_msgs.msg import Joy
 
 
import sys, select, termios, tty
import time
 
xpos = 0   # line angle reported by the detector
ypos = 0   # lateral offset of the line from the image center (pixels)
xdis = 0   # wall distance (from /drone/walldis)
joy_axes = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
joy_buttons = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
# keyboard bindings carried over from the teleop tutorial (unused in joystick mode)
move_bindings = {
        68: ('linear', 'y', 0.1),   # left
        67: ('linear', 'y', -0.1),  # right
        65: ('linear', 'x', 0.1),   # forward
        66: ('linear', 'x', -0.1),  # back
        'w': ('linear', 'z', 0.1),
        's': ('linear', 'z', -0.1),
        'a': ('angular', 'z', 1),
        'd': ('angular', 'z', -1),
        }
def callback(RecMsg):
    # /drocanny/vanishing_points: linear = line angle, angular = lateral offset
    global xpos
    global ypos
    xpos = RecMsg.linear
    ypos = RecMsg.angular

def callback1(laserp):
    global xdis
    xdis = laserp.x

def callback2(joystick_cmd):
    global joy_axes
    global joy_buttons
    joy_axes = joystick_cmd.axes
    joy_buttons = joystick_cmd.buttons
 
def turnleft():
    twist.angular.z = -1
    pub.publish(twist)
    time.sleep(1)
    twist.angular.z = 0
    pub.publish(twist)

def turnright():
    hover()
    twist.angular.z = 1
    pub.publish(twist)
    time.sleep(1)
    twist.angular.z = 0
    pub.publish(twist)

def getKey():
    # read a single keypress without waiting for Enter
    tty.setraw(sys.stdin.fileno())
    select.select([sys.stdin], [], [], 0)
    key = sys.stdin.read(1)
    termios.tcsetattr(sys.stdin, termios.TCSADRAIN, settings)
    return key

def hover():
    # zero all velocities so the drone holds its position
    twist.linear.x = 0; twist.linear.y = 0; twist.linear.z = 0
    twist.angular.x = 0; twist.angular.y = 0; twist.angular.z = 0
    pub.publish(twist)

def motion():
    # manual joystick control
    twist.linear.x = 0.1 * joy_axes[5]   # pitch: forward(+)/backward(-)
    twist.linear.y = 0.1 * joy_axes[2]   # roll: left(+1)/right(-1)
    twist.linear.z = 0.1 * joy_axes[3]   # altitude: up(+)/down(-)
    twist.angular.x = 0
    twist.angular.y = 0
    twist.angular.z = 0.1 * joy_axes[4]  # yaw: left(+)/right(-)
    pub.publish(twist)
if __name__=="__main__":
    settings = termios.tcgetattr(sys.stdin)
    # initialize the node before creating publishers and subscribers
    rospy.init_node('drone_teleop')
    print 'Buttons: 1 = takeoff, 2 = land, 3 = toggle camera, 4 = reset+land, 8 = toggle line following'

    pub = rospy.Publisher('cmd_vel', Twist)
    land_pub = rospy.Publisher('/ardrone/land', Empty)
    reset_pub = rospy.Publisher('/ardrone/reset', Empty)
    toggle_pub = rospy.Publisher('/ardrone/togglecam', Empty)
    takeoff_pub = rospy.Publisher('/ardrone/takeoff', Empty)
    rospy.Subscriber('/drocanny/vanishing_points', Velocity, callback)
    rospy.Subscriber('/drone/walldis', Pose, callback1)
    rospy.Subscriber('/joy', Joy, callback2)
    twist = Twist()
 
    try:
        while True:
            buttonvalue = joy_buttons
            if buttonvalue[1] == 1:      # button 2: land
                land_pub.publish(Empty())
                time.sleep(0.25)
            if buttonvalue[3] == 1:      # button 4: emergency reset, then land
                reset_pub.publish(Empty())
                time.sleep(1.5)
                land_pub.publish(Empty())
                time.sleep(0.25)
            if buttonvalue[0] == 1:      # button 1: take off
                takeoff_pub.publish(Empty())
                time.sleep(0.25)
            if buttonvalue[2] == 1:      # button 3: toggle between cameras
                toggle_pub.publish(Empty())
                time.sleep(0.25)
            if buttonvalue[7] == 1:      # button 8: enter autonomous line following
                time.sleep(0.25)         # debounce the button press
                while True:
                    time.sleep(0.015)
                    twist.linear.y = 0.0011 * ypos  # proportional correction toward the line
                    twist.linear.x = 0.1            # constant forward speed
                    print twist.linear.y
                    pub.publish(twist)
                    time.sleep(0.015)
                    twist.linear.x = 0.0
                    pub.publish(twist)
                    # re-read the live joystick state; button 8 again exits to manual mode
                    if joy_buttons[7] == 1:
                        print 'quit'
                        break
                    time.sleep(0.25)

            motion()
 
    except Exception as e:
        print e
        print repr(e)

    finally:
        # stop the drone and restore the terminal before exiting
        twist = Twist()
        twist.linear.x = 0; twist.linear.y = 0; twist.linear.z = 0
        twist.angular.x = 0; twist.angular.y = 0; twist.angular.z = 0
        pub.publish(twist)
        termios.tcsetattr(sys.stdin, termios.TCSADRAIN, settings)

Save and close the file.

Step 3: Go to the bin folder in drone_teleop, right-click the file “linefollow.py”, open the Permissions tab, and check the box “Allow executing file as a program”.
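Alternatively, the same permission can be set from a terminal (run inside the bin folder):

chmod +x linefollow.py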

Step 4: Go to the folder named “src” in the package “roscv” (which you created in the previous tutorials) and create a file called “linefollower.cpp”.

Step 5: Paste the following code into the empty file, then save it.

linefollower.cpp
#include <ros/ros.h>
#include <stdio.h>
#include <iostream>
#include "std_msgs/String.h"
#include <image_transport/image_transport.h>
#include <cv_bridge/cv_bridge.h>
#include <sensor_msgs/image_encodings.h>
#include <opencv2/imgproc/imgproc.hpp>     //make sure to include the relevant headerfiles
#include <opencv2/highgui/highgui.hpp>
#include <opencv/cv.h>
#include <opencv/highgui.h>
#include <cv_bridge/CvBridge.h>
#include <cvaux.h>
#include <math.h>
#include <cxcore.h>
#include "turtlesim/Velocity.h"
 
/* A simple program demonstrating the use of ROS and OpenCV to process the
   video stream published as an image topic by the monocular camera of a
   robot, in this case an AR.Drone quad-rotor. */
using namespace std;
using namespace cv;
namespace enc = sensor_msgs::image_encodings;
static const char WINDOW[] = "Image window";
 
float prevVelocity_angular, prevVelocity_linear, newVelocity_angular, newVelocity_linear;
float derive_angular, derive_linear, dt = 0.5;
float horizontalcount;
 
class ImageConverter
{
  ros::NodeHandle nh_;
  ros::NodeHandle n;
  ros::Publisher pub;                      // publishes line angle/offset for linefollow.py
  ros::Publisher tog;
  image_transport::ImageTransport it_;
  image_transport::Subscriber image_sub_;  // subscribes to the drone's raw image stream
  image_transport::Publisher image_pub_;   // republishes the processed image
  std_msgs::String msg;
public:
  ImageConverter()
    : it_(nh_)
  {
    pub = n.advertise<turtlesim::Velocity>("/drocanny/vanishing_points", 500);
    image_sub_ = it_.subscribe("/ardrone/image_raw", 1, &ImageConverter::imageCb, this);
    image_pub_ = it_.advertise("/arcv/Image", 1);
  }
 
  ~ImageConverter()
  {
    cv::destroyWindow(WINDOW);
  }
 
  void imageCb(const sensor_msgs::ImageConstPtr& msg)
  {
    sensor_msgs::CvBridge bridge;  // converts between ROS images and OpenCV images
    IplImage* img = bridge.imgMsgToCv(msg, "rgb8");
    turtlesim::Velocity velMsg;
    CvMemStorage* storage = cvCreateMemStorage(0);
    CvSeq* lines = 0;
    int i;
    float xv, yv;   // direction vector of a detected line segment
    float buf;      // horizontal midpoint of a detected line segment
 
    // restrict processing to a 170x140 region of the camera image
    cvSetImageROI(img, cvRect(0, 0, 170, 140));
    IplImage* out1 = cvCreateImage(cvGetSize(img), IPL_DEPTH_8U, 3);
    IplImage* gray_out = cvCreateImage(cvGetSize(img), IPL_DEPTH_8U, 1);
    IplImage* canny_out = cvCreateImage(cvGetSize(img), IPL_DEPTH_8U, 1);
    IplImage* gray_out1 = cvCreateImage(cvGetSize(img), IPL_DEPTH_8U, 3);
 
    // smooth, convert to grayscale, and run the Canny edge detector
    cvSmooth(img, out1, CV_GAUSSIAN, 11, 11);
    cvCvtColor(out1, gray_out, CV_RGB2GRAY);
    cvCanny(gray_out, canny_out, 50, 125, 3);
    cvCvtColor(canny_out, gray_out1, CV_GRAY2BGR);
 
    // probabilistic Hough transform: extract line segments from the edge map
    lines = cvHoughLines2(canny_out, storage, CV_HOUGH_PROBABILISTIC,
                          1, CV_PI/180, 80, 50, 10);
    for (i = 0; i < lines->total; i++)
    {
      CvPoint* line = (CvPoint*)cvGetSeqElem(lines, i);
 
      // draw the detected segment on both output images
      cvLine(out1, line[0], line[1], CV_RGB(0,255,0), 1, 8);
      cvLine(gray_out1, line[0], line[1], CV_RGB(0,255,0), 2, 8);
 
      // angle of the segment in degrees, measured from the vertical
      xv = line[0].x - line[1].x;
      yv = line[0].y - line[1].y;
      velMsg.linear = atan2(xv, yv) * 180 / 3.14159265;
      if (velMsg.linear < -90)
        velMsg.linear = velMsg.linear + 180;
 
      // lateral offset: distance of the segment midpoint from x = 85,
      // the horizontal center of the 170-pixel-wide ROI
      buf = (line[0].x + line[1].x) / 2;
      if (fabs(85 - buf) <= 15)
        velMsg.angular = 0;   // within the dead band: no correction needed
      else
        velMsg.angular = 85 - buf;
 
      // ignore implausibly large offsets
      if (fabs(velMsg.angular) > 50)
        velMsg.angular = 0;
 
      printf("\nX::Y::X1::Y1::%d:%d:%d:%d", line[0].x, line[0].y, line[1].x, line[1].y);
      pub.publish(velMsg);
    }
    cvShowImage("OUT1", out1);            // detected lines over the camera image
    cvShowImage("GRAY_OUT1", gray_out1);  // detected lines over the edge map
    cv::waitKey(3);
 
    // convert back to a ROS image and publish it
    sensor_msgs::ImagePtr out = sensor_msgs::CvBridge::cvToImgMsg(img, "rgb8");
    image_pub_.publish(out);
 
    // release per-frame resources to avoid leaking memory on every callback
    cvReleaseImage(&out1);
    cvReleaseImage(&gray_out);
    cvReleaseImage(&canny_out);
    cvReleaseImage(&gray_out1);
    cvClearMemStorage(storage);
    cvReleaseMemStorage(&storage);
  }
 
};
 
int main(int argc, char** argv)
{
  ros::init(argc, argv, "image_converter");
  ImageConverter ic;
  ros::spin();
  return 0;
}
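Once all the nodes are running (see Step 4 below), you can verify that this detector is publishing by echoing its topic in another terminal:

rostopic echo /drocanny/vanishing_points

Each message carries the detected line's angle in the linear field and its lateral pixel offset in the angular field, exactly as computed in the callback above.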

Make changes in the CMakeLists.txt

Step 1: Open the file named “CMakeLists.txt” and search for the lines beginning with “rosbuild_add_executable”. Add the file you created, “linefollower.cpp”, there. Save and close the file.
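For example, if your executable target is named roscv (the name used with rosrun later in this tutorial), the line would look something like this:

rosbuild_add_executable(roscv src/linefollower.cpp)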

Environment setup

Step 1: Find a plain floor and lay tape on it to form a straight line (it does not have to be perfectly straight). You can do the same on a tiled floor. Here are some sample pictures of such an environment.

line1.jpg

line2.jpg

line3.jpg

Step 2: After you have set up the line, run and test the program. Make sure you have done the following before you fly (a quick joystick check follows the list):

1. Interface the drone with the laptop. 2. Make sure the joystick (Logitech dual controller) is connected to the computer.
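To confirm that Linux detects the joystick before launching the ROS nodes, you can list the joystick device nodes (the device number may differ on your machine):

ls /dev/input/js*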

Step 3:

Open up a terminal and enter the commands one after the other:

roscd roscv
cmake .
rosmake

Step 4:

It's time to fly and follow the line. Enter the following commands, each in a new terminal:

roscore
rosrun ardrone_brown ardrone_driver
rosrun joy joy_node
rosrun roscv roscv
rosrun drone_teleop linefollow.py
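Once everything is up, you can confirm that all the nodes are running with:

rosnode list

You should see the driver, the joystick node, the image_converter node built from roscv, and the drone_teleop node in the list.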

Step 5:

With the environment set up and the line on the floor, press button 1 on the joystick to take off. Place the drone directly above the line and hit button 8.

Now the drone is in autonomous mode; your joystick controls will have no effect. The drone should be following the line. To regain control, hit button 8 again and it will return to manual mode.

Video

