
Coding with the drone – Performing roundel tracking

Development setup

Our development environment for the drone is now properly set up, which means we can finally be efficient while programming: we write a few lines of code and test them on the drone seconds later. Getting there wasn't easy, between the drone motherboard that suddenly stopped working properly (thanks to the warranty and Parrot's good customer service, the issue was solved ten days later when we received a brand new motherboard), the wifi connection that has behaved erratically since the last Ubuntu update, and an existing API codebase that is sometimes hard to follow.

Basically, as far as the drone is concerned, we are now working with the following configuration:

  • Ubuntu 11.04 (natty) with GNOME on the computer side (Intel Core 2 Duo @2.20GHz)
  • Firmware 1.5.1 on the drone
  • ARDrone API 1.6
  • XBox 360 controller for manual inputs (keyboard mapping currently broken)
  • jEdit as a code editor

Software-wise, nothing else is needed. Obviously, some libraries are required, such as SDL, g++ as the compiler, and later OpenCV for the image analysis. All the code will indeed be written in C and C++; most of what already exists is written in C (i.e. the API), while our personal code will mostly be written in C++.

Activating our own algorithm

We now have the possibility to switch between manual control of the drone (e.g. just after taking off and before landing) and automated control managed by our own custom algorithm, by merely pushing one button on our controller. Besides, all the other necessary commands are also there, coded by ourselves, such as performing an emergency shutdown or a flat trim (calibration on the ground). A lot of this was achieved thanks to a helpful presentation found on the web [1], on top of excerpts of code [2].
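
The sketch below illustrates, in a hugely simplified form, how such a button toggle can work. The structure and names (button index, printing instead of sending commands) are purely illustrative, not our actual control loop; it only assumes SDL, which we already use for controller input.

```cpp
// Minimal sketch of the manual/automatic toggle (hypothetical names and
// button index, printing instead of sending real commands to the drone).
#include <SDL/SDL.h>
#include <cstdio>

int main()
{
    if (SDL_Init(SDL_INIT_JOYSTICK) != 0)
        return 1;
    SDL_Joystick *pad = SDL_JoystickOpen(0);   // first controller found
    if (!pad)
        return 1;

    bool autopilot = false;
    bool was_pressed = false;
    const int TOGGLE_BUTTON = 0;               // arbitrary button index

    while (true) {
        SDL_JoystickUpdate();
        bool pressed = SDL_JoystickGetButton(pad, TOGGLE_BUTTON) != 0;
        if (pressed && !was_pressed) {         // rising edge: flip the mode
            autopilot = !autopilot;
            printf("autopilot %s\n", autopilot ? "ON" : "OFF");
        }
        was_pressed = pressed;

        // if (autopilot) run our algorithm, else forward the manual inputs
        SDL_Delay(20);                         // ~50 Hz polling
    }
}
```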

Some early tests consisted of having the drone describe a square pattern on a horizontal plane, or circles of ever-increasing radius. Everything responds well; the biggest remaining task to avoid collisions with the surroundings was to understand how to properly handle the power of the motors, whose commands range from -25000 to 25000 (what is the difference, in numbers, between fast and really fast, for instance?). It has to be stated that the whole custom algorithm runs in real time on the computer, which constantly exchanges data and commands with the drone.
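
As an illustration, here is a minimal sketch (not our actual flight code) of how such a square pattern can be flown with timed motion commands; send_move() is a hypothetical stand-in for our wrapper around the drone's movement command, and here it only prints the values.

```cpp
// Minimal sketch of flying a square pattern with timed motion commands.
// Commands are clamped to the +/-25000 range mentioned above.
#include <algorithm>
#include <cstdio>
#include <unistd.h>

const int CMD_MAX = 25000;   // full motor command amplitude

// Hypothetical wrapper: the real version would forward to the drone API.
void send_move(int pitch, int roll, int gaz, int yaw)
{
    printf("pitch=%d roll=%d gaz=%d yaw=%d\n", pitch, roll, gaz, yaw);
}

int clamp_cmd(int value)
{
    return std::max(-CMD_MAX, std::min(CMD_MAX, value));
}

void fly_square(int speed, unsigned side_duration_s)
{
    const int cmd = clamp_cmd(speed);
    // Four sides: forward, right, backward, left (pitch/roll pairs).
    const int sides[4][2] = { {-cmd, 0}, {0, cmd}, {cmd, 0}, {0, -cmd} };
    for (int i = 0; i < 4; ++i) {
        send_move(sides[i][0], sides[i][1], 0, 0);
        sleep(side_duration_s);   // hold the command for one side
    }
    send_move(0, 0, 0, 0);        // back to hover once the square is done
}

int main()
{
    fly_square(5000, 2);          // modest speed, 2 s per side
    return 0;
}
```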

Tracking a roundel

One of the other objectives we had in mind while taking the time to set up a clean development environment was to be able to integrate our own image analysis soon. This will be done in a specific part of our code using the OpenCV library.

But before moving on to this next step, which still has to be mastered, we wanted to use the roundel detection already provided by the latest ARDrone firmware. Thanks to the API, we can get the coordinates of one (or several) roundels detected on the ground by the vertical camera. With this information, we quickly developed a very basic algorithm meant to keep track of a roundel by hovering above it and, hopefully, following it. The drone basically uses a kind of proportional controller: the further it is from its goal (that is, having the roundel located inside a square centered in the vertical camera's field of view), the faster it activates its rotors to correct the error (a minimal sketch of this rule is shown after the video). Our first rough results with this approach can be seen in the video below.

[Embedded video: first roundel-tracking tests]

NB: the XBox controller's only purpose here is to make sure the drone is located above a roundel before our algorithm is activated with a button.
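
The following sketch gives the rough shape of the proportional rule described above (hypothetical names and gain; for illustration we also assume the detected tag coordinates are normalized to a 0–1000 range in both axes).

```cpp
// Rough sketch of the proportional rule: the further the roundel is from
// the image centre, the larger the correction command.
#include <cstdio>
#include <cstdlib>

const int IMG_RANGE = 1000;   // assumed span of the reported tag coordinates
const int DEADZONE  = 100;    // central square where we consider it "centred"
const double KP     = 30.0;   // proportional gain, tuned by trial and error
const int CMD_MAX   = 25000;

int clamp(int v) { return v > CMD_MAX ? CMD_MAX : (v < -CMD_MAX ? -CMD_MAX : v); }

// From the roundel position in the image, compute roll/pitch corrections.
void track_roundel(int tag_x, int tag_y, int &roll, int &pitch)
{
    int err_x = tag_x - IMG_RANGE / 2;   // horizontal offset from the centre
    int err_y = tag_y - IMG_RANGE / 2;   // vertical offset from the centre

    // Inside the central dead zone we simply hover; outside, the command
    // grows linearly with the error (pure proportional behaviour).
    roll  = (abs(err_x) > DEADZONE) ? clamp(int(KP * err_x)) : 0;
    pitch = (abs(err_y) > DEADZONE) ? clamp(int(KP * err_y)) : 0;
}

int main()
{
    int roll, pitch;
    track_roundel(800, 450, roll, pitch);   // roundel to the right of centre
    printf("roll=%d pitch=%d\n", roll, pitch);
    return 0;
}
```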


Some obvious issues appeared after our first tests:

  • The drone is not stable enough: it oscillates a lot, which may be acceptable for tracking a single robot, but is certainly a problem when more are involved
  • The drone regularly overshoots while trying to correct its error, at the risk of losing track of the roundel
  • The altitude handling is also far from smooth

All those remarks boil down to one: the controller is not satisfactory, and more tuning of the constants won't bring a dramatic improvement. This leads to the following conclusion: we need a better controller, and we may want to investigate a PID (Proportional, Integral, Derivative) one. We have already run some tests with it, and it proves much more convincing in terms of steadiness and robustness. That will however be the topic of a future article.
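
For reference, here is the generic textbook form of such a controller (a sketch with illustrative gains, not our final implementation): the integral term removes the residual error and the derivative term damps the oscillations and overshoot observed with the purely proportional approach.

```cpp
// Generic textbook PID controller (sketch only).
#include <cstdio>

struct Pid
{
    double kp, ki, kd;    // gains, to be tuned experimentally
    double integral;      // accumulated error (I term)
    double prev_error;    // error at the previous step (for the D term)

    Pid(double p, double i, double d)
        : kp(p), ki(i), kd(d), integral(0.0), prev_error(0.0) {}

    // error: distance from the goal; dt: time step in seconds.
    double update(double error, double dt)
    {
        integral += error * dt;
        double derivative = (error - prev_error) / dt;
        prev_error = error;
        return kp * error + ki * integral + kd * derivative;
    }
};

int main()
{
    Pid pid(30.0, 5.0, 10.0);                 // illustrative gains only
    double command = pid.update(300.0, 0.05); // error of 300, 50 ms step
    printf("command=%f\n", command);
    return 0;
}
```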

  
References

  1. OpenCV/ARDrone Two Parts Presentation – PDF file, by Cooper Bills
  2. Robot learning, page by Cooper Bills – see at the bottom of the page, in optional TA lectures
