
Testing the drone with third-party applications

Trying to connect to the AR.Drone

When you buy the AR.Drone, the first thing you may want to do before developing your own programs for it is to test the UAV itself, straight out of the box. The easiest way to proceed is to own an iDevice, i.e. an iPhone/iPad/iPod Touch: download Parrot's free AR FreeFlight application, connect to the drone and simply run the app. It lets you pilot the UAV while streaming the video feed from both cameras directly to the phone screen.

As far as we are concerned, we do not use such devices, and we only plan on piloting the drone with a computer or an Android phone. Unfortunately, a technical detail has so far prevented us from doing it as easily as it can be done with an iPhone: the AR.Drone requires an ad-hoc connection, which cannot be established with our Android phone, since ad-hoc networking is still not supported by stock Android. It is however possible with the Parrot Software Development Kit, as long as you manage to work around the WiFi connection issue.

Many applications for the AR.Drone are already available on the Android market (see our Links and Downloads page); however most of them require that you root your smartphone before using them. Rooting a smartphone usually voids your warranty (although it is possible to unroot it afterwards), and you furthermore need to apply a patch to enable ad-hoc connections. It is completely doable and we even tried it, but the process is tedious and hard to explain to someone else. Since we want to develop a system that anybody can start as long as the material is there, this is not an option we would like to keep.

 

What is working well so far

On an HTC Desire smartphone

Fortunately, Shellware developed a small PC program that reverses the connection process. Instead of having the phone connect to the hotspot generated by the drone, it makes the drone connect to a hotspot created by the phone.

This is easy to achieve thanks to the AR Assist infrastructure WiFi program (running on a WiFi-enabled computer), which tells the drone which hotspot it has to connect to. Then one only has to start the AR Pro Android application on the phone to pilot the UAV as it would be done on an iPhone.
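For the record, tools of this kind typically rely on the fact that the AR.Drone runs Linux and exposes an open telnet server on its default address, so its WiFi interface can be reconfigured to join an existing hotspot. Below is a minimal Python sketch of that idea; the interface name (ath0), the hotspot SSID and the new IP address are assumptions that would have to match your own setup and firmware, so take it as an illustration rather than a description of how AR Assist actually works.

```python
import telnetlib

DRONE_IP = "192.168.1.1"       # default address of the drone's own hotspot
HOTSPOT_SSID = "MyPhoneAP"     # hypothetical SSID of the phone's hotspot
NEW_DRONE_IP = "192.168.43.2"  # hypothetical address on the phone's network

# The AR.Drone ships with an open telnet server, so we can log in
# and reconfigure its WiFi interface to join the phone's hotspot.
tn = telnetlib.Telnet(DRONE_IP, 23, timeout=5)
tn.write(("iwconfig ath0 mode managed essid %s\n" % HOTSPOT_SSID).encode("ascii"))
tn.write(("ifconfig ath0 %s netmask 255.255.255.0 up\n" % NEW_DRONE_IP).encode("ascii"))
tn.close()
# From here on, the drone is reachable at NEW_DRONE_IP from any device
# connected to the same hotspot.
```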

 

Screenshot of the AR.Pro interface running on an HTC Desire. The screen is mostly covered by the frontal camera view, with the vertical camera view in the top left corner. Joysticks control throttle, roll, pitch and yaw, while buttons take care of automatic maneuvers. Altitude, WiFi connection quality and battery information are displayed on the screen. Note that the vertical size of the camera view has been scaled to fit the screen resolution.

 

The remaining annoyance with this solution is that you need a PC around and the configuration takes about a minute. Unless you install a patch on the drone (which can be done through AR Pro, although we did not succeed in doing so), you have to repeat those same steps every time you reboot the drone.

On a laptop

It is also possible to skip the phone layer and pilot the drone directly from the computer. You just need to connect your computer to the drone's hotspot through WiFi, and then run, for instance, the WD ARDrone program, which is built with WinDev 16.
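Under the hood, any client like WD ARDrone drives the drone through Parrot's AT command protocol: short ASCII commands sent over UDP to port 5556 of the drone. As a rough illustration only (not how WD ARDrone is implemented, which we do not know), here is a minimal Python sketch that takes off and lands using the documented AT*REF bit patterns; a real client would also keep sending commands continuously to avoid triggering the drone's communication watchdog.

```python
import socket
import time

DRONE_IP = "192.168.1.1"   # default address when connected to the drone's hotspot
AT_PORT = 5556             # UDP port on which the drone listens for AT commands

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
seq = 1                    # every AT command carries an increasing sequence number

def send_at(name, *args):
    """Send one AT command, e.g. AT*REF=<seq>,<arg><CR>."""
    global seq
    params = ",".join(str(a) for a in args)
    msg = "AT*%s=%d,%s\r" % (name, seq, params)
    sock.sendto(msg.encode("ascii"), (DRONE_IP, AT_PORT))
    seq += 1

# AT*REF bit patterns from the SDK documentation:
# bit 9 toggles take-off/landing, the other set bits are mandatory.
TAKEOFF = 290718208
LAND = 290717696

send_at("REF", TAKEOFF)    # take off
time.sleep(5)              # hover briefly (a real client keeps sending commands)
send_at("REF", LAND)       # land
```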

 

Screenshot of the WD ARDrone application running on Windows. This is what the UAV sees with its horizontal camera.

 

This program is really reliable and enables video recording as well as piloting with a dedicated controller, such as a joystick or an Xbox 360 controller, as long as you map your keyboard keys to your controller buttons. The interface displays all the navigation data, which increases the impression of flying from inside a cockpit. In our experiments, piloting with a game controller proved more accurate and steady than with a smartphone.

We used the powerful Xpadder software to do the key mapping with our game controller.
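For the curious, mapping a controller onto the drone ultimately boils down to turning the analog stick positions into the arguments of the AT*PCMD progressive command, where each roll/pitch/gaz/yaw value is a float in [-1, 1] reinterpreted as a 32-bit integer. Here is a sketch of that conversion, reusing the send_at helper from the previous snippet; the half-sensitivity factor is just an example.

```python
import struct

def float_to_int(f):
    """AT*PCMD expects each float argument reinterpreted as a 32-bit signed integer."""
    return struct.unpack("<i", struct.pack("<f", f))[0]

def move(roll, pitch, gaz, yaw):
    """Send a progressive command; every argument is a float in [-1, 1]."""
    send_at("PCMD", 1,               # flag 1 = enable progressive commands
            float_to_int(roll),      # left/right tilt
            float_to_int(pitch),     # front/back tilt
            float_to_int(gaz),       # vertical speed
            float_to_int(yaw))       # angular speed

# Example: map a game controller's left stick (axes already in [-1, 1])
# onto roll and pitch at half sensitivity.
# move(0.5 * stick_x, 0.5 * stick_y, 0.0, 0.0)
```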

Getting started

First things first, we had to take the time to identify all the equipment we would need, all the objectives we wanted to reach and, from there, all the tasks we had to complete to carry out the project properly. We obviously can't predict everything right now, but we think that the board below gives a fairly accurate overview of the project. It represents our current objectives, so this might not be the last time you see it in this blog section.

 

Giving you more details about the project's ideas and mechanisms should be a good start. The basic idea of the project is to steer a flock of land units in real time. The flock has a leader, and all the other land units have to keep in formation. Now the interesting point: the robots following the leader are what we call “dumb” robots, in the sense that they carry no sensors at all. The question was therefore to find a way to compensate for this lack of sensors and for the absence of communication between these machines and the environment in which they move.

This question brings us to the interesting part of the project: the use of a drone. Indeed, the drone we're using has two cameras on board (one looking straight forward and the other pointing at the ground). The downward-looking camera will perform pattern recognition to locate every vehicle, and the movement of each land unit will be adjusted so as to keep the formation. The drone will have to stay above the flock; as a first step, it will stay above the leader (an autonomous or remote-controlled unit).
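To give a concrete (and deliberately simplified) idea of what this recognition step could look like, here is a Python/OpenCV sketch that locates one land unit in a frame from the vertical camera, assuming each unit carries a brightly coloured marker. The colour thresholds and the marker itself are hypothetical; matching printed patterns, as listed below, is another option we are considering.

```python
import cv2
import numpy as np

def locate_marker(frame_bgr, hsv_low=(5, 120, 120), hsv_high=(20, 255, 255)):
    """Return the (x, y) pixel position of a coloured marker, or None.

    frame_bgr is one frame from the drone's vertical camera; the HSV
    thresholds are hypothetical and would have to be tuned to real markers.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_low, np.uint8), np.array(hsv_high, np.uint8))
    m = cv2.moments(mask)
    if m["m00"] < 1e-3:              # no marker pixels found in this frame
        return None
    return (m["m10"] / m["m00"],     # centroid x
            m["m01"] / m["m00"])     # centroid y
```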

So far, we have established a list of everything we need (which you will probably see in our upcoming videos documenting our tests) and split the work into different parts:

  • Pattern recognition: how to match patterns, which patterns to use, which platform to use for the image analysis;
  • Communication: we need to create an interface using Bluetooth for the computer-to-land-unit layer and WiFi for the computer-to-drone layer;
  • Algorithmics: which strategy is the most suitable for the trajectories of the land units and the position of the drone (see the sketch after this list);
  • Design: making the robots ergonomic, stable and easily movable (for example, should we go for an omniwheel architecture or not…).
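For the algorithmic part, the simplest strategy we have in mind can be sketched as follows: each follower is assigned a fixed offset relative to the leader, and a proportional controller steers it toward that slot using the positions extracted from the drone's camera. The function name and the gain below are illustrative only.

```python
def formation_step(leader_pos, follower_pos, offset, gain=0.5):
    """Compute a velocity command driving one follower toward its slot.

    leader_pos, follower_pos and offset are (x, y) tuples expressed in the
    image frame of the drone's vertical camera; gain is a proportional factor.
    """
    target_x = leader_pos[0] + offset[0]
    target_y = leader_pos[1] + offset[1]
    # Proportional control: the further from the slot, the faster the correction.
    return (gain * (target_x - follower_pos[0]),
            gain * (target_y - follower_pos[1]))

# Example: a follower that should stay 50 pixels behind the leader.
# vx, vy = formation_step(leader, follower, offset=(0, -50))
```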