# Tracking algorithm: considering the inclination of the drone

July 22, 2011 | Posted by Michael under Drone, Ideas, Issues, Programming, Refining the project, Testing |

## Setting out the problem

Our PID controller has proven to work, but without achieving near-perfect stability, even when it only has to stay on top of a motionless roundel. We then made a hypothesis to explain our difficulty in fulfilling this goal, apart from the need to tune the gain parameters correctly. So far, we have not taken into account the fact that the drone tilts a little while it moves. Yet an inclination on one or two axes also moves the vertical camera, which in turn changes the roundel position returned by our algorithm.

Indeed, if the drone hovers over the very spot where the roundel lies, the coordinates returned will vary more or less depending on the tilt angle. The greater the tilting, the bigger the offset. And a PID controller cannot behave well if its core input, namely the measured parameter it has to correct, changes in an unexpected way as a side effect of the PID correction itself.
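To make the issue concrete, here is a minimal discrete PID loop, as a generic sketch only (the class and gain names are ours, not our actual controller code): if the measured error jumps because of the camera tilt, the derivative term reacts to that artificial jump rather than to real motion.

```python
# Generic discrete PID controller (illustrative sketch, not our on-board code).
# kp, ki, kd are placeholder gains, not our tuned values.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """Return the command for the current error sample (dt in seconds)."""
        self.integral += error * dt
        # No derivative on the very first sample.
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

With a non-zero kd, a sudden tilt-induced shift of the measured roundel position produces a large spurious derivative spike, which is exactly the behavior we want to avoid by correcting the measurement first.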

## Situation modeling

### Geometric representation

The figures below illustrate the problem that occurs while the drone is moving. First, Figure 1 pictures the ideal situation, where the camera stays perfectly vertical at all times. The field of view (FOV) of the camera is represented by a 1000*1000 matrix whose size does not change with altitude. The coordinates returned by the detection algorithm are therefore given without units (in blue on the picture) and only specify relative distances. To apply our own correction, we will need to work with SI units. This will be possible by using the altitude value, which the drone navigation data knows at any time, and the FOV angle, which is equal to 64 degrees.

Figure 2 shows how tilting the vertical camera distorts the coordinate system on the ground. Furthermore, the roundel is clearly not at the same location anymore when it is viewed from the drone viewpoint, whereas the drone and the roundel are still over the same spot.

On Figure 3, it is possible to see that the inclination angle and the position of the roundel affect the representation of the situation: the resulting angles are not calculated as in the previous case. All these figures are of course symmetrical, and what happens on one side of the x-axis happens the same way on the other side.

The goal is now to analyse all these possible cases and find a corrective function that can be applied to the coordinates that the algorithm receives, whatever they are.

### Mathematical analysis

First, we need a function that returns a conversion factor used to transform a measurement given in arbitrary units (as returned by the embedded algorithm on the drone) into millimeters. With a vertical camera, the 1000-unit-wide frame covers a ground strip of width $2 \cdot altitude \cdot \tan(FOV/2)$, hence:

$$factor(altitude) = \frac{2 \cdot altitude \cdot \tan(FOV/2)}{1000}$$

Then, whenever we need to convert a value read by the camera into millimeters at a given altitude, we just apply the following:

$$xRead_{mm} = xRead \cdot factor(altitude)$$
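This conversion can be sketched in a few lines of Python (function and constant names are ours; the only data it relies on are the 64-degree FOV and the 1000-unit frame mentioned above):

```python
import math

FOV_DEG = 64.0       # field of view of the vertical camera, in degrees
FRAME_UNITS = 1000   # width of the detection matrix, in arbitrary units

def unit_to_mm_factor(altitude_mm):
    """Millimeters covered by one camera unit at a given altitude.

    With a vertical camera, the ground strip seen by the lens is
    2 * altitude * tan(FOV / 2) wide and maps onto FRAME_UNITS units.
    """
    return 2.0 * altitude_mm * math.tan(math.radians(FOV_DEG / 2.0)) / FRAME_UNITS

def to_mm(x_read, altitude_mm):
    """Convert a raw camera reading (arbitrary units) into millimeters."""
    return x_read * unit_to_mm_factor(altitude_mm)
```

For instance, at a 1-meter altitude, a reading of 500 units (the edge of the frame) corresponds to half the ground strip seen by the lens.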

To keep our explanation simple, we only take two dimensions into account, namely the height and a length along the x-axis. The reasoning and the calculus are exactly the same with the y-axis, apart from one minus sign.

Let us now consider a tilted camera that makes an angle φ with the vertical line perpendicular to the ground. Figure 2.b illustrates the problem we have to solve: even if neither the roundel nor the drone has moved (except for the tilting), the coordinates returned by the tracking algorithm will differ a lot from what is expected (Figure 1.b).

The value *xRead* returned by the camera does not actually correspond to the real distance as seen on the ground, since the scale of the field of view projected on the ground is now distorted by the tilting. To keep an orthonormal coordinate system with evenly scaled values, we have to consider a plane perpendicular to the line that goes straight through the camera lens. Then, no matter where this plane is located along that line, every single point belonging to this newly enclosed space keeps the same *relative* distance to the origin zero.

We define a new angle α, as shown on Figure 2.a, such that:

$$\tan(\alpha) = \frac{xRead_{mm}}{altitude}$$

We also define *xReal* as the actual position of the roundel on the x-axis in a situation where the camera is perfectly vertical:

$$xReal = xIn \pm xOut_{1|2}$$

where $xIn = altitude \cdot \tan(\varphi)$ marks where the camera axis meets the ground, and $xOut_{1|2}$ is the equivalent of *xRead* distorted on the ground (one cannot speak of a "projection" here, since no perpendicular angle is involved). The value of $xOut_{1|2}$ actually differs depending on the camera inclination and the roundel location (Figure 2.a involves $xOut_1$, Figure 3.a shows $xOut_2$). Keep in mind that *xRead*, *xReal*, *xIn* and *xOut* can be negative depending on the tilt angle φ. Besides, φ is returned positive by the drone navigation data when the drone is in a situation like that of Figure 2.a, and negative when the tilting goes in the opposite direction.

Using the law of sines^{1}, which states that the ratio of the length of a side to the sine of its opposite angle is constant, we get, from the green triangle in Figure 2.a:

$$\frac{xOut_1}{\sin(\alpha)} = \frac{altitude/\cos(\varphi)}{\sin(90^\circ - \varphi - \alpha)}$$

where $altitude/\cos(\varphi)$ is the distance from the lens to the point where the camera axis meets the ground. And since

$$\sin(90^\circ - \varphi - \alpha) = \cos(\varphi + \alpha)$$

we get:

$$xOut_1 = \frac{altitude \cdot \sin(\alpha)}{\cos(\varphi) \cdot \cos(\varphi + \alpha)}$$

The same goes for $xOut_2$, except that the angles are different (cf. Figure 3.a):

$$xOut_2 = \frac{altitude \cdot \sin(\alpha)}{\cos(\varphi) \cdot \cos(\varphi - \alpha)}$$

Hence the result that applies in a case similar to Figure 2.a:

$$xReal = altitude \cdot \tan(\varphi) + \frac{altitude \cdot \sin(\alpha)}{\cos(\varphi) \cdot \cos(\varphi + \alpha)}$$

When we generalize the calculus and consider every possible situation (α now carrying the sign of $xRead_{mm}$), we get the following conclusion:

$$xReal = altitude \cdot \tan(\varphi + \alpha)$$
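As a minimal sketch, the generalized correction can be written as follows, assuming the signed conventions above for φ and for the reading, and that the line-of-sight angle α satisfies tan(α) = xRead_mm / altitude (the function and variable names are ours):

```python
import math

def corrected_x(x_read_mm, altitude_mm, phi_rad):
    """Estimate the roundel abscissa as a virtual vertical camera would see it.

    x_read_mm  : camera reading converted to millimeters (signed)
    altitude_mm: altitude from the drone navigation data
    phi_rad    : tilt angle, positive as in Figure 2.a

    alpha is the signed angle between the camera axis and the line of
    sight to the roundel; the corrected abscissa then follows from
    xReal = altitude * tan(phi + alpha).
    """
    alpha = math.atan2(x_read_mm, altitude_mm)
    return altitude_mm * math.tan(phi_rad + alpha)
```

Two sanity checks: with no tilt the reading is returned unchanged, and a roundel sitting exactly below the drone is mapped back to zero whatever the tilt, which is precisely the behavior the PID controller needs.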

### Experiments and performance results

#### Protocol

To test our model in a real-world setup, we built and filled a datalog in real time during different test flights, keeping track of both the raw values returned by the detection algorithm and the corrected values. We also saved the angles made by the drone on both axes. Again, to keep the results readable, we chose to display data referring only to the x-axis, so that they can be compared with our previous data. We performed the same experiments on the y-axis, with the same performance.

Both graphs below report these data on the same timeline, during one of our test runs. Basically, we took the drone, activated our recognition algorithm, and did the following, in this order:

1. The drone is held on top of the roundel, at a steady altitude. We then rotate it around one axis at a regular pace, from one side to the other (no more than 60 degrees on each side), in order to register different lateral angles.
2. The drone is then held on the far right of the roundel, without changing the altitude or the y-axis position. Rotations are then applied as before.
3. Step 2 is repeated, this time on the far left of the roundel.
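The datalog behind these graphs can be sketched as a plain CSV recorder; the column names below are illustrative, not the exact ones used on board:

```python
import csv
import time

# Illustrative datalog columns: raw and corrected x readings,
# altitude, and the two tilt angles saved at each sample.
FIELDS = ["timestamp", "x_raw", "x_corrected", "altitude_mm", "phi_deg", "theta_deg"]

def open_datalog(path):
    """Open a CSV datalog and write its header row."""
    f = open(path, "w", newline="")
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    return f, writer

def log_sample(writer, x_raw, x_corrected, altitude_mm, phi_deg, theta_deg):
    """Append one sample, timestamped at write time."""
    writer.writerow({
        "timestamp": time.time(),
        "x_raw": x_raw,
        "x_corrected": x_corrected,
        "altitude_mm": altitude_mm,
        "phi_deg": phi_deg,
        "theta_deg": theta_deg,
    })
```

One row is written per detection, which is consistent with the fact that gaps appear in the graphs whenever the roundel leaves the field of view.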

#### Experiments

The experiment results are reported on the graphs below. Please note that values are only registered while a roundel is detected. That is why the angle range varies for each step, even though the drone is moved the same way each time. The results achieved for these steps are:

- As expected, the more the drone is tilted (in green on the graph), the further from the zero origin the roundel is detected (in blue on the graph). The corrected value (*xReal*) stays really close to zero, which is what we wanted to achieve.
- The corrected value stays close to the real one, within a range of 50 cm at most, way better than the 2-meter range observed with the raw values.
- Observed results are symmetrical to those of step 2.

#### Closing comments

What can also be noted is that the sensors perform really well: they return accurate and consistent values at all times. This is especially true for the altitude; the tilt sensors, however, seem to lose accuracy when they are shaken too fast or when the angle gets too big, which accounts for the bigger errors in the correction. Overall, they all refresh fast enough to stay consistent with one another at any given time, and the communication delay does not really interfere with this process.

As for the drone itself, once the correction was applied to the PID controller, we clearly noticed that a lot of steadiness had been gained through this process, with a reduced settling time and a less erratic behavior. This confirmed the relevance of our study and the efficiency provided by the sensors and by our algorithm. We will soon provide new results about our tracking controller, along with some further investigation into other solutions.

## Approximating our model

### Simplification of the situation

The high mathematical precision that we got with our previous model is not required, because the sensors do not allow for such precision. Hence a simplification may be welcome, if only for the sake of clarity in the explanation. Besides, it avoids relying more than once on values returned by the sensors: if a sensor value is slightly off, it is better to use it once and for all in our equations, rather than propagating the error several times and amplifying its effect on the results (especially here with the *altitude* and *xRead _{mm}*, which were each used three times before, because of the α angle). Figure 4.a shows how the model can be simplified.

We therefore have only one function to compute *xReal*, whose domain of definition is broader, because neither the tilting angle φ nor the sign of *xRead _{mm}* changes the model anymore:

### Performance

This alternative model performed surprisingly well, insofar as we got, on average, a shift of only about 2 millimeters between the two models. It even appears more accurate on average for positions further from the roundel, which are the critical ones, since the angle is greater there and the sensor accuracy worse. This is explained by the fact that the new equation is less sensitive to small variations of its parameters.

As a conclusion, we plan on keeping this last implementation because of its really good performance, both in terms of simplicity and accuracy.

**References**
