
Lighthouse positioning questions

Posted: Thu Jun 06, 2019 11:25 am
by jamador

I have some questions about the lighthouse positioning system:

1) I only want to use the system to get a more accurate pose while avoiding expensive systems (OptiTrack, etc.). From what I have read, I understand that I only need two Base Station 1.0 units and a VIVE controller/tracker (is the latter really necessary?), plus the CF lighthouse decks. I want to set up a roughly 3x3x3 meter testbed; do I need additional cables or any other tools?

2) I need the full pose, not only position. Do you know approximately when this feature could be implemented?

3) What is the precision of the pose obtained with this system?

4) Can the Crazyswarm package work with the lighthouse deck? Perhaps by using "none" as the "motion_capture_type" of the crazyswarm_server node?

This is not an important question, but do you know why the lighthouse deck needs an FPGA? Is it because the pose calculation would be too heavy a burden for a microcontroller?

Thank you in advance!!

Re: Lighthouse positioning questions

Posted: Fri Jun 07, 2019 8:07 am
by arnaud

I can answer in order:
1) It is correct that you need 2 lighthouse base stations V1 and one tracker or controller. The tracker is required to set up the system geometry (i.e. to find the base-station poses); eventually this could be done with only a Crazyflie, and the tracker will no longer be required. If the base stations do not see each other you might also need a synchronization cable between them, though for a 3x3 m space wireless optical synchronization should not be a problem.

2) The Crazyflie estimates its full pose using an EKF and its IMU. The lighthouse system currently only provides position, but we could already derive the pose: we get the individual position of each of the 4 sensors. The current Crazyflie EKF does not support full-pose measurements though, which is the main reason we do not feed the full pose yet.

3) We observe less than 1 mm of relative accuracy. We still have to perform better measurements, but the absolute accuracy seems to be within +/-10 cm over the tracked space. The error is constant: in our ICRA demo we sample the position of the landing pad at take-off and are able to come back to the same point later with a precision much better than 10 cm. We are currently working on calibrating the system, which should bring us closer to millimeter precision over the tracked space.

4) Yes, this is correct: lighthouse will appear very much like an LPS system to Crazyswarm, and using 'none' as motion_capture_type will work.
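For reference, the selection would look something like this in a launch file (the parameter name comes from the question above; the exact launch-file layout depends on your Crazyswarm version, so treat this as a sketch):

```xml
<node pkg="crazyswarm" type="crazyswarm_server" name="crazyswarm_server" output="screen">
  <!-- "none": no external motion-capture system; the pose comes from the Crazyflie itself -->
  <param name="motion_capture_type" value="none" />
</node>
```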

As for the FPGA, the main reason to use one instead of an MCU is support for Lighthouse base station V2: with V1 you only need the timing of the pulses, so a simple MCU timer is enough to acquire the signal, but V2 encodes the synchronization in the laser beam itself. Since this is still very experimental, putting an FPGA on the board makes it more likely that we will be able to decode these fast signals. This FPGA is very small and unfortunately will not help much with computation; it is there for signal acquisition.

Re: Lighthouse positioning questions

Posted: Wed Aug 14, 2019 10:17 am
by jamador

Thanks for your answer. I finally bought the LH system and it works great. Congratulations on your work!

Now I want to implement orientation estimation using LH. I have already tested the transformation optimization outside the Crazyflie; now I want to add it to the crazyflie-firmware, and I would like to understand the Crazyflie code better.

Looking through the code, I arrived at lighthouse.c, where I found the function that calculates the drone position from the four IR sensor positions:

Code:

static void estimatePosition(pulseProcessorResult_t angles[]) {
  memset(&ext_pos, 0, sizeof(ext_pos));
  int sensorsUsed = 0;
  float delta;

  // Average over all sensors with valid data
  for (size_t sensor = 0; sensor < PULSE_PROCESSOR_N_SENSORS; sensor++) {
    if (angles[sensor].validCount == 4) {
      lighthouseGeometryGetPosition(lighthouseBaseStationsGeometry, (void*)angles[sensor].correctedAngles, position, &delta);

      deltaLog = delta;

      ext_pos.x -= position[2];
      ext_pos.y -= position[0];
      ext_pos.z += position[1];
      sensorsUsed++;
    }
  }

  ext_pos.x /= sensorsUsed;
  ext_pos.y /= sensorsUsed;
  ext_pos.z /= sensorsUsed;

  // Make sure we feed sane data into the estimator
  if (!isfinite(ext_pos.pos[0]) || !isfinite(ext_pos.pos[1]) || !isfinite(ext_pos.pos[2])) {
    return;
  }

  ext_pos.stdDev = 0.01;
  estimatorEnqueuePosition(&ext_pos);
}
I understand that the "estimatorEnqueuePosition" function is the one that 'sends' the data to the EKF (or whichever estimator is current). My question is: can I simply implement the orientation calculation in this function and use "estimatorEnqueuePose" instead to 'send' the pose data to the estimator?

Re: Lighthouse positioning questions

Posted: Thu Aug 15, 2019 9:15 am
by arnaud

This is correct: the code you copied is the part that averages the sensor positions to feed the EKF. You could implement a more clever algorithm there to calculate the pose and push it to the EKF with estimatorEnqueuePose, which was newly added by Wolfgang.

This would be a great improvement, so please keep us updated on your progress :).

Re: Lighthouse positioning questions

Posted: Mon Aug 19, 2019 12:41 pm
by jamador
arnaud wrote: Thu Aug 15, 2019 9:15 am Hi,

This is correct: the code you copied is the part that averages the sensor positions to feed the EKF. You could implement a more clever algorithm there to calculate the pose and push it to the EKF with estimatorEnqueuePose, which was newly added by Wolfgang.

This would be a great improvement, so please keep us updated on your progress :).
I am finally testing the algorithm, coded in the Crazyflie firmware, using the console (I attach an example console output from the code I added, with the drone on my flat table facing the X axis, where the rotation matrix is close to identity). It apparently works, but I am not able to feed the EKF with the pose data. I am not sure why, but the "estimatorEnqueuePose" function is not working for me. I know this because when I plot the StateEstimate in cfclient it does not converge (image attached), while using "estimatorEnqueuePosition" it works. After some tests I tried to feed the EKF with a simple fixed pose:

Code:

    static poseMeasurement_t ext_pose;
    ext_pose.stdDevPos = 0.01;
    ext_pose.stdDevQuat = 0.001;
    ext_pose.x = 1.0;
    ext_pose.y = 2.0;
    ext_pose.z = 3.0;
    ext_pose.quat.x = 0;
    ext_pose.quat.y = 0;
    ext_pose.quat.z = 0;
    ext_pose.quat.w = 1;
    estimatorEnqueuePose(&ext_pose);
and it does not work either.

Perhaps I need to call some other function first? I was analyzing "crtp_localization_service.c" but do not see what my mistake could be. Do you know what is happening?

PS: I attach the complete code file.

Re: Lighthouse positioning questions

Posted: Tue Aug 20, 2019 9:55 am
by jamador
Today I managed to feed the estimator. The problem was that the pose enqueue only works with KalmanUSC, so I had to change the current estimator. As you said in your first post, your current EKF does not implement the full-pose feed. I had never used KalmanUSC, and in some tests its behavior is strange, with some fluctuations.

Re: Lighthouse positioning questions

Posted: Wed Aug 21, 2019 1:42 pm
by jamador
I attach a picture of the fluctuations in the roll/pitch angles when using kalmanUSC fed with the new LH rotation data, with the drone at rest. I do not know the reason behind these oscillations. In any case, I wanted to use the default Kalman estimator, since it is the one I have been using in my recent experiments. I tried to study the Mueller paper a bit in order to integrate the rotation data into the EKF, but then I found the SO(3) formulation and the attitude-error parameters in the state vector. I am not familiar with SO(3) and this kind of formulation, so I am going to need more time to derive an observation model that integrates the rotation data into your EKF with this state vector.

For now, I created an enqueuePose for the default Kalman that integrates position exactly as you do and overwrites the orientation quaternion. I know this is not optimal, and probability theory is going to hate me :lol:, but I get a better orientation than using only the IMU for the experiments I need right now.

My questions:

Do you know of another way to integrate the orientation measurements into the Kalman filter?
What are the differences between kalman and kalmanUSC?

If you think some of this work could be useful for you, I can share all the code with some documentation!