Using Flow Deck with laser pointer beacons

Discussions about autonomous flight in general, regardless of positioning method
Astrobiologist
Beginner
Posts: 25
Joined: Fri Jan 19, 2018 10:35 pm

Using Flow Deck with laser pointer beacons

Post by Astrobiologist »

Hello, I am new to this forum.
I have bought a Crazyflie 2.0 STEM bundle with the Flow deck to attempt some basic indoor navigation. The Loco Positioning system is out of my budget and in any case is not precise enough for my purposes.
What I had in mind was to attempt to navigate very precisely using upwards-pointing laser pointers.
I have successfully installed all the Python libraries etc. for the Flow deck and can run the MotionCommander example scripts.
I had hoped that these commands, in conjunction with the Flow deck, would allow me to fly a coarse search pattern with the Crazyflie.
My plan is to use vertically pointed laser pointers on the ground and simple boresighted sensors (photodiodes etc.) on the drone. When the Crazyflie overflies one of the laser pointers and the beam illuminates the boresighted sensor, it would retrace its steps, lock on to the beam, and ride it down to ground level. What I would do then is a thread for the future.
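To make the search-pattern part concrete, here is roughly what I had in mind (just a sketch assuming MotionCommander and the Flow deck; the beacon_detected() function is purely hypothetical until I have some sort of photodiode deck):

```python
# Rough sketch of a coarse "lawnmower" search pattern with MotionCommander.
# Assumes a Flow deck and a standard crazyflie-lib-python installation.
# beacon_detected() is hypothetical - I have no photodiode hardware or driver yet.
import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.positioning.motion_commander import MotionCommander

URI = 'radio://0/80/2M'


def beacon_detected():
    # Placeholder: would eventually read a boresighted photodiode via a custom deck
    return False


cflib.crtp.init_drivers(enable_debug_driver=False)
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    with MotionCommander(scf, default_height=0.5) as mc:
        for leg in range(4):                  # four passes of the lawnmower pattern
            mc.forward(1.0, velocity=0.2)     # slow legs to reduce overshoot
            if beacon_detected():
                break
            if leg % 2 == 0:                  # step sideways, alternating direction
                mc.right(0.2, velocity=0.1)
            else:
                mc.left(0.2, velocity=0.1)
        time.sleep(1)
    # MotionCommander lands automatically when leaving the "with" block
```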
Do people think this would actually work?

Can the flow deck sensors survive if they are illuminated by such a laser or would they be burned out?

In initial tests, I don't think the CrazyFlie in conjunction with the flow deck can really give such precise (<1cm) movements, but of course my expectations were no doubt too optimistic for such an excellent lightweight, low cost drone :-)
Can anybody give me any tips on steadying the drone?
One thing I had thought of was somehow changing the hover mode that the Flow deck provides through MotionCommander - but I have scant idea how to do this and have never coded in Python before.

As an example, the drone often exhibits what I call the "crumpled duvet effect", because I usually fly it over my duvet in case of crash landings! Undulations in the duvet cause quite a bumpy flight for the drone, since it tries to hold an exact hover height. Basically there is a touch of an autopilot-induced oscillation (AIO) here: as the drone undulates up or down, the ToF sensor in the Flow deck sees a larger or smaller patch of the ground (the area varies with the square of the distance), so it is not "seeing" the same region as it bobs up and down. Woe betide if I try flying it off my bed altogether, since then it suddenly tries to drop down to the set altitude above floor level, and nearly always crashes (I assume this is due to vortex ring state from an excessively quick descent).
All of this could be avoided if I could just turn off hover - so what is the command in MotionCommander etc. to do this?

Overflying objects of known height would even be a useful way of providing waypoints in its own right.

I would be extremely grateful if someone could point me in the direction of example snippets of Python code so that I could begin to learn to program the drone in Python alongside the basic MotionCommander commands (for instance, turning hovering off, and also perhaps slowing down or rationalising the data logged - I have replied to a separate developer thread about some odd logging behaviour I have seen).

This also segues into making a larger chassis perhaps in conjunction with the BigQuad. For instance, the obvious thing to do would be to have a central boresighted sensor to detect the laser pointers- except that the flow deck is mounted centrally.
Can the flow deck double as the laser sensor? (Would it be damaged?)
Otherwise, either the flow deck or the boresighted sensor would have to be mounted off-centre, on one of the arms for instance.
I would also hanker after adding a Qi charger deck and, for instance, landing the drone on a charger plate using the laser pointer (or something else?) as a beacon. Ultimately I would also like to try picking things up and dropping them off, but that is a discussion for another day. Basically it doesn't all fit in the normal stack of decks, so I need some sort of breakout board to distribute different components on different arms.

Can additional sensors (e.g. photodiodes) be added to spare pins on the Crazyflie? And if so, how can this be coded - in Python? Is there any such example code that people could point me towards?

This is all too much for one post of course. But I can see a realistic progression:
Learn more about commands in Python for the drone
Turn off hovering somehow and test
Experiment with Crazyflie hardware and Python to add additional sensors such as boresighted photosensors
Attempt to create BigQuad footprint or similar with different components spread out on different arms

Any advice on ANY of this very gratefully received
Many thanks

Astrobiologist
(Oliver)
Astrobiologist
Beginner
Posts: 25
Joined: Fri Jan 19, 2018 10:35 pm

Re: Using Flow Deck with laser pointer beacons

Post by Astrobiologist »

...Or I could just wait until the Qualisys system is supported in the Python client!
https://www.bitcraze.io/2018/01/externa ... m-support/
arnaud
Bitcraze
Posts: 2538
Joined: Tue Feb 06, 2007 12:36 pm

Re: Using Flow Deck with laser pointer beacons

Post by arnaud »

Hi and welcome!

This is a long message, I will try to answer the questions in order. I appreciate that you created a new thread and told us about the context (too often there is a very precise question without context and I am forced to ask "what are you trying to achieve?"), but in this case it is a bit too long and too diverse, which makes it hard to answer. It would be useful if in the future you could separate different questions into different threads; that would make it much easier to answer and to discuss the different subjects.
Do people think this would actually work?
If I understand your idea well, it might work. You could actually run a particle filter to estimate your position by looking at when you hit the different lasers. Depending on your required precision this might even work with IR light sources, using the strength at which you receive them. Are you expecting to 'hit' the laser very precisely with your sensors?
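To illustrate the particle filter idea, here is a toy sketch (this is not from our codebase; the beacon positions, detection radius and noise values are made-up numbers):

```python
# Toy 2D particle filter: the only measurement is "did we pass over a beacon?".
# All constants are illustrative.
import numpy as np

BEACONS = np.array([[0.5, 0.5], [2.0, 1.0], [1.0, 2.5]])  # known beacon positions (m)
DETECT_RADIUS = 0.05   # how close the sensor must be to the beam to register a hit (m)
N = 2000               # number of particles

particles = np.random.uniform(0, 3, size=(N, 2))   # uniform prior over a 3x3 m room
weights = np.ones(N) / N


def predict(delta_xy, motion_noise=0.02):
    """Move every particle by the commanded displacement plus some noise."""
    global particles
    particles += delta_xy + np.random.normal(0, motion_noise, size=particles.shape)


def update(hit_beacon_index):
    """Reweight particles: if we hit beacon i, particles near it become likely."""
    global particles, weights
    if hit_beacon_index is None:
        return  # this simple model uses no information when nothing is detected
    d = np.linalg.norm(particles - BEACONS[hit_beacon_index], axis=1)
    weights *= np.exp(-0.5 * (d / DETECT_RADIUS) ** 2) + 1e-12
    weights /= weights.sum()
    # resample when the effective particle count collapses
    if 1.0 / np.sum(weights ** 2) < N / 2:
        idx = np.random.choice(N, size=N, p=weights)
        particles = particles[idx]
        weights.fill(1.0 / N)


def estimate():
    """Weighted mean of the particles = current position estimate."""
    return np.average(particles, weights=weights, axis=0)
```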
Can the flow deck sensors survive if they are illuminated by such a laser or would they be burned out?
How strong would your laser be?
Can anybody give me any tips on steadying the drone?
In good conditions we have observed very good performance from the Flow deck; how much drift are you observing?
For the best performance, the lighting conditions must be good, the flow sensor should not see the shadow of the Crazyflie, and the ground should have a nice high-contrast pattern to track.
so what is the command in MotionCommander etc. to do this?
The behavior you are observing comes from the way the estimator is implemented in the Crazyflie. The ranging sensor is fed to the estimator as an absolute height. This means that the Crazyflie expects to fly over a flat floor. I doubt this can be fixed in the Python API; it is something that has to be changed in the Crazyflie firmware itself. Though, without using the pressure sensor or other sensors, it would be hard and would most likely result in the Crazyflie's height drifting (since we would not have an absolute height sensor anymore).

One point to note as well is that the flow sensor needs the absolute height above a flat floor in order to measure the Crazyflie velocity. The model used by the flow sensor algorithm is that the Crazyflie is above a flat horizontal floor.
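As a simplified illustration of why the absolute height matters for the flow measurement (the field-of-view and resolution numbers are only indicative, and this ignores the gyro compensation the real firmware does):

```python
# Simplified flow-to-velocity conversion: the same pixel flow corresponds to a
# larger ground velocity when the sensor is higher up. Constants are indicative only.
import math

FOV_DEG = 42.0     # approximate field of view of the flow sensor
N_PIXELS = 35.0    # approximate sensor resolution along one axis


def flow_to_velocity(flow_pixels, height_m, dt_s):
    """Convert a pixel displacement over dt into an approximate ground velocity."""
    angle = flow_pixels * math.radians(FOV_DEG) / N_PIXELS  # viewing angle swept
    return height_m * angle / dt_s                           # small-angle approximation


# The same measured flow means very different velocities at different heights:
print(flow_to_velocity(1, 0.3, 0.02))   # ~0.31 m/s at 0.3 m
print(flow_to_velocity(1, 1.0, 0.02))   # ~1.05 m/s at 1.0 m
```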
I would be extremely grateful if someone could point me in the direction of example snippets of Python code so that I could begin to learn to program the drone in Python alongside the basic MC Commander commands (for instance, turning hovering off, and also perhaps slowing down or rationalising the data logged - I have replied to a separate developer thread about some odd logging behavour I have seen).
The best is to look at the examples folder in the crazyflie-lib-python project. It contains a lot of examples showing how to perform various actions. If you have a precise question about something you would like to do, please create a new thread about it.
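As a starting point, a minimal synchronous logging script in the style of that folder looks roughly like this (adjust the URI and the log variables for your setup):

```python
# Minimal synchronous logging example, in the style of the crazyflie-lib-python
# examples folder. Adjust the URI and log variables for your setup.
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.crazyflie.syncLogger import SyncLogger

URI = 'radio://0/80/2M'

cflib.crtp.init_drivers(enable_debug_driver=False)

log_conf = LogConfig(name='Position', period_in_ms=100)   # 10 Hz, slower than full rate
log_conf.add_variable('stateEstimate.x', 'float')
log_conf.add_variable('stateEstimate.y', 'float')
log_conf.add_variable('stateEstimate.z', 'float')

with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    with SyncLogger(scf, log_conf) as logger:
        for timestamp, data, logconf in logger:
            print(timestamp, data)
            break   # just read one sample in this sketch
```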
Can the flow deck double as the laser sensor?
No, the flow deck basically contains a low resolution camera but there is currently no public documentation on how to get an image from the sensor. All we can get is DX/DY movement.
Can additional sensors (e.g. photodiodes) be added to spare pins on the Crazyflie? And if so, how can this be coded - in Python? Is there any such example code that people could point me towards?
Yes, you can look at the expansion port documentation: https://wiki.bitcraze.io/projects:crazy ... ards:index and the API: https://wiki.bitcraze.io/doc:crazyflie: ... mware:deck. The low-level driver has to be written in C in the Crazyflie firmware. The driver will expose log and param variables that can then be used from the ground in Python.
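Once such a driver exists, the Python side would look something like this (the 'laserDeck' group and its variables are hypothetical; they only exist if your own C driver declares them):

```python
# Python side of a (hypothetical) custom deck driver. The log group 'laserDeck'
# and the parameter 'laserDeck.threshold' do not exist unless your own C driver
# declares them (with the LOG_GROUP_START / PARAM_GROUP_START macros) in the firmware.
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.crazyflie.syncLogger import SyncLogger

URI = 'radio://0/80/2M'
cflib.crtp.init_drivers(enable_debug_driver=False)

with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    # Set a parameter exposed by the deck driver
    scf.cf.param.set_value('laserDeck.threshold', '512')

    # Log a variable exposed by the deck driver
    log_conf = LogConfig(name='laser', period_in_ms=50)
    log_conf.add_variable('laserDeck.hit', 'uint8_t')
    with SyncLogger(scf, log_conf) as logger:
        for timestamp, data, _ in logger:
            if data['laserDeck.hit']:
                print('Beacon detected at', timestamp)
                break
```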
...Or I could just wait until the Qualisys system is supported in the Python client!
Do you already have a Qualisys system? If the LPS is over budget, a mocap will be as well :).
If you already have access to a mocap system, it is definitely a solution to get good (i.e. awesome) position control in room coordinates. You can feed the Crazyflie its position using the external position port. I think the ROS driver has support for it; otherwise it can also be done from the Python API.
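From the Python API, feeding the external position is roughly this (get_position_from_mocap() is a placeholder for whatever system measures x, y, z):

```python
# Feeding an externally measured position to the Crazyflie's estimator.
# get_position_from_mocap() is a placeholder for whatever system provides x, y, z.
import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M'


def get_position_from_mocap():
    return 0.0, 0.0, 0.0   # placeholder measurement in metres


cflib.crtp.init_drivers(enable_debug_driver=False)
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    for _ in range(1000):                     # stream updates for a while
        x, y, z = get_position_from_mocap()
        scf.cf.extpos.send_extpos(x, y, z)    # push the measurement to the estimator
        time.sleep(0.01)                      # ~100 Hz updates
```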
Astrobiologist
Beginner
Posts: 25
Joined: Fri Jan 19, 2018 10:35 pm

Re: Using Flow Deck with laser pointer beacons

Post by Astrobiologist »

Thanks for your detailed reply.
I actually have a professional interest in this with regards to molecular biology laboratory automation. Robotic arms for lab use are quite expensive, so I was wondering if drones could be used in internal laboratory environments instead (and even to transfer samples onwards to other labs).
At the moment, I am just pursuing this from a hobbyist perspective in my own time.
So, I can only make use of everyday objects!

In this regard, I would use laser pointers or similar modules from hobbyist suppliers. So we are probably talking Class 3R (formerly IIIa) lasers, outputting 3.5 mW or so. Hence my concern about dazzling, or possibly burning out, the Flow deck sensors. A laser pointer throws out a dot of light about six times brighter than the sun!

I was assuming that the CrazyFlie would fly a search pattern, and hopefully would overfly one of the beams. They would spread out to a few mm wide for a typical laser pointer at a distance of a few metres.
Ultimately, to delineate large objects, I might use, say, three laser pointers for three corners of an object, which triples the chance of overflying at least one of them. The CrazyFlie could then attempt to lock on to one and rotate until acquiring the other two (for instance, with a total of three photodiodes on the CrazyFlie in a layout matching the laser pointers on the ground), and would then know it was directly above a known object and in the right orientation.

I have tried moving the Crazyflie a few centimetres in a given direction using MotionCommander, but it tends to "pendulum" slightly. This would make it hard to "ride" a laser pointer as suggested above, since it would keep losing the beam and would have to re-search for it.

I have noticed that takeoff with the Flow deck is far steadier on some surfaces than others, just as you say. For instance, on a wooden floor I see a lot of drift (surprising, given the texture of the wood), and the same thing happens on a featureless bedsheet. Over a nice flat duvet, the drone is extremely stable indeed. I assume this is because the quilting etc. on the duvet gives a nice high-contrast texture! It is also possible that it is a less smooth surface and so there is less ground effect?
The flow deck is a revolutionary product! Surely one of the most rock-solid drone hovers ever?

It is probably unrealistic though to then say, "from this hover, move 1cm left", because the drone will inherently pendulum slightly before settling in its new location. I was wondering if there was any way to reduce this.

I will have a look at the other coding resources that you referred to and get back if I have any more questions, a customer for my actual (paying) day job is beckoning at me right this moment ;-)
Astrobiologist
Beginner
Posts: 25
Joined: Fri Jan 19, 2018 10:35 pm

Re: Using Flow Deck with laser pointer beacons

Post by Astrobiologist »

OK, if I may just come to this: (I will give some links for the convenience of others who might stumble across this thread looking for similar details - so apologies for another long post)

No, I don't have a Qualisys, next time before I get excited about something I will check the price first ;-)

However, I was able to do some decent quick tests with Python and the OpenCV (cv2) library for image recognition. I posted this to a thread in the developers section:
viewtopic.php?f=6&t=1193&start=30

I am encouraged that, having known no Python at all a couple of days ago, I have got this far... this might be a better place to start than the laser beacon (or LED beacon) idea.

Instead of laser pointers, one could use webcams pointing upwards, and then attempt to recognise the drone as it enters the field of view (against a nice, featureless ceiling - should help a lot with contrast recognition).
From its location in the image, you now have an X and Y coordinate, and of course the flow deck gives Z, and then it should be basic trigonometry to attempt to land the drone at a particular location.
Each location you wish to navigate to could have its own webcam, with a central control PC running Linux or similar (so each webcam is /dev/video0, /dev/video1, etc.)
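Here is roughly the OpenCV test I have been playing with (a very crude sketch; the threshold and field-of-view numbers are guesses for my particular webcam, and I simply assume the drone is the largest dark blob against the bright ceiling):

```python
# Crude sketch: find the drone as the largest dark blob against a bright ceiling
# and convert its pixel offset from the image centre into metres.
# Threshold and field-of-view values are guesses for my particular webcam.
import math

import cv2

FOV_H_DEG = 60.0     # assumed horizontal field of view of the webcam
HEIGHT_M = 1.5       # would really come from the drone's Z (Flow deck)

cap = cv2.VideoCapture(0)                    # /dev/video0
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # The drone is dark against the bright ceiling, so use an inverted threshold
    _, mask = cv2.threshold(gray, 80, 255, cv2.THRESH_BINARY_INV)
    # [-2] keeps this working across OpenCV 3.x and 4.x return signatures
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    if contours:
        biggest = max(contours, key=cv2.contourArea)
        (u, v), _ = cv2.minEnclosingCircle(biggest)
        h_img, w_img = gray.shape
        angle_per_px = math.radians(FOV_H_DEG) / w_img
        # pixel offset from the image centre -> angle -> metres (small-angle approx.)
        dx = (u - w_img / 2) * angle_per_px * HEIGHT_M
        dy = (v - h_img / 2) * angle_per_px * HEIGHT_M
        print('Drone offset from camera axis: %.2f m, %.2f m' % (dx, dy))
cap.release()
```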

This takes the pressure off trying to code for a custom deck with photodiodes etc. However, I still need to fly some hardware. I read the links you supplied and checked some other discussions and git pages as well, but I didn't feel at all confident with the C code.

Again, please could people just check quickly if I am on the right track: (thank you for your patience)

When you have built your own deck, you need to add the one-wire memory (EEPROM) chip so that it can be recognised by the Crazyflie. You can use the Crazyflie itself to program the EEPROM:
viewtopic.php?f=6&t=2301&p=11775&hilit= ... rom#p11775
(not that I really understand this)

Create a driver for your custom deck - if you have gone as far as programming the EEPROM with your deck's name, then you can incorporate it here, or you can just force the CrazyFlie to recognise it in the first instance (have I got this right?) *:
https://wiki.bitcraze.io/doc:crazyflie: ... deck:howto

Here is an example of someone reading an ADC pin from the CrazyFlie in C:
viewtopic.php?f=6&t=1462
But this goes TOTALLY over my head.

Here is a simpler example of turning an LED on - could probably easily adapt it to reading in the state of an IO pin instead:
https://wiki.bitcraze.io/doc:crazyflie: ... mware:deck

But how do you get the data (pin state, ADC data, etc.) to display?

I note that in the example custom deck driver above, it appears in the console (e.g. "Hello test passed!" *)
But how would you get something to appear as a parameter, as you mentioned above?

Here is a Python script that can display parameters:
https://github.com/bitcraze/crazyflie-l ... icparam.py

Is there any way to display the console output from Python similarly, or is this only a feature of the client?

You may note that I am having trouble understanding the difference between the console downlink and the parameter mechanism, how you write C code that uses one or the other, and indeed how I can then capture them in Python at the other end.
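For reference, here is my current (untested) understanding of the Python side, pieced together from the docs - corrections very welcome:

```python
# My (untested) understanding of the Python side: console text arrives via a
# callback, while log/param variables are declared by the firmware and then read
# or written by name from the ground.
import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M'
cflib.crtp.init_drivers(enable_debug_driver=False)


def console_callback(text):
    # Everything the firmware prints with DEBUG_PRINT() ends up here
    print('CONSOLE:', text, end='')


def param_callback(name, value):
    print('%s = %s' % (name, value))


with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    scf.cf.console.receivedChar.add_callback(console_callback)

    # Parameters are read/written by name once the firmware declares them
    scf.cf.param.add_update_callback(group='kalman', name='resetEstimation',
                                     cb=param_callback)
    scf.cf.param.set_value('kalman.resetEstimation', '1')

    time.sleep(2)   # give the callbacks a moment to fire
```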

The main thing I need to mount on the drone itself would be some sort of gripper - so probably a pin to write high to cause a motor to open or close a gripper (for instance) and perhaps another one to read in whether something has been gripped or not.
Or how about a Bernoulli grip?:
https://en.wikipedia.org/wiki/Bernoulli_grip
- You could use a fifth motor channel to do this! Is it possible to add extra motor channels?

Sorry for all the questions, I am well outside my comfort zone here. Much of what I want to do will be in Python on the computer side, because MotionCommander is so useful and also because OpenCV works well for image recognition and for triggering webcams etc.
For that matter I could use MicroPython to control other things at each webcam location (local grippers etc. perhaps).

But now I need to begin to get to grips with (pardon the pun) designing and programming hardware on the CrazyFlie itself... hence all the questions. Many thanks in advance.