Halloween Flight Testing Blues

clolsonus

Active member
#1
I maidened a new airplane yesterday. It is capable of fully autonomous flight from launch through landing, but I was having trouble with the airspeed sensor reporting corrupted data. I made a movie for each flight that also shows my HUD and augmented reality system ... maybe because it was Halloween, the airplane was playing tricks on me? :)



I can't get the airspeed (i2c reads) to fail on the ground or back in the lab no matter what I try, but as soon as there is air flowing over the wings, I get errors on the i2c reads.

Thanks,

Curt.
 

Ihichi Bolls

Well-known member
#2
I can't help you with the issue you're having, as I know nothing about that type of setup. I do, however, want those virtual gates in your HUD. That would save me a TON of money and downtime for quad racing practice due to crashes.

Ohh, on a side note, I just remembered: @Mid7night uses a pitot tube in a high-speed plane he is developing. He got some dirt in his tube when landing and that caused issues. Maybe open the pitot assembly and clean the tube to make sure a small grain of sand or something isn't fouling the air flow?
 

clolsonus

Active member
#3
We suspect our i2c cable to the airspeed sensor (which also senses temp and baro altitude pressure) is too long. We simultaneously see garbage values for temp, differential pressure, and static pressure. We can't get the system to produce any errors in the lab, but I think we are going to try a shorter cable and rearranging things a bit. Unfortunately, at the moment the HUD system is drawn over the video after the flight, not in real time. For us it helps visualize what went on in the flight (for example, you can see pretty much exactly when the airspeed readings started to flake out and then freeze ... and the subsequent effect on the autopilot pitch/throttle commands.)
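In case it helps anyone debugging something similar, the kind of sanity filter we may end up adding on the i2c reads looks roughly like this (just a sketch with made-up thresholds, not our flight code):

```python
# Sketch of a plausibility filter for corrupted airspeed reads.
# Thresholds are illustrative, not tuned values from our system.

def plausible(prev_mps, new_mps, dt,
              min_mps=-5.0, max_mps=60.0, max_accel=20.0):
    """Reject readings outside a sane range or changing impossibly fast."""
    if not (min_mps <= new_mps <= max_mps):
        return False
    if prev_mps is not None and abs(new_mps - prev_mps) / dt > max_accel:
        return False
    return True

def filter_airspeed(samples, dt=0.02):
    """Hold the last good value when a sample fails the checks."""
    good = None
    out = []
    for s in samples:
        if plausible(good, s, dt):
            good = s
        out.append(good)
    return out
```

Holding the last good value is crude, but it keeps one corrupted read from yanking the autopilot's pitch/throttle commands around.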
 
#4
Great video, thanks for sharing. I'm new to the hobby (relatively); I would love to know about the HUD setup you've got there (airspeed sensor, OSD, baro). The artificial horizon looks great, especially as you've got the right units for airspeed (kts) and altitude (feet) sitting on either side. Being a pilot, I would love to have these parameters displayed on my RC models too. Thanks in advance for any assistance.
 

clolsonus

Active member
#5
Hi, thanks for your kind words! I work at the University of Minnesota UAV Lab (AEM department). The HUD system is something we cooked up in-house. It is mainly a collection of python+opencv scripts that read the flight data log (from our autopilot/flight controller) and automatically time-sync it with our nose cam video. Then it processes the video frame by frame, drawing the HUD over the top and saving the final result.
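To make the overlay idea concrete: once the clock offset between the log and the video is known, each frame's timestamp just gets mapped to an interpolated log record. This is a simplified sketch, not our actual code, and the names are made up:

```python
import bisect

def interpolate(times, values, t):
    """Linearly interpolate a logged channel (sorted times) at time t."""
    i = bisect.bisect_left(times, t)
    if i <= 0:
        return values[0]
    if i >= len(times):
        return values[-1]
    t0, t1 = times[i - 1], times[i]
    frac = (t - t0) / (t1 - t0)
    return values[i - 1] + frac * (values[i] - values[i - 1])

def frame_attitude(frame_idx, fps, sync_offset, log_t, log_roll):
    """Roll angle to draw on a given video frame."""
    t = frame_idx / fps + sync_offset
    return interpolate(log_t, log_roll, t)
```

In the actual pipeline the sync offset is found automatically; here it's just a parameter, and the same lookup happens for every channel the HUD draws.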

If this still sounds interesting and you are a bit adventurous, and don't mind reading and thinking a bit about this (and maybe asking a few questions), then you can find pretty much all the code at our github repository here (all MIT open-source licensed):

https://github.com/UASLab/ImageAnalysis

As a tiny bonus, this project includes a full open-source drone aerial mapping/stitching system. Sshhh, this is actually the main point of this project ... the HUD overlay stuff was just something I added on the side because I thought it would be cool and useful.

For our lab purposes, the HUD has actually been a really great visualization tool. We have developed our own flight controller hardware and software stack (also MIT open-source licensed) and an EKF (attitude/location estimator), and our HUD helps us review the flights and understand intuitively how our systems are working. For some things it makes it easy to spot issues that might not jump out in raw data plots. Plus it's kind of a cool end result in itself.

If you want to see more, I have posted tons of flight footage from different missions and tests at my other youtube channel here: https://www.youtube.com/channel/UC_AWqZyWYvnA-h9MMcbNYyA

Hopefully that helps explain a bit more about the whole project. I'm happy to answer more detailed questions if anyone out there tries to dive in and finds anything confusing (or something breaks or doesn't seem to work right.)

Thanks!

Curt.
 

LitterBug

Troll Spammer
#6
Is it correct that this is not a real-time HUD, but rather a post-processing of video and flight controller logs?

Cheers!
LitterBug
 
#7
Hi Curt,

I really appreciate you taking the time to put this post and reply together. Great work you have there, and I'm really glad that we have great minds such as yours sharing this hobby and contributing. I wish you the best of luck with your projects. It would be really nice if OSD manufacturers would produce something similar to what you have as a HUD, keeping the simple ergonomic design but also capturing all that is vital to be displayed for the pilot in the correct units.

Best Regards,
Ali
 

clolsonus

Active member
#9
Quick follow up comment. If someone has a pixhawk type flight controller, it should be able to collect all the data needed to drive the hud display. Probably 2 years ago now I added basic pixhawk log file support (but haven't tested it since then.) If someone was interested in adding this to their own flight videos, it should be possible. It's a great way to see what your autopilot is doing and how well it is actually performing.
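Roughly speaking, the HUD only needs a handful of channels per time step, whatever flight controller they come from. A hypothetical record layout (not any particular controller's log schema) would be:

```python
from dataclasses import dataclass

@dataclass
class HudRecord:
    """Hypothetical per-frame record of the channels a HUD overlay needs."""
    time_s: float       # log timestamp
    roll_deg: float     # attitude, for the horizon ladder
    pitch_deg: float
    yaw_deg: float
    lat_deg: float      # position, for track and waypoints
    lon_deg: float
    alt_m: float        # altitude tape
    airspeed_kt: float  # speed tape
    throttle: float     # 0..1, to see what the autopilot is commanding
```

If a pixhawk log can be parsed into records like that, the rest of the rendering doesn't care where the data came from.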
 

evranch

Well-known member
#10
I missed this post the first time around; I've been away from the forum and everything to do with flight for a while. Too much farming. My first impression of course was "Wow, is that a custom HD video link, and how much processor power is on board to render that?!"

Of course it's a little disappointing as I hoped you had come up with an open source HD video link to share, but still, it looks great. I love the idea of being able to see the track and waypoints on the video and compare the actual flight with the expected performance. I might have to try it out if it can be fed Ardupilot data or if I can convert Ardupilot logs into a compatible format.

Do you have pics of this plane? Always interested in seeing mapping rigs. I still haven't built a new plane but I'm kind of messing around and looking at different airframes.
 

LitterBug

Troll Spammer
#11
FYI, ArduPilot is now available for more common F4/F7 flight controllers at a much lower cost and reduced weight. I'll be taking a closer look at this project. I need to look back at a recent flight I did where I was testing a set of pre-defined waypoints and see if it was logging. I don't think I had an SD card in, so will probably need to make a few more flights first.

Cheers!
LitterBug
 

clolsonus

Active member
#12
@evranch, I am still flying a full size x-uav talon (with the wing extensions) on a 4 cell battery.

@LitterBug, I haven't fired up any version of ardupilot in the last 10 years, but if you can share a data file (maybe in some sort of text or parse-able format that I can figure out) I could look into adding ardupilot log support. It would be really interesting (I think) to be able to "see" the performance of the EKF, how well the horizon tracks reality, and what the control surfaces are doing exactly to track the target roll/pitch angles. It can be a nice tool for gain tuning for anyone who wants to go a step or two beyond default gains.

I probably posted this here before, but here is an autonomous dusk landing with one of our talons equipped with night LED lights (and apologies that the person taking the video wasn't aware of the true and extreme evils of portrait video):

 

LitterBug

Troll Spammer
#13

I was assuming when you mentioned compatibility with pixhawk, you were talking about the ardupilot firmware that usually runs on them. A pixhawk is the physical hardware. Maybe I'm confused, but what else would be running on a pixhawk? The latest versions of ardupilot have autotune capabilities built in. Would be interesting to do a before and after comparison to see the difference their autotuning methodology makes.

LOL, and thank you for understanding the evil of portrait video!

Cheers!
LitterBug

EDIT: Looks like PX4 and ArduPilot are the main firmwares used with the Pixhawk
 

clolsonus

Active member
#14
EDIT: Looks like PX4 and ArduPilot are the main firmwares used with the Pixhawk
Yeah, everyone I know around here seems to prefer to run px4 on their pixhawks. I built my own hardware/software ecosystem because that seemed more interesting and none of the pixhawk/px4/ardupilot things existed when I started working on autonomous flight. :)
 

evranch

Well-known member
#15
@clolsonus Seeing your videos on the drone catcher thread reminded me how cool your rendered HUD is. I took a look at your GitHub repo since I'm flying with ArduPilot again. I'd love to make some flight videos with the HUD, and I see you also have a script to pull frames from a video and geotag them - which would be great to be able to do away with the camera trigger mechanism and just use a 4k action cam facing downwards.

Wanted to see what would be involved in writing a parser for ArduPilot logs to feed to these scripts. The logs are in plaintext CSV so it should be fairly simple.

I'm not really a Python guy - OK, I'm not a Python guy at all; I'm a C/C++ guy who usually uses Perl for parsing this sort of thing.

So in both scripts, IMU and GPS data are loaded into an array by flight_loader.load() which presumably is the parser for your data format. I was going to take a look, learn Python for the day again and try to make a version that parses ArduPilot logs.

However, I can't find flight_loader!
It appears to be included from aurauas_flightdata, which... doesn't come up in a search of the repo? (Or a search of GitHub, though that never works very well.) Is this part of a different package?
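Something like this is the first cut I have in mind, assuming the dumped-to-text format of name-keyed comma-separated lines; the column meanings vary by firmware version, so treat the details as guesses:

```python
def parse_log(lines, wanted=("ATT", "GPS")):
    """Group numeric fields by message type, from lines like
    'ATT, 12345, 0.1, -2.3, 178.9'. Column layout is assumed, not verified."""
    records = {name: [] for name in wanted}
    for line in lines:
        parts = [p.strip() for p in line.split(",")]
        if parts and parts[0] in records:
            try:
                records[parts[0]].append([float(p) for p in parts[1:]])
            except ValueError:
                continue  # skip malformed or non-numeric lines
    return records
```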
 

clolsonus

Active member
#16
Shoot, I owe you an email/message on that. I have too many candles burning at too many ends right now. I'm with you, I used to swear by perl and C/C++, but then I ran into a job that I wanted to script which really needed an object-oriented design (one of the few things I've done that actually benefited from a class architecture, but I digress ...). I ended up taking a serious look at python because the object-oriented extensions for perl were pretty hideous syntactically ... almost as bad as modern C++ ... oh, but I digress. Anyway, python worked really well for that project and it has grown on me ever since. Here is the flight data loader project link on github:

https://github.com/AuraUAS/aura-flightdata

It should be a simple "sudo ./setup.py install" to install (presumes python 3.)

I have created a few maps with an action cam pointed straight down, so that definitely is a doable thing. I wasn't super happy with the resolution and jpg compression losses from movie still shots, but it was ok, it worked, and it might be a good balance of factors for some tasks.
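For anyone planning a similar downward-facing setup, the frame spacing falls out of simple flat-ground geometry. The numbers and the pinhole model here are illustrative, not from our pipeline:

```python
import math

def ground_footprint_m(alt_m, fov_deg):
    """Ground distance covered along one image axis (flat-ground pinhole model)."""
    return 2.0 * alt_m * math.tan(math.radians(fov_deg) / 2.0)

def seconds_between_frames(alt_m, fov_deg, speed_mps, overlap=0.7):
    """Grab a still whenever the plane has advanced (1 - overlap) footprints."""
    return ground_footprint_m(alt_m, fov_deg) * (1.0 - overlap) / speed_mps
```

At 100 m altitude with a 60-degree lens at 15 m/s, the footprint works out to about 115 m, so a still every ~2.3 s keeps roughly 70% forward overlap.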

Don't feel bad about bugging me on any of this ... I sometimes need a polite (or not so polite) reminder. :)
 

evranch

Well-known member
#17
Perfect, looks like you have modules for each different type of log, so I can write one for ArduPilot and slot it in. Looks like it won't be too far removed from the PX4 parser so I can use that as a starting point.

As I said, I really don't know Python. Since aura-flightdata is the library I would be working on, how can I call my working copy from the other scripts rather than reinstalling it globally every time I revise it? Will Python look for libraries in a local path? Edit: Figured this out.
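For anyone else who hits the same question: one way (not necessarily the one I settled on) is to push the working copy onto the import path ahead of any installed version, e.g.:

```python
# Prefer a local working copy of a library over an installed one by
# putting its directory first on sys.path. The path is just an example.
import os
import sys

local = os.path.join(os.getcwd(), "aura-flightdata")
sys.path.insert(0, local)
# import aurauas_flightdata  # would now resolve to the local copy first
```

Setting the PYTHONPATH environment variable to the working copy's directory does the same thing without editing the scripts.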

Python honestly looks like a good language and it's certainly popular. I use Perl because it's disorganized like my brain is... when I need some glue to patch two things together, and want to do it in a hurry - Perl. The only language that I know that will let you execute arbitrary code by dereferencing a pointer to a string containing it. I'm still not sure if that's a good or a bad thing, but at least Perl almost always does what I want it to with its magically sloppy typing.

It appears you have a similar opinion to me on OOP. I claim to work in C++, but usually I just write C code that calls C++ libraries. You're far more likely to see me malloc() up a struct than write a whole class with constructors and... bleh. I mostly stick to embedded these days.

I figured that resolution would suffer because even HD video is much lower resolution than stills. 1080p is a meager 2MP. However, 4k changes this a lot with 8MP, and I've had decent results for my purposes with a 5MP camera. As mentioned before, my mapping use case is very different from yours, focusing on DEM rather than high-res imagery.

Glad to know I'm not the only one with too much on the go. Now that I have the code to hack at, it could be done anywhere from tomorrow morning to 6 months from now, but at least I can get started.
 

evranch

Well-known member
#18
I've got a parser working and bringing the initial values in. ArduPilot is Mavlink-based, so I was able to use the Mavlink message parsing library, which should allow the use of any Mavlink-compatible autopilot.

However, I'm coming up against the need for movie_log, which is a csv file related to the video itself? Is this generated by a script, or is it created in flight?