Hi, thanks for your kind words! I work at the University of Minnesota UAV Lab (AEM department). The HUD system is something we cooked up in-house. It is mainly a collection of Python+OpenCV scripts that read the flight data log (from our autopilot/flight controller) and automatically time-sync it with our nose cam video. Then they process the video frame-by-frame, drawing the HUD over the top and saving the final result.
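To give a rough feel for the idea, here is a minimal sketch of that pipeline. This is NOT the actual UASLab/ImageAnalysis code: the CSV log format, the column names, the file names, and the fixed time offset are all assumptions for illustration (the real scripts work with our autopilot's own log format and estimate the video/log time sync automatically), and the "HUD" here is just a text readout rather than a full horizon ladder.

```python
# Minimal sketch, not the real UASLab/ImageAnalysis pipeline.
# Assumed: a CSV log with time/pitch/roll/yaw columns, a known
# constant time offset, and hypothetical file names.
import csv
import cv2

LOG_FILE = "flight.csv"        # hypothetical flight data log
VIDEO_IN = "nosecam.mp4"       # hypothetical nose cam video
VIDEO_OUT = "nosecam_hud.mp4"
TIME_OFFSET = 0.0              # log time minus video time (seconds); assumed known

# Load the flight log into a list of (time, pitch, roll, yaw) records.
records = []
with open(LOG_FILE) as f:
    for row in csv.DictReader(f):
        records.append((float(row["time"]), float(row["pitch"]),
                        float(row["roll"]), float(row["yaw"])))

def lookup(t):
    """Return the log record nearest to time t (crude nearest-neighbor sync)."""
    return min(records, key=lambda r: abs(r[0] - t))

cap = cv2.VideoCapture(VIDEO_IN)
fps = cap.get(cv2.CAP_PROP_FPS)
w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter(VIDEO_OUT, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))

frame_num = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Map this frame's timestamp into log time, fetch the matching record.
    t = frame_num / fps + TIME_OFFSET
    _, pitch, roll, yaw = lookup(t)
    # Draw a trivial text overlay; a real HUD would draw a horizon line,
    # heading tape, airspeed/altitude readouts, etc.
    cv2.putText(frame, "pitch %.1f  roll %.1f  yaw %.1f" % (pitch, roll, yaw),
                (20, 40), cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
    out.write(frame)
    frame_num += 1

cap.release()
out.release()
```

The nearest-neighbor lookup above is the crudest possible sync; interpolating between log records (and solving for the offset instead of hard-coding it) is where the real work is.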
If this still sounds interesting, and you are a bit adventurous and don't mind reading and thinking a bit about this (and maybe asking a few questions), then you can find pretty much all the code at our GitHub repository here (all MIT open-source licensed):
https://github.com/UASLab/ImageAnalysis
As a tiny bonus, this project includes a full open-source drone aerial mapping/stitching system. Sshhh, that is actually the main point of this project ... the HUD overlay was just something I added on the side because I thought it would be cool and useful.
For our lab purposes, the HUD has actually been a really great visualization tool. We have developed our own flight controller hardware and software stack (also MIT open-source licensed) and our own EKF (attitude/location estimator), and the HUD helps us review flights and build an intuitive sense of how our systems are working. It can make it easy to spot issues that might not jump out in raw data plots. Plus it's kind of a cool end result in itself.
If you want to see more, I have posted tons of flight footage from different missions and tests on my other YouTube channel here:
https://www.youtube.com/channel/UC_AWqZyWYvnA-h9MMcbNYyA
Hopefully that helps explain a bit more about the whole project. I'm happy to answer more detailed questions if anyone out there tries to dive in and finds anything confusing (or if something breaks or doesn't seem to work right).
Thanks!
Curt.