Panorama and Map Stitching Software for AP Use

Snarls

Gravity Tester
Mentor
Hey all, I'm ready to do some panoramas and mapping with my Arducopter equipped TBS Discovery. I am wondering if anyone has experience with the processing side of things. What is a good free program I can use to stitch together all the photos off my camera into a nice panorama? I'd rather not have to purchase anything as I am only doing this for fun and experimentation at the moment.

With regards to mapping, I know of DroneDeploy from an FT episode a little while back. The iOS app won't work with Arducopter, but I believe I can still use the computer program to stitch together maps. Are there any other mapping programs I should know about?

Thanks everyone! PS: Check out this website if you want to see why I'm getting into this.
 

makattack

Winter is coming
Moderator
Mentor
I had quickly looked into this to help a friend map their property using my APM-equipped blunt nose Versa wing, but soon realized I just don't have the hardware or time to do a good job. Nevertheless, I found this OpenStreetMap contributor's wiki entry on the subject useful, along with the Hugin tutorials:

http://wiki.openstreetmap.org/wiki/User:Balrog/Aerial_Imagery/Rectification
http://hugin.sourceforge.net/tutorials/index.shtml

Ultimately, my friend said he would simply like a couple of identical passes over his property in different seasons to see the change. With APM this was easy: I simply mapped a mission covering his property, uploaded it to the flight controller, and let it go. If you save the mission on a computer, you can re-upload it at different times of the year and run the same mission. It's remarkably accurate as long as you have similar wind conditions and good GPS reception (i.e., you aren't flying when sunspot activity is high). So I fly the mission I set up for him at a suitable time in early spring, summer, fall, and winter, with my Mobius B-lens camera recording 1080p/30fps video with a locked exposure setting for each flight.
 

Snarls

Gravity Tester
Mentor
Thanks Rip. I found a link to that website in a thread on RCG and had to save it.

Thanks Mak, I will take a look at those links. I have experience planning and flying autonomous flights with the Disco, so flying the mission should be pretty easy. One thing I am concerned about is linking my photos to GPS coordinates. Mission Planner has a way to take the flight data and insert it into the photos I take with my Yi, but I am not sure how to sync the two as precisely as possible, the way filmmakers clap to sync the audio and video on set.
 

Snarls

Gravity Tester
Mentor
I've been experimenting with mapping the house where I'm staying on vacation. For the first try I flew a mission at 30 m altitude. My Xiaomi Yi was in photo mode with a timer taking a picture every 0.5 seconds; I meant to shoot every 2 seconds, but I must have messed up the settings at some point. After the mission and some extra side shots I ended up with 1000 photos! I used the georeferencing tool in Mission Planner to geotag them all. I had to guess the time offset between the GPS logs and the camera clock to link coordinates to each photo, but I think it is accurate enough.
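For anyone curious, the matching step behind that geotagging can be sketched roughly like this: shift each camera timestamp by the guessed offset, then pick the nearest GPS fix in time. This is only an illustration of the idea (the function name, data layout, and offset handling are my own assumptions, not how Mission Planner actually implements it):

```python
from bisect import bisect_left

def geotag(photo_times, gps_fixes, offset_s):
    """Match each photo timestamp to the nearest GPS fix in time.

    photo_times: camera timestamps in seconds.
    gps_fixes:   list of (timestamp_s, lat, lon), sorted by timestamp.
    offset_s:    camera clock minus GPS clock (the guessed offset).
    Returns one (lat, lon) per photo.
    """
    gps_times = [t for t, _, _ in gps_fixes]
    tagged = []
    for t in photo_times:
        corrected = t - offset_s          # photo time on the GPS clock
        i = bisect_left(gps_times, corrected)
        # pick the closer of the two neighbouring fixes
        if i == 0:
            best = gps_fixes[0]
        elif i == len(gps_times):
            best = gps_fixes[-1]
        else:
            before, after = gps_fixes[i - 1], gps_fixes[i]
            best = before if corrected - before[0] <= after[0] - corrected else after
        tagged.append((best[1], best[2]))
    return tagged
```

With a 0.5 s photo interval, even a one-second error in the offset shifts every tag by a couple of fixes, which is why something like the filmmakers' clap (a sharp, visible event logged on both clocks) helps.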

Using DroneDeploy, my first attempt failed to process. It was quite discouraging after waiting an entire day for the processing to run. I tried cutting the total number of photos down to 200 and it still failed.

I decided to redo the entire mission at two altitudes, one at 50 m and another at 20 m, to see whether altitude was the issue. I also set the correct photo intervals on the camera (5 s for 50 m and 2 s for 20 m) and applied lens rectification in the camera. With this I got 37 photos at 50 m and 194 at 20 m. DroneDeploy was able to process both sets before the end of the day, and happily they both worked out! I think the lens rectification made more of a difference than altitude; the wide angle likely made stitching very difficult. Altitude probably plays more into the quality of the resulting orthomosaic: a higher altitude is faster to fly, yields fewer pictures, and processes faster, but has less detail when you zoom in; a lower altitude is slower to fly, yields more pictures, and processes slower, but has more detail in the final image.
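As a rough sanity check on interval settings, the shot spacing follows from the camera's ground footprint at a given altitude. This is only a back-of-the-envelope sketch assuming a camera pointed straight down; the field-of-view and ground-speed numbers below are illustrative assumptions, not the Yi's actual specs:

```python
import math

def photo_interval(altitude_m, fov_deg, speed_ms, overlap=0.7):
    """Seconds between shots for a given forward overlap.

    Along-track ground footprint of one frame = 2 * altitude * tan(fov / 2).
    Each new frame may advance by (1 - overlap) of that footprint.
    """
    footprint = 2 * altitude_m * math.tan(math.radians(fov_deg) / 2)
    return footprint * (1 - overlap) / speed_ms
```

At 50 m with a ~90 degree lens and 5 m/s ground speed this gives an interval of about 6 s, which is in the same ballpark as the 5 s setting above.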

Here are some samples of the results:

BeachHouseLowAltS.jpg
Orthomosaic

BeachHouse3D.jpg
3D Model

I want to experiment with larger areas to create larger orthomosaics, as well as with more isolated structures to make 3D models out of. The house I mapped is surrounded by a large amount of vegetation which I imagine makes stitching and 3D modelling more difficult. The vegetation also makes it harder to get side shots to improve the 3D model.

Next I want to try creating a 360 panorama from pictures taken above this location.
 

clolsonus

Well-known member
There are a lot of good mapping and stitching tools out there. Most cost [a lot of] money, but many of them have free demos if you are just doing some experimentation. DroneDeploy and Pix4D are among the easiest to use ... just upload your images to the cloud and wait for an email saying your map is done.

For the 100% free route, take a look at OpenDroneMap. It will produce your classic 3D model + orthophoto stitch. The quality may not be quite as good as the premier commercial programs, but it's definitely worth a look.

There is another 100% free option that I have developed myself via research projects at the U of MN aerospace dept. It is done in a 'guts out' style for education and research purposes, written entirely in Python to be as open as possible. Our use case is surveying large areas and then looking for "needles in the haystack" (specifically, invasive plants). The traditional orthophotos and 3D models that the premier tools create don't work very well for my purposes, so we rolled our own software. Our tools create a map mosaic that is orthorectified yet preserves all the original images. (It's a bit hard to describe in a few sentences.) The final result is that all the original pictures are presented in a big pile: arranged, lined up, stretched, and positioned in their exact correct locations. Then we can search through the map (with no quality or information loss) much like browsing a Google map, but you are looking at all the raw original images. As I'm viewing the final map, I can call forward any image that covers the current center of view, so I can see all the original angles of something of interest. This lets me spot vines crawling up tree trunks, for example. I know, I'm pretty far out there in my own la-la land. :) If anyone wants to take a peek, here is the link to the software (scroll down through the readme for some screenshots and more details):

https://github.com/UASLab/ImageAnalysis
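The "call forward any image covering the current center of view" idea can be sketched as a simple point-in-footprint query. To be clear, this is just a conceptual illustration, not code from the ImageAnalysis repository, and treating each image's ground footprint as an axis-aligned box is a simplifying assumption:

```python
def images_covering(point, footprints):
    """Return the ids of images whose ground footprint contains a point.

    point:      (lon, lat) of the current center of view.
    footprints: dict mapping image id -> (min_lon, min_lat, max_lon, max_lat).
    """
    lon, lat = point
    return [img_id
            for img_id, (x0, y0, x1, y1) in footprints.items()
            if x0 <= lon <= x1 and y0 <= lat <= y1]
```

With 70% overlap, most points on the map fall inside several footprints, which is what gives you multiple original viewing angles of the same spot.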

Here are a couple videos that show the map results in action:




As you've started to see, the quality of your survey images goes a long way towards determining the quality of your final maps. The rule of thumb is about 70% overlap (end-to-end and side-to-side) so the stitching algorithms can do their thing reliably. Low-altitude surveys over crops/trees can be especially difficult to stitch. That is one of the things I addressed in my project with some level of success: I was able to stitch image sets that completely blew up DroneDeploy or Pix4D.
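For mission planning, that 70% side overlap translates directly into survey-line spacing via the swath width. A minimal sketch, again assuming a simple nadir-pointing camera (the 90 degree FOV in the example is an assumption, not any particular camera's spec):

```python
import math

def line_spacing(altitude_m, hfov_deg, side_overlap=0.7):
    """Distance between parallel survey lines for a given side overlap.

    Swath width of one frame = 2 * altitude * tan(hfov / 2);
    adjacent lines may be (1 - side_overlap) of a swath apart.
    """
    swath = 2 * altitude_m * math.tan(math.radians(hfov_deg) / 2)
    return swath * (1 - side_overlap)
```

At 50 m with a 90 degree horizontal FOV the swath is about 100 m, so flying the lines roughly 30 m apart gives 70% side overlap.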

Aerial mapping is a big field with a lot to learn on many levels. (I know a few things, but only a few ...) If anyone is interested in trying out our UMN mapping tools, feel free to ask questions. Some things are documented pretty well, some things not so much.

Thanks,

Curt.