
3D FPV (Stereoscopic 3D)

Hi FliteTest.

I would like to suggest you guys take a look at the stereoscopic 3D FPV arena, and, if you can figure your way around it, even do a show (or at least an article) on it. It seems still to be a niche thing, but hey - you're into all things flight-related, right :cool: Maybe, to start with, just find someone who's already into it who can give you a demo.

Also, with a negligible budget at the moment, I'm not able to get into even normal FPV, so this is by way of asking for a surrogate exploration of the field ;)

Personal fascination:

For some years now, I've been into stereoscopic 3D (i.e. individual images for each of the viewer's eyes, as you get at the movies), and, not to sound clichéd, it's changed how I perceive scenes around me. You'll hear people talk about looking at something "as an artist" or "as a photographer", and it's much the same when you start to realise what something would look like as a 3D image or video (I'll drop the "stereoscopic" from here on - just don't get confused with "3D flying").

When compared to any 2D image (like an FPV feed, for example), it's so much more than being given extra resolution or framerate; the extra dimension is just that - an extra dimension - and something you really need to experience to understand the value of. And there's a distinction to be drawn between 3D entertainment (movies, games, magazines) and 3D imagery of the real world around you - the immersion you get from capturing the real world is brilliant, and all the more so because it so well represents the stories your own eyes (the best camera system you'll ever experience) relate to you.

A couple of years ago, as a university project, I built a ground-based rescue reconnaissance robot, into which I tried to incorporate 3D FPV. Since that was not my main goal and I was on a very tight budget ($1000 for the entire project - compared to $5000-$25000 for commercial options), I had to design the circuitry myself, and never got that system running completely, but the moments when I did get it running were amazing: a pair of very low quality, 320x240 analogue camera feeds gave me enough information to instantly note every detail of my cluttered workbench. I'd almost go so far as to say you can see detail beyond the resolution of the camera.

Less-Personal Reasons:

As for less personal reasons, I see the following benefits in 3D FPV:
  • While not directly comparable, if you're just going for the experience rather than movie-making, I reckon an entry-level 3D FPV system would outdo an expensive HD one on an immersion-for-buck scale.
  • You can see those dastardly, thin, low-hanging branches before you hit them.
  • You really get the feeling of the space in which you're flying, which is a large part of the appeal of flying in the first place.
  • General "Wow" factor...

What's already out there:

3D FPV is as old as FPV itself, just like 3D itself is almost as old as cameras (it was already well established, and was used extensively, in the world wars - especially WWII). So it's nothing new, and has been well explored.

Do a little Googling and you'll see quads and planes with home-made and off-the-shelf set-ups, tutorials, stores, crazy research projects (some people have hooked up goggle head-tracking with stereo cameras), people who mount cameras with conventional spacing and people who mount cameras on their wing-tips for super-exaggerated depth, and people who swear by 3D FPV and would struggle to revert to 2D.

But somehow, despite all this and the now common-place nature of 3D in homes, 3D FPV remains a barely-noted niche of the RC world. That's where I hope you guys at FliteTest could come in... Awareness and all that :)

Getting Started:

There are a number of options out there, at all costs and skill levels. In general the options are:
Video type:
  1. The most expensive option: two independent sets of Camera-Tx-Rx-screen (even if those come in one package - see below), or
  2. The generally cheaper option: combined transmissions - either field-sequential or side-by-side stereo, which merge the 2 camera feeds into 1 transmission feed so you only need 1 Tx-Rx kit. This then requires goggles or a screen capable of viewing field-sequential or side-by-side 3D, respectively, so you always need to check compatibility!
Camera separation:
  1. Fixed separation (i.e. fixed 3D effect strength)
  2. Variable separation (i.e. customisable 3D effect strength)
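To illustrate the combined-transmission idea, here's a toy sketch of field-sequential interleaving (my own illustration of the principle, not any particular product's implementation): the two feeds are merged into one stream of alternating frames, and the receiver routes alternate frames to each eye.

```python
def field_sequential(left_frames, right_frames):
    """Interleave two camera feeds into one stream:
    L0, R0, L1, R1, ... The goggles route alternate frames
    to each eye, so each eye sees half the frame rate."""
    for left, right in zip(left_frames, right_frames):
        yield left
        yield right

# Two toy "feeds" of frame labels:
merged = list(field_sequential(["L0", "L1"], ["R0", "R1"]))
# merged == ["L0", "R0", "L1", "R1"]
```

Side-by-side works on the same one-feed principle, but packs the two views into each frame's width instead of alternating whole frames.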
To be honest, I don't know why most off-the-shelf options come with fixed spacing. Our human eyes are spaced about 63mm apart because, for the most part, the distances we need to judge are those between ourselves and those things we can reach with our arms or by a short walk across a room. In an aircraft, most of the time the space around you is so vast that everything appears quite flat, which is why most people using 3D imagery from planes (including the aforementioned photographers and cinematographers in WWII) use much wider spacing - even putting cameras on wing-tips for some high-altitude RC planes.

Specific options:
(at least the ones I've noted)

If you want super-simple, there's the Skyzone dual-antenna goggle kit, which you can even get with a plug-and-play 3D camera and transmitter. That's actually 2 complete FPV systems in one set of packaging, but will set you back $500-$600, and the cameras are at a small fixed spacing, which may be okay for indoor flying, I suppose. The goggles alone, however, can be used with any pair of FPV set-ups if you want to rig your craft with your own electronics, or to keep the transmitters well separated or something.

A potentially lower-cost option is a field-sequential option like the Blackbird 1 or Blackbird 2. Those can just be plugged into an existing Tx-Rx kit. My personal favourite there is the Blackbird 1, since the cameras are separable. The Blackbird site (fpv3dcam.com), by the way, has pretty good explanations of how to set up a 3D FPV rig with compatible kit.

As far as you guys at FliteTest are concerned, since you seem to have FPV rigs lying around just about everywhere, you might be best off just getting goggles that can take 2 independent streams. Or even using a 3D TV if you can get digital signals (you may need a PC and software like Stereoscopic Multiplexer for that, though).

Lastly, since, as hinted at above, there are many 3D video formats, please, if you do go ahead and shoot some 3D video at some point and upload to YouTube, please-please-please use the actual 3D video upload options that YouTube provides. Otherwise you only provide one way to view the video, which is about as much of a faux pas as you can get in 3D.

Thanks for your wonderful work, though, guys, and I hope at some point you'll get to experience (at least) the wonderful world of 3D FPV.



Staff member
I've read that 3D only really works in close proximity. As soon as you gain height with an aircraft or keep distance from obstacles, the effect is lost.

I'd be interested to be convinced of the opposite though :)
That reputation is a result of those fixed-separation camera kits I had a little rant about in the original post. If the cameras are set 63mm apart (the separation of most human eyes), then yes, it'll only work in "close proximity", just like you struggle to distinguish small differences in depth after about 20m. The loss of depth detail is worsened by the resolution of the cameras.

Note, though, that those fixed cameras should be fine for most low-altitude multirotor flights.

This "issue" is easily fixed, however, by increasing the separation of the cameras; if you're getting a good 3D effect within 10m of the aircraft, but you need to operate with distances of 50m when you're actually flying, then simply increase the camera separation by 50/10 = 5 times. You can think of this as "shrinking the world" or "increasing the effect" by that much. As a rule of thumb, a separation of 1 ft gives a depth magnification of 5 times natural.
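As a quick sketch of that ratio (using this post's numbers; the 63 mm baseline is the usual average eye spacing, and the function name is just my own):

```python
EYE_SPACING_MM = 63.0  # average human interpupillary distance

def camera_separation_mm(good_effect_m, target_distance_m,
                         base_mm=EYE_SPACING_MM):
    """Scale the camera baseline so the depth effect at
    target_distance_m matches what base_mm gives at good_effect_m."""
    return base_mm * (target_distance_m / good_effect_m)

# Good 3D within 10 m, but flying at 50 m -> 5x the baseline:
sep = camera_separation_mm(10, 50)   # 315 mm, i.e. roughly 1 ft
```

Which also lines up with the rule of thumb: ~1 ft of separation is about 5 times the natural 63 mm baseline.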

The catch, though, is that you've got to align the cameras fairly carefully so that they're parallel, so they should ideally be mounted directly to a single beam or the main wing spar or something.

I just found a pretty down-to-earth Funjet retrofit project discussion, here, that seems to illustrate the process quite well. Otherwise there's a pretty comprehensive forum thread on rcgroups.com about 3D FPV. Both those threads are quite old as far as FPV goes, though, so there should be better tools available today.
+1 on this topic for a show...

Also, there are design sacrifices to mounting a single camera facing forward; mounting cameras binocularly around a single tractor prop is a great idea. Why face the engine backwards or have two motors when we can have two cameras?!


i want to put two cameras on the quad i am designing/building, where the arrows point

  ↑    ↑
  +    +
   \  /
   /  \
  +    +
(f33r my l33t ASCII art skillz)

ok so let's say they are 720p cameras
each one outputs 1280x720 frames
one could take each camera, encode it and transmit it
then combine the two frames at the rx
but that would lead into synchronization issues sooner or later
and you do *not* want that in stereoscopic vision.

a better way would be to have both cameras on a raspberry pi
snap one frame from each camera
crop them to center
and combine them to create one 1280x720 frame
which is then compressed and tx to the headset
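the crop-and-combine step, sketched with numpy just to illustrate the idea (not actual pi camera code):

```python
import numpy as np

def side_by_side(left, right):
    """Crop two 1280x720 frames to their central 640 columns
    and pack them into one 1280x720 side-by-side frame.
    Horizontal resolution per eye is halved, but the Tx-Rx
    chain only has to carry one ordinary 720p stream."""
    h, w, _ = left.shape
    x0 = w // 4                        # start of the central half
    crop_l = left[:, x0:x0 + w // 2]   # central 640 columns
    crop_r = right[:, x0:x0 + w // 2]
    return np.hstack([crop_l, crop_r])

# toy frames: left eye all-black, right eye all-grey
left = np.zeros((720, 1280, 3), dtype=np.uint8)
right = np.ones((720, 1280, 3), dtype=np.uint8)
frame = side_by_side(left, right)      # shape (720, 1280, 3)
```

the headset then just splits the frame down the middle again, one half per eye.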

*even* better and lower power consumption but much, much more difficult programmatically:
two cameras on an fpga
fpga does the cropping and compression and sends the data stream to the tx

the raspberry is a good candidate, because video compression on it would be quite fast
compressing a 720p stream would suck something between 1 W and 2 W so it's not a disaster

it would take some effort to minimize latency and there is good possibility that it can do 1080p fast enough, too


Hostage Taker of Quads
Staff member
LQ . . . why digital?

Yes, yes, resolution, but the overall lag is going to eat your lunch.

Other than the loss in bandwidth for other pilots to play, going analog costs little: the weight and power of an onboard R-Pi to post-process won't be any less than that of a second VTX, and if the other side is analog as well, the inherently lower latency of the analog link will enforce near-perfect synchronization.

Sure, you'll need fancy goggles to read in independent analog streams, and two VRXs as well, so a ground station is a bit more complicated than usual, but you're talking about fabbing your own FPGA board -- this isn't a "usual" project.


Active member
For what it's worth, I sometimes run RealFlight in stereo with red/blue glasses, and the effect is very good. If you can make this work in the real world, great!

a properly designed chain won't have more than 2 frames total lag, which at 60 Hz is 33 ms. We've been flying with 20 ms lag on ESCs for years, i highly doubt it will matter.

I won't be fabricating my own fpga board, that would be nuts. There are already lightweight boards out there.

the pi zero is 9 g

what loss in bandwidth? digital requires less bandwidth for the same image quality compared to analog (set aside that two analog streams will require *twice* the bandwidth) and don't say that a couple MHz more or less will matter, when we're talking about the 2.4 GHz band :|


Hostage Taker of Quads
Staff member
2 signals => extra bandwidth not available to other pilots. That's a hit on the analog side not digital, but in either case, agreed, there's plenty of band to go around, so it's liveable.

33ms lag would be across your R-Pi alone. We've been flying with a 20-30ms lag on the analog video link total. Having the ESCs add another 30ms to their response time just makes this worse. That's +33ms, so you're looking at 50-60 ms video lag *IF* your digital link is as fast as the analog transmitter is (speed of light says the transmit part should be close). This also doesn't include decode and render on the DVrx side -- the higher the compression, the higher the res, the slower those steps become. Add on the 30ms ESCs and the minimum response time with spidey senses (no time to think) is approaching a tenth of a second, where a good analog system can be refined to half of that. It all adds up.
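To put rough numbers on that (purely illustrative, using the figures quoted in this thread; the decode/render figure is my own guess, not a measurement):

```python
# Rough latency budget (ms), using the figures quoted above.
# These are illustrative thread numbers, not measurements.
digital_chain = {
    "capture + encode on R-Pi (2 frames @ 60 Hz)": 33,
    "radio link (assume comparable to analog Tx)": 25,   # ~20-30 ms
    "decode + render in goggles (guess; grows w/ res)": 15,
    "ESC response": 30,
}
analog_chain = {
    "camera + analog link + display (total)": 25,        # ~20-30 ms
    "ESC response": 30,
}

digital_total = sum(digital_chain.values())   # ~103 ms
analog_total = sum(analog_chain.values())     # ~55 ms
```

Which is the "approaching a tenth of a second" vs "half of that" comparison in round numbers.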

I'm not saying it can't be done -- clearly it can, but the lag will eat your lunch. By all means, see for yourself.

As for weight, fair enough. The latest VTXs are weighing in at around 8g without the antenna, and clearly I've not been keeping up with the Pi's. I'm still running with B models. If it was good enough for grandpappy . . . actually, I'm just cheap ;)

. . . and 2.4? Are you sure you want to run on that? Planning on running a different band for your control link?


Wake up! Time to fly!
That was one of the issues that I saw in many of the Skyzone 3D reviews. The lag was the killer. BMSWEB did a good review not too long after it came out and said he flew it for a while as it looked spectacular, but range was limited and the lag was a no-go for faster or precision flight.

Basically that system uses two TXs and the diversity of the receivers, which cuts down on lag, as each "eye" has its own feed in parallel rather than both being pre-processed and stacked in the same carrier to be re-separated and fed to each view screen in the goggles.

The idea with the Raspberry Pi does the same thing, only the combining and recombining happen in different places in the chain. I would suspect the lag would be the same or worse depending on memory management for processing on the Pi.
must have been some really shitty engineering to introduce that much lag in an analog transmission

don't expect proper optimization in products like these.

the pi zero can be made a bit lighter by removing the HDMI and uUSB ports.

not working on this project yet but from what i know so far it's viable


Wake up! Time to fly!
Rock on mate if you can pull it off and be lag free or at the very least more comparable to analog mono signals you will be a rich man.