Projector Sync for CAVEs

Lorne Covington

Nov 23, 2015, 2:29:33 PM
to vr-g...@googlegroups.com

I had always assumed that to do multi-projector CAVEs with active shutter glasses you needed to use expensive solutions like nVidia Quadro cards.

But I decided to test this assumption, and hooked up a couple of Optoma GT1080 projectors* to a PC with two GTX 970s in SLI.

With both projectors hooked to the same card, the 3D works perfectly.  Rock solid sync, absolutely no difference or flutter side-by-side.

With the projectors hooked to different cards, the two projectors were still frame locked but were out of L/R sync.  Changing the SLI mode did not change this.  But it seemed consistent, so it could probably be worked around.

Since modern nVidia cards directly support up to four displays (more with splitters), it seems like a three-wall CAVE display can be made with one $350 graphics card, three $700 projectors, and some cheap(ish) DLP sync glasses.  Toss in a Kinect2 for tracking, and you're just over $3000 for the package.
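
(Back-of-envelope: $350 + 3 × $700 = $2,450 for the card and projectors; the remaining ~$550 of that estimate covers the Kinect2 and a few pairs of DLP sync glasses.)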

Anyone try this?  Or something similar with consumer ATI cards?

Ciao!

- Lorne

* - I love these projectors; cheap, full-HD, 3D, short-throw (0.5:1), 2800 lumens, very low latency, and a great picture.  Did I mention cheap?
-- 

http://noirflux.com

Jan Ciger

unread,
Nov 23, 2015, 4:47:09 PM11/23/15
to vr-g...@googlegroups.com

Hi Lorne,

On 23/11/15 20:29, Lorne Covington wrote:
>
> I had always assumed that to do multi-projector CAVEs with active
> shutter glasses you needed to use expensive solutions like nVidia
> Quadro cards.
>
...
> Anyone try this? Or something similar with consumer ATI cards?
>

It is a bit more complicated than this. If the projectors are
connected to the same card, then the outputs are in sync. However -
this is an undocumented behaviour and may break with the next release
of the Nvidia drivers or hardware.

If you are using multiple cards (that is not the same thing as SLI!
SLI means you are using 2 cards to render *ONE* image), the cards will
drift apart over time, so I wouldn't rely on them being "frame locked"
(in quotes, because if the stereo is wrong, they are by definition not
frame locked - they are showing different images, despite their
refresh being in sync).

Re number of projectors per card - I believe that a GeForce (not
Quadro) allows only 2 outputs to be used simultaneously, regardless of
how many connectors are present on the card. If you connect a 3rd
screen, it will not work, and the driver will tell you to disable one
of the other two. I haven't tested this with the very recent 9xx
GeForces because I don't have one, but the older 5/6/7xx series don't
allow more than 2 simultaneous outputs - I always had to turn off one
of my monitors when I wanted to use a Rift, despite having 4 output
connectors on the card.

The various splitters and such are not going to help you much, because
a splitter only duplicates the existing image on two (or more)
screens. I don't think there are (cheap) splitters that would take a
side-by-side image and divide it into two outputs. There are some USB
graphics cards meant to provide additional outputs for laptops, but
those are not usable for 3D.

Re stereo sync for glasses - I have done quite a bit of research and
work on this recently, trying to build a synchronization circuit for
common shutter glasses that would work without requiring a Quadro. In
short - it doesn't work unless you have a Quadro, because Nvidia seems
to drop frames in their driver as a way to manage vsync.

My hypothesis is that instead of actually blocking the rendering queue
when the application is too fast and has rendered more than 2-3 frames
ahead in the command queue, they silently drop a frame. That is not a
problem for an average game, but it breaks stereo horribly - you
suddenly have left/right/left/left/right because the "right" frame in
the middle was dropped.
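
To make the failure mode concrete, here is a toy simulation of that
parity flip (my own illustration, not the actual driver logic): the
app renders strictly alternating eyes, the glasses toggle on every
frame actually displayed, and one silently dropped frame inverts the
stereo for everything that follows.

    #include <cstdio>

    int main() {
        const int dropped = 4;      // pretend the driver drops frame 4
        int shown = 0;              // frames actually scanned out
        for (int f = 0; f < 10; ++f) {
            if (f == dropped) continue;                    // silent drop
            char rendered = (f % 2 == 0) ? 'L' : 'R';      // what the app drew
            char glasses  = (shown % 2 == 0) ? 'L' : 'R';  // eye the glasses open
            ++shown;
            std::printf("frame %d: drew %c, glasses on %c%s\n", f, rendered,
                        glasses, rendered != glasses ? "  <-- inverted!" : "");
        }
        return 0;
    }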

This leads to a nasty "flash" every few frames when the stereo flips
around (where a frame was dropped) and recovers. Quadros don't do this
(even when stereo isn't enabled), and neither do GeForces when the
3DVision mode is enabled, so it is clearly driver related. Of course,
if you are using a Quadro or 3DVision, you don't need my home-grown
circuit ...

Paradoxically, the dumbest possible shutter glasses (aka this type:
http://img.tomshardware.com/us/2001/12/18/win/3d-glasses-pny.jpg )
without any PLL or complicated driver (mine are just 2 LCD panels
directly wired to a 3.5mm jack) are best for this, because if I
synchronize them from a mark on the screen, they switch immediately,
compensating for the dropped frame and making the glitch somewhat
tolerable.
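
For the curious, driving the glasses from a screen mark amounts to
something like this - a minimal sketch, assuming GLFW + OpenGL, with
the scene rendering left as a placeholder (the photodiode and the
3.5mm-jack electronics live outside the program):

    #include <GLFW/glfw3.h>

    // Paint a small corner patch: white = left eye, black = right eye.
    // A photodiode taped over this corner drives the wired glasses, so
    // they re-sync on every displayed frame and a dropped frame causes
    // a single glitch instead of inverting the stereo permanently.
    static void draw_sync_mark(bool left_eye) {
        glEnable(GL_SCISSOR_TEST);
        glScissor(0, 0, 64, 64);              // bottom-left 64x64 patch
        float v = left_eye ? 1.0f : 0.0f;
        glClearColor(v, v, v, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        glDisable(GL_SCISSOR_TEST);
    }

    int main() {
        if (!glfwInit()) return 1;
        GLFWwindow *win = glfwCreateWindow(1920, 1080, "stereo", NULL, NULL);
        if (!win) return 1;
        glfwMakeContextCurrent(win);
        glfwSwapInterval(1);                  // one buffer swap per vblank

        bool left = true;
        while (!glfwWindowShouldClose(win)) {
            // ... render the scene for the current eye here ...
            draw_sync_mark(left);
            glfwSwapBuffers(win);
            glfwPollEvents();
            left = !left;                     // alternate eyes every frame
        }
        glfwTerminate();
        return 0;
    }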

The more modern glasses (IR/radio synced) will invariably use a PLL as
a "flywheel" to bridge periods when no sync is being received, and
they cannot deal with rapid changes of the sync signal - leading to
the horrid flashes of reversed stereo. Some glasses will not even
synchronize to a signal derived directly from the screen refresh,
because there is so much frame jitter there (again, only on GeForces,
not on Quadros).

BTW, things like calling glFinish(), setting up a swap barrier, etc.
don't make any difference with this.

Re DLP-Link - DLP-Link is unusable for driving a multiple-projector
setup. First, there is no relationship between the video signal from
the computer and what the projector sends to the glasses as
synchronization using those quick white flashes. The projector simply
assumes that e.g. the first frame arriving is for the left eye and
then keeps swapping left/right from there. That works as long as the
signal is stable with little jitter (otherwise you will need to use
the "invert 3D" button often) and there is only a single projector.

However, it breaks horribly when you have two or more projectors.
The projectors will not be synchronized (their color wheels don't spin
at exactly the same speed, nor are they in phase) and you will get
broken stereo whenever you look at the boundary between two
projectors. Also, multiple projectors sending the DLP-Link sync signal
at the same time will lead to horribly confused glasses ...

To conclude, I am afraid that there isn't really a way to build a
sensible CAVE with active stereo on the cheap right now. If you want
stereo and don't want to spend on Quadros, synchronization cards and
high-speed projectors, go with passive stereo instead. That you can
drive with a single GeForce per wall (2 projectors each) and it will
work. Unfortunately, it is more complicated mechanically and more
expensive to run. The mechanics could be avoided by using stereo
projectors that use a single lens for both left and right images
(basically 2 projectors sharing a lens), but all that I have seen were
horribly expensive.

All the best,

Jan

Loïc Fricoteaux

Nov 23, 2015, 5:23:41 PM
to vr-g...@googlegroups.com
Hi,

Just a clarification regarding Jan's remark:

"I believe that a GeForce (not Quadro) allows only 2 outputs to be used simultaneously, regardless of how many connectors are present on the card."
That was true for older Nvidia graphics cards, but since the Kepler architecture a GeForce/Quadro card can handle 4 outputs (GeForce 600/700/900 series and Quadro K and M series).

Also, some DLP 3D projectors have a sync connector like the one on a Quadro card. This should be more reliable than the DLP-Link signal for synchronizing the glasses when using multiple projectors, but it does not solve all the problems mentioned by Jan.

Best regards,
Loïc



Jan Ciger

Nov 24, 2015, 4:40:59 AM
to vr-g...@googlegroups.com
Hi Loïc,

On Mon, Nov 23, 2015 at 11:23 PM, Loïc Fricoteaux
<loic.fr...@gmail.com> wrote:
> Hi,
>
> Just a precision about Jan's remark :
> "I believe that a GeForce (not Quadro) allows only 2 outputs to be used
> simultaneously, regardless of how many connectors are present on the card."
> That was true for the old Nvidia graphic cards but since the Kepler
> architecture a Geforce/Quadro card can handle 4 outputs (GeForce 600/700/900
> series and Quadro K and M series).

Ah, good to know - I didn't know about this. I guess I should upgrade
my machine :-p
However, I doubt that all 4 would be in sync, even if you use
identical projectors - especially if you are using DisplayPort, which
is a packet-based protocol, not a "simple" stream of serial data like
HDMI/DVI. Invariably the receiver (monitor, projector, etc.) will
buffer the data, and then all bets are off whether or not two of them
will be in sync. That can, of course, happen with HDMI too, it is just
a bit less likely due to the tighter coupling between the devices
(HDMI/DVI send video as a digitized version of the classic
hsync/vsync/lines-of-pixels setup known from TVs; DisplayPort is much
more advanced).

> Also some DLP 3D projectors have a sync connector like on a Quadro card.
> This must be more reliable than the DLP-Link signal for the synchronisation
> of the glasses when using multiple projectors, but it does not resolve all
> the problems mentioned by Jan.

Yeah, but those are usually not the cheap DLP-Link variety that you
buy for a few hundred euro. Moreover, that only lets you tell the
projector which frame is which, not sync multiple projectors
together. As you say - it doesn't solve the problem of the color
wheels not being in sync (DLP projectors are pretty much all
color-wheel based), so you would get incorrect stereo and lovely color
artifacts due to the glasses switching at the wrong time (with the
incorrect segment of the wheel being displayed).

Regards,

Jan

Lorne Covington

Nov 24, 2015, 10:43:11 AM
to vr-g...@googlegroups.com

On 11/23/2015 4:47 PM, Jan Ciger wrote:
> It is a bit more complicated than this. If the projectors are
> connected to the same card, then the outputs are in sync. However -
> this is an undocumented behaviour and may break with the next release
> of the Nvidia drivers or hardware.

I would be surprised by this, as 3D (in particular as related to VR) is
more and more a standard feature. Multi-monitor gaming, particularly
with nVidia Surround, also pushes this to work.


> If you are using multiple cards (that is not the same thing as SLI!
> SLI means you are using 2 cards to render *ONE* image),

No, SLI also has an "activate all displays" mode which means the
resources are spread across all independent displays on all cards, as
well as a "Surround" mode where all resources are directed to all
displays (treated as one surface) on one of the SLI'd cards.

nVidia has specifically mentioned this in connection with 3D; I just
don't think it's a released feature yet:
https://developer.nvidia.com/designworks-vr

It is to include fun things like being able to use separate GPUs for
each eye, etc.

And time to upgrade your cards! (:}) (The GTX 970 is at an incredible
price/performance point:
http://www.videocardbenchmark.net/gpu.php?gpu=GeForce+GTX+970&id=)



> the cards will
> drift apart over time, so I wouldn't rely on them being "frame locked"
> (in quotes, because if the stereo is wrong, they are by definition not
> frame locked - they are showing different images, despite their
> refresh being in sync).

By "frame locked" I meant they were both displaying the correct image
frame, but the wrong "eye", as both "eyes" are generated at the same
time (frame). They were locked in that there was no wavering or
polarity switching - the stereo was solid, just R/L swapped on one
projector.


> The various splitters and such are not going to help you much, because
> a splitter only duplicates the existing image on two (or more)
> screens. I don't think there are (cheap) splitters that would take a
> side-by-side image and divide it into two outputs. There are some USB
> graphics cards meant to provide additional outputs for laptops, but
> those are not usable for 3D.

You need to check out modern "splitters" too! DisplayPort tech allows
multiple true displays out of one DP. So you can have things like this:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814998089

Which gives you up to three 1080p displays off of one DP port. I
haven't tried this with multiple 3D projectors but will.



> Re DLP-Link - DLP-Link is unusable for driving a multiple-projector
> setup.

Actually that is what I tested. The two projector setup on the one 970
was rock-solid over multiple runs and different outputs on the card.
Maybe I just got lucky, but things were definitely not drifting. This
may be a projector model-by-model "feature" though, as the GT1080s are
designed for gaming so have a low-latency pipeline with little or no
video processing.

I will do more testing on this, using more outputs (I have six GT1080s I
got for a full-dome setup), splitters, and other projectors.

Again, since multi-monitor 3D gaming (off of one card) is already here,
the main issue is if projectors will allow it.

Thanks Jan!

- Lorne

--

http://noirflux.com


Lorne Covington

Nov 24, 2015, 10:45:05 AM
to vr-g...@googlegroups.com

Oh, and this is available now:

https://developer.nvidia.com/virtual-reality-development

- Lorne


--

http://noirflux.com

Jan Ciger

Nov 24, 2015, 12:12:03 PM
to vr-g...@googlegroups.com
On Tue, Nov 24, 2015 at 4:43 PM, Lorne Covington <noir...@gmail.com> wrote:
>
> I would be surprised by this, as 3D (in particular as related to VR) is more
> and more a standard feature. Multi-monitor gaming, particularly with nVidia
> Surround, also pushes this to work.

Multi-monitor gaming != guaranteeing frame sync across the monitors.
The two issues are completely unrelated and orthogonal. Apart from it
being physically impossible in some cases (monitors with different
refresh rates) there is no real use case needing it apart from stereo
support. And we know how Nvidia deals with stereo in their drivers -
basically making every possible effort to push you to buy an expensive
Quadro.

> No, SLI also has an "activate all displays" mode which means the resources
> are spread across all independent displays on all cards, as well as a
> "Surround" mode where all resources are directed to all displays (treated as
> one surface) on one of the SLI'd cards.

OK, so that's new. I made the comment because people often claim to
be running "SLI" to describe any situation where two (or more) cards
are installed in the same machine - which has nothing to do with SLI.
The last time I dealt with SLI, enabling it meant that the output from
the secondary card was deactivated and the card was used to render the
image for the primary card.

>
> nVidia has specifically mentioned this in connection with 3D; I just don't
> think it's a released feature yet:
> https://developer.nvidia.com/designworks-vr
>
> It is to include fun things like being able to use separate GPUs for each
> eye, etc.

I think they are pushing it as a selling point for their expensive SLI
setups, which aren't exactly selling too well. Most current games work
just fine with a single card and apart from the bragging rights,
having two brings little real benefit for most people, only a ton of
problems with cooling, power supply and noise.

>
> And time to upgrade your cards! (:}) (The GTX 970 is at an incredible
> price/performance point:
> http://www.videocardbenchmark.net/gpu.php?gpu=GeForce+GTX+970&id=)

Yes, that one was on my radar.

> By "frame locked" I meant they were both displaying the correct image frame,
> but the wrong "eye", as both "eyes" are generated at the same time (frame).
> They were locked in that there was no wavering or polarity switching - the
> stereo was solid, just R/L swapped on one projector.

Oh OK, I see what you mean. That is not what is commonly understood
as a "frame". A frame is just that - an image. And that could be for
the left or the right eye (so the two frames are actually different,
not one frame containing two images). That's why I was confused about
how the screens could be displaying the same "frame" and at the same
time showing different stereo ...

What that means is that one output was offset by 1 frame (image), but
their output frequencies were in sync.


>
> You need to check out modern "splitters" too! DisplayPort tech allows
> multiple true displays out of one DP. So you can have things like this:
>
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814998089
>
> Which gives you up to three 1080p displays off of one DP port. I haven't
> tried this with multiple 3D projectors but will.

And what is the display latency of such a thing? Considering that
DisplayPort (like HDMI/DVI) is a serial protocol, I would expect the
framerate per display to drop by a factor of at least 3 if you are
driving 3 screens. If it does actually split a larger resolution into
2-3 smaller-resolution screens (e.g. 2K into 2 FullHD), then I would
expect it to do quite a bit of buffering as well.


>
> Actually that is what I tested. The two projector setup on the one 970 was
> rock-solid over multiple runs and different outputs on the card. Maybe I
> just got lucky, but things were definitely not drifting. This may be a
> projector model-by-model "feature" though, as the GT1080s are designed for
> gaming so have a low-latency pipeline with little or no video processing.
>

Well, according to this:
http://www.projectorcentral.com/optoma_gt1080_projector_review.htm,
the projector has about 33ms of input latency. That is not bad, but at
60 Hz (~16.7ms per frame) it is still 2 frames.

I am a bit surprised that the stereo didn't drift between the
projectors (on a single projector there is no reason for it to drift,
as the glasses are synced to the video signal). I doubt that Optoma
synchronizes the color wheel's rotation speed to the video signal too.
Didn't you get color casts when driving the glasses from one projector
and looking at the image of the other? That's the most common issue -
it shows up even before the stereo flips, because the glasses' timing
only has to be a little bit off for it to appear.

> Again, since multi-monitor 3D gaming (off of one card) is already here, the
> main issue is if projectors will allow it.

The problem is that projectors are usually not sold as "gaming" gear.
Most gamers don't have projectors, and even those that have one don't
have more than one. Also, active stereo was pretty much killed off by
Nvidia locking it up for their Quadros and requiring certified
hardware and the proprietary 3DVision on the desktop. The only people
using active stereo are very much "pros" - and those are being pushed
to the high-end, uber-expensive gear.

So there is not really much motivation among manufacturers to deal
with these issues, especially after the whole 3D TV fiasco. In fact,
most manufacturers don't even ship 3D glasses with their TVs and
projectors anymore, despite supporting 3D, because almost nobody is
asking for them ...

Adding proper synchronization to permit DLP color-wheel projectors to
work in a multi-projector active stereo setup would probably require
yet another proprietary synchronization hack, a bit of electronics to
actually synchronize the wheel, and extra cost for each projector -
all for an extremely niche feature.
Moreover, DLP-Link is patented and licensed by Texas Instruments, so
someone like Optoma producing a "hacked" synchronizable version is
fairly unlikely for legal reasons as well (plus the entire patent
minefield on these technologies from companies like Projection Design,
Barco, etc., who build the high-end projectors having these features).

So I am not very optimistic here.

J.

Lorne Covington

Nov 24, 2015, 12:56:44 PM
to vr-g...@googlegroups.com


On 11/24/2015 12:12 PM, Jan Ciger wrote:
> I think they are pushing it as a selling point for their expensive SLI
> setups, which aren't exactly selling too well. Most current games work
> just fine with a single card and apart from the bragging rights,
> having two brings little real benefit for most people, only a ton of
> problems with cooling, power supply and noise.

Just a quick note: two 970s in SLI actually give you about 20% better
performance than the 980 for less money. And modern motherboards and
power supplies are all designed with Crossfire/SLI in mind. I commonly
use dual-card setups in SLI, and have even used three-card SLI setups
to drive nine full-HD monitors. It's really not a problem, and in fact
a huge win for multi-monitor cases.

Jan Ciger

Nov 24, 2015, 1:32:07 PM
to vr-g...@googlegroups.com
On Tue, Nov 24, 2015 at 6:56 PM, Lorne Covington <noir...@gmail.com> wrote:
> Just a quick note: two 970s in SLI actually give you about 20% better
> performance than the 980 for less money.

Interesting. I will check the benchmarks once it becomes relevant -
probably after New Year.

> And modern motherboards and power
> supplies are all designed with Crossfire/SLI in mind. I commonly use
> dual-card setups in SLI, and have even used three-card SLI setups to drive
> nine full-HD monitors. It's really not a problem, and in fact a huge win
> for multi-monitor cases.

Yeah, but yours isn't exactly a very common use case ...

The motherboards are not a problem, but I have seen a lot of people
use cheap power supplies that simply can't deliver the current these
cards need, despite what is written on the sticker of the PSU. Even
brand-name PSUs often have problems delivering their rated current.
That causes stability issues and crashes and, in the worst case, the
PSU will fail catastrophically, taking out most of the PC with it.

I wasn't talking about people who know what they are doing, but most
folks don't have a clue about the required power output and thermal
design when building (or buying a pre-built) PC. And SLI tends to make
these issues twice as complex to handle, because there are multiple
cards to deal with and more chances for things to go wrong.

J.

Andrew M.

Nov 24, 2015, 2:58:33 PM
to vr-g...@googlegroups.com
Frame time variance (FTV) is a known problem for SLI/Crossfire setups.  If you want stereo 3D with a multi-monitor setup, there is no way to achieve that without a Quadro or other top-tier single card, unless you want to tolerate the sync and stutter issues.  It is impossible to perfectly sync two video cards, and as long as they aren't perfectly in sync, you will have FTV-related problems.  The occasional micro-stutter (the product of FTV) in a monitor-based experience isn't that bad or jarring, but with 3D multiscreen displays or HMDs it's pretty unpleasant.

Try running off just a single 970 or better with the three walls and NO 3D at all first, and see if the lack of 3D is really noticeable.  There are a lot of effects that people stop noticing once their attention is focused on the actual sim.
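
(If you want to quantify the stutter on your own rig, timing successive
buffer swaps is enough to see FTV - a rough sketch, with swap_buffers()
standing in for whatever swap call your render loop actually makes:)

    #include <chrono>
    #include <cmath>
    #include <cstdio>
    #include <vector>

    void swap_buffers();  // hypothetical placeholder for the real swap call

    // Record the time between consecutive swaps; with vsync on, each
    // sample should be ~16.7ms at 60 Hz, and the spread is the FTV.
    void measure_ftv(int frames) {
        using clock = std::chrono::steady_clock;
        std::vector<double> dt;                  // frame times in ms
        auto last = clock::now();
        for (int i = 0; i < frames; ++i) {
            swap_buffers();                      // blocks until vsync
            auto now = clock::now();
            dt.push_back(
                std::chrono::duration<double, std::milli>(now - last).count());
            last = now;
        }
        double mean = 0.0, var = 0.0;
        for (double t : dt) mean += t;
        mean /= dt.size();
        for (double t : dt) var += (t - mean) * (t - mean);
        var /= dt.size();
        std::printf("mean %.2f ms, stddev %.2f ms\n", mean, std::sqrt(var));
    }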

Lorne Covington

Nov 24, 2015, 5:13:26 PM
to vr-g...@googlegroups.com

Did more tests.  Three GT1080s on one 970 are rock solid.  No flicker, color wobble, nada.  All three showing the correct eye phase perfectly.  Stopped and started many times, left running for ten minutes, no change.

Then moved all three to that EVGA DisplayPort splitter - same results.  Three projectors running synced 3D off one DisplayPort output of the 970.

Then hooked up two BenQ SH915s to the EVGA splitter.  Also perfect 3D sync.  I have not tried mixing and matching projector models, which I really do not expect to work.  (But I'll probably try anyway...)

Both of those projectors are DLP 3D, so I must conclude that at least these two models do indeed lock to the input signal, as evidenced by their DLP sync signals to the glasses staying locked and causing no confusion for the glasses.

What is probably important to note here is that all of my 3D stuff is in vvvv, where I do all my own separate eye-image generation and then do multi-output, side-by-side, or over-under image presentation to the projectors.  I am not using the nVidia 3D facility.
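
(In plain OpenGL terms the side-by-side presentation boils down to
something like this minimal sketch - draw_scene_for_eye() is a
hypothetical stand-in for the per-eye render:)

    #include <GLFW/glfw3.h>

    // Hypothetical per-eye scene render, assumed to exist elsewhere.
    void draw_scene_for_eye(bool left_eye);

    // Side-by-side presentation: each eye gets one half of a single
    // output surface; the projector's SBS 3D mode splits and sequences
    // the two halves itself.
    void render_side_by_side(int w, int h) {
        glViewport(0, 0, w / 2, h);        // left half  -> left eye
        draw_scene_for_eye(true);
        glViewport(w / 2, 0, w / 2, h);    // right half -> right eye
        draw_scene_for_eye(false);
    }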

What is interesting is that even when just showing the desktop, all the projectors are synced.  When in side-by-side mode, I can close one eye and see the same half of the desktop in all the projectors, then look through the other eye and see the other half.
So it looks to me like this really does work on one card, at least with those two models of projector.  And given that these cards are capable of running 4K (4x 1080p) games at high frame rates, it is not surprising that they can handle multiple full-HD outputs.  Now if I could only afford 4K projectors... (:^{)

Andrew, concerning SLI and VR, check out this link:

    https://developer.nvidia.com/virtual-reality-development

nVidia appears very, very serious about VR; this is all for GTX cards, not Quadro.  I haven't tried this package yet but will when time allows - just found out we get to keep the Vive!

Ever onward!

- Lorne

Lorne Covington

Nov 24, 2015, 5:20:11 PM
to vr-g...@googlegroups.com

Even better than I thought: at 5760x1080, SLI 970s are 31% faster than
a 980:

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_970_SLI/20.html

- Lorne
--

http://noirflux.com

Jan Ciger

Nov 24, 2015, 6:00:19 PM
to vr-g...@googlegroups.com

On 24/11/15 23:13, Lorne Covington wrote:
>
> Did more tests. Three GT1080s on one 970 are rock solid. No
> flicker, color wobble, nada. All three showing the correct eye
> phase perfectly. Stopped and started many times, left running for
> ten minutes, no change.
>
...

Interesting, I would have expected more trouble with that sort of
setup. They must be synchronizing the DLPs and the color-wheel speed
to the incoming signal, otherwise they wouldn't be able to keep it in
sync for that long. Pity that this behaviour is not documented
officially, so one doesn't know whether it is something one can rely
on or a total fluke that will bite you in the backside when you
actually try to deploy it ...

Re mixing & matching projectors - if you are lucky and they use the
same color-wheel setup and the same lighting patterns in their 3D
mode, it might work. But that's pretty much a matter of luck.


> Andrew, concerning SLI and VR, check out this link:
>
> https://developer.nvidia.com/virtual-reality-development
>
> nVidia appears very, very serious about VR; this is all for GTX
> cards, not Quadro. I haven't tried this package yet but will when
> time allows

You will have to sign an NDA to be able to obtain it. Anyhow, I doubt
it is going to be of much use for "mere mortal" developers (aka not a
huge game studio), because they are clearly focusing on making things
like different-viewpoint generation faster by letting the app use
multiple GPUs (until now it was the driver deciding automatically
which frame/part of the frame gets rendered by which GPU).

That is mainly stuff for squeezing every last bit of framerate out of
AAA games running on Oculus Rift-like HMDs, not for making things like
running multiple projectors in stereo easier.

J.


