Liquid galaxy on 360° panoramic projection system


Dragetz

Jul 1, 2012, 9:21:05 AM
to liquid-galaxy
Hi everyone!

I would like to present the PanoVRama system. It is a tiled
multi-projector panoramic visualization system developed as a student
project at the Faculty of Electrical Engineering and Computing,
University of Zagreb, Croatia. The latest addition to the system is the
ability to run the FlightGear flight simulator and Liquid Galaxy.

Here is a promo video of the system in operation: http://www.youtube.com/watch?v=xYW5CA_qMZs

In the current configuration the system uses 6 projectors to cover a
360° field of view. Two projectors are connected to each computer, so
we have a distributed system of 3 computers that render the image and
one master computer that controls everything. The canvas is
cylindrical, about 3 meters in diameter.
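For reference, that configuration works out to roughly 60° of horizontal coverage per projector; the quick arithmetic below ignores the blending overlap a real setup would need, which the post doesn't quantify:

```python
import math

NUM_PROJECTORS = 6   # from the post: 6 projectors cover 360 degrees
DIAMETER_M = 3.0     # cylindrical canvas, about 3 m in diameter

# Each projector covers an equal slice of the cylinder (overlap ignored).
fov_per_projector_deg = 360.0 / NUM_PROJECTORS   # 60.0

# Arc length of canvas each projector illuminates.
circumference_m = math.pi * DIAMETER_M
arc_per_projector_m = circumference_m / NUM_PROJECTORS
```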

The resulting image is seamless, so the experience is completely
immersive; there are no LCD monitor edges.

Since Liquid Galaxy doesn't have any support for image
transformations, we grab the image the same way compositing window
managers do: Google Earth's rendering is redirected to an off-screen
buffer and then used as a texture in our system. By using OpenGL and
shaders on the graphics card we can transform the image in real time.
This is all done on Linux.
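As an illustration, the per-pixel mapping such a shader evaluates might look like the following, under the simplifying assumption that each projector sits at the cylinder's axis and covers an exact 60° slice; a real installation would presumably use a calibrated per-pixel warp instead:

```python
import math

HALF_FOV = math.radians(30.0)  # assumed: each projector covers a 60-degree slice

def warp(x_ndc):
    """Map a pixel's horizontal NDC coordinate (-1..1) on the flat
    projector image to a 0..1 panorama texture coordinate for this
    slice, assuming the projector is at the cylinder's center.

    On a flat image plane, screen position is proportional to
    tan(angle); on the cylinder, texture position is proportional to
    the angle itself -- the warp undoes that mismatch."""
    phi = math.atan(x_ndc * math.tan(HALF_FOV))
    return 0.5 + phi / (2 * HALF_FOV)
```

The same expression, moved into a fragment shader, is cheap enough to run per pixel every frame, which matches the "minimal performance loss" described later in the thread.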

There were the usual problems, like running multiple instances of
Google Earth on one computer, entering Street View, etc.

We even implemented Nintendo Wii controller support for Google Earth.
The source code of the whole project can be found on SourceForge:
http://sourceforge.net/projects/panovrama2/

Cheers,
Dragutin Hrenek



Jason Holt

Jul 1, 2012, 12:16:12 PM
to liquid...@googlegroups.com

Beautiful! That's amazing that you get such good performance from Earth with frame grabbing. Well done!
I hear that new NVIDIA cards have support for warps like this at the driver level, so you don't have to grab and post-process the data yourself (but I haven't tried it myself).

Sent from my android

Andrew Leahy

Jul 1, 2012, 1:10:55 PM
to liquid...@googlegroups.com
Lovely work guys.

How are you synchronising from the master controller and rotating the cameras for the Google Earth instances? Your blog says you aren't using anything from Liquid Galaxy, so you aren't using ViewSync?

"This project has nothing in common with Google's Liquid Galaxy project, but it has served as an inspiration." http://sourceforge.net/news/?group_id=511085

Are the projectors LCD or DLP? I can see a little bit of black-level loss at the overlap.

If you have ideas for auto-enabling Street View mode in the client, I'd like that too.
Programmatically dragging Peg-man onto a known Street View road with xdotool or Sikuli is the hacky way I would do it!

Cheers, Andrew
eResearch / UWS

Zoomin Hao

Jul 1, 2012, 9:26:21 PM
to liquid...@googlegroups.com

Hi all,
It's so cool!
Andrew's question is the same for me: how do you make the textures join together seamlessly? (Is there a complex equation for this?)

Best regards,
Zoomin
CAS SIAT


Dragetz

Jul 2, 2012, 6:19:25 AM
to liquid-galaxy
@Jason

It's not true frame grabbing, because everything stays in graphics
card memory. The XComposite extension is used to redirect the window
to an off-screen Pixmap, and the GLX extension
EXT_texture_from_pixmap is used to load that texture into our custom
player. The performance loss is minimal, especially because our
transformations are simple; it's almost the same code that Compiz
uses.
A bigger problem is that two instances of Google Earth are running
simultaneously.

I heard that about NVIDIA too, but didn't look much into it. Our
approach is probably more flexible, and the performance difference
shouldn't be major because the graphics hardware has to do the same
math in both cases. For example, in the first iteration of the system
we built a 21-meter screen and implemented water wave effects in a
pixel shader that spanned the whole screen, fully synchronized.
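The usual trick for synchronizing an effect like that is to evaluate the wave as a function of the *global* screen coordinate and a shared clock, so each render node computes identical values at the tile seams. A toy sketch of the idea, with made-up tile width and wave parameters:

```python
import math

TILE_WIDTH_M = 7.0  # hypothetical: 3 render nodes, 7 m of a 21 m screen each

def wave_offset(global_x_m, t):
    """Wave displacement as a function of *global* screen position and a
    shared clock -- the key to cross-node synchronization."""
    return 0.1 * math.sin(2 * math.pi * (global_x_m / 4.0 - t))

def node_pixel(node_index, local_x_m, t):
    """Each node converts its local coordinate to a global one before
    evaluating the wave, so adjacent tiles agree exactly at the seam."""
    return wave_offset(node_index * TILE_WIDTH_M + local_x_m, t)
```

Because both nodes evaluate the same function of the same global coordinate, the right edge of tile 0 and the left edge of tile 1 produce identical displacements at any shared time.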

This is a photo of the installation at the Days of Croatian Film
festival:
http://www.fer.unizg.hr/images/50011246/DHF5.jpg

@Andrew

That blog post was a miscommunication between me and my colleague who
writes the blog posts. The system is based on Liquid Galaxy. Maybe we
could have done it some other way, but definitely not with the
per-pixel accuracy we have now.

Entering Street View is done using xdotool. We use Xlib directly for
creating windows and the GLX rendering context. The plan was to try
sending mouse events directly to the Google Earth instance's event
loop, but xdotool worked fine, so we didn't bother.
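For the record, a Peg-man drag like the one Andrew described can be scripted with xdotool's mouse subcommands. The window id and coordinates below are hypothetical placeholders that would come from calibration; this sketch only builds the command strings rather than running them:

```python
def pegman_drag_commands(window_id, pegman_xy, road_xy):
    """Build the xdotool invocations for dragging Peg-man from its
    perch onto a known Street View road. Purely illustrative: the
    caller would execute these against a live X session."""
    px, py = pegman_xy
    rx, ry = road_xy
    return [
        f"xdotool windowactivate {window_id}",  # focus the Earth window
        f"xdotool mousemove {px} {py}",         # hover over Peg-man
        "xdotool mousedown 1",                  # press and hold button 1
        f"xdotool mousemove {rx} {ry}",         # drag to the road
        "xdotool mouseup 1",                    # release to enter Street View
    ]
```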

The projectors are lower-middle-class LCD units, and the computers are
also low cost. Brightness calibration is the worst-performing part of
the system.
The projectors' brightness response is non-linear in every aspect,
from spatial to intensity, and even projectors from the same series
have completely different characteristics. We tried to measure the
brightness response with a camera, but its sensor has its own
non-linearity, so we tried HDR imaging to get the full dynamic
response. Not to mention that the human eye is totally subjective. The
current method is part measurement, part heuristic, but for a student
project I think the results are pretty good.
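One common way to hide the seams at projector overlaps is a blend ramp applied in linear light. A minimal sketch, assuming an idealized gamma-2.2 response; the post makes clear that real projectors deviate from any such simple model:

```python
def blend_weight(x, overlap_start, overlap_end):
    """Linear ramp across the overlap region; the two overlapping
    projectors use complementary weights that sum to 1."""
    if x <= overlap_start:
        return 1.0
    if x >= overlap_end:
        return 0.0
    return (overlap_end - x) / (overlap_end - overlap_start)

def apply_blend(value, weight, gamma=2.2):
    """Attenuate a gamma-encoded pixel so that the *linear* light from
    two overlapping projectors adds up to the original intensity.
    gamma=2.2 is an assumed idealized response."""
    linear = value ** gamma
    return (linear * weight) ** (1.0 / gamma)
```

The detour through the gamma curve matters because overlapping projectors add light, not encoded pixel values, so the ramp has to be applied in the linear domain to avoid the kind of black-level and brightness mismatch mentioned earlier in the thread.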