Liquid Galaxy on our university display wall


paulh

Oct 5, 2010, 6:16:28 PM
to liquid-galaxy, dmn...@wand.net.nz
So, I spent a couple hours yesterday playing with LG on the display
wall we have at my university:

http://picasaweb.google.com/bieh.nz/DropBox#5524399869666963794

http://picasaweb.google.com/bieh.nz/DropBox#5524399805196769570

It works pretty well - it's proven quite a crowd pleaser - everyone
who walks past ends up asking to see their house / home town / various
landmarks :)

Specs of the wall, in case anyone is interested: 20 22" monitors (5
columns of 4), running at 1680x1050 each. Total resolution of
8400x4200 - about 35 megapixels. 5 individual computers running
Debian, one on each column.

One question though - currently my method of controlling it is with
query.txt, as there's no keyboard or mouse plugged into our wall. This
kinda works, but I'd really like to be able to, say, zoom in more, or
look up to the horizon and rotate around 360 degrees, or something.

Is there some kind of method I can use to remotely control the master
GE instance that gives me more options? Googling only turns up the JS
and COM APIs, neither of which looks useful.

Thanks!

Jason Holt

Oct 5, 2010, 6:25:24 PM
to liquid...@googlegroups.com, dmn...@wand.net.nz
That rocks!  What are your hardware specs?

Liquid Galaxy adds a few experimental ways to control Earth: query.txt, multi-axis controllers (in Linux), and the UDP datagrams used by the view sync code.

In the latter two cases, it should be possible to synthesize spacenav events or viewsync datagrams, although we haven't played with it much.

paulh

Oct 5, 2010, 7:12:50 PM
to liquid-galaxy
The display machines are a few years old now, but they were pretty
top-of-the-line when we got them - Core 2 Duo @ 2.66GHz, 4GB RAM, dual
NVIDIA 8800GT video cards. Still more than enough to run GE, even at
those resolutions - I haven't checked the FPS, but it's pretty smooth.

I hadn't considered generating the view sync packets myself, but
that'd probably work. I don't suppose the content of those packets is
documented anywhere? Probably easier just to remotely inject X events
into the master GE instance though.

For the moment I might just write a basic mobile web interface that'll
let passers-by type in an address. Right now it's running a simple
script to generate a random lat/long and put it into query.txt every
30 seconds or so - that sounded like quite a good idea when I thought
of it, but it turns out that most of the world is water :)
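
For what it's worth, the script boils down to something like the
sketch below. The /tmp/query.txt path and the flytoview= line are just
how I'm assuming it works here - check the QuickStart for the exact
file location and syntax your setup expects:

/*
 * Pick a random lat/long every 30 seconds and hand it to the master GE
 * instance via query.txt.  The file path and the flytoview= format are
 * assumptions - adjust them to whatever your setup actually uses.
 */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <unistd.h>

int main(void) {
  srand(time(NULL));
  for (;;) {
    /* uniform in lat/long, which is why we mostly end up over water */
    double lat = ((double)rand() / RAND_MAX) * 180.0 - 90.0;
    double lon = ((double)rand() / RAND_MAX) * 360.0 - 180.0;

    FILE *f = fopen("/tmp/query.txt", "w");
    if (!f) { perror("query.txt"); exit(1); }
    fprintf(f,
        "flytoview=<LookAt><latitude>%f</latitude>"
        "<longitude>%f</longitude><range>100000</range>"
        "<tilt>0</tilt><heading>0</heading></LookAt>\n",
        lat, lon);
    fclose(f);

    sleep(30);
  }
}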

Jason Holt

Oct 5, 2010, 7:16:38 PM
to liquid...@googlegroups.com
No, not documented at the moment, although it's a TODO.  It's a bunch of comma-separated decimal numbers for things like lat and lon.  The first field is a monotonic counter that will probably cause you headaches; you might have to write a multiplexer to keep the counter increasing correctly regardless of the source.
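
If you want to experiment anyway, a bare-bones sender would look
something like the sketch below. The only parts taken from what I said
above are "comma-separated decimal fields" and "counter first"; the
destination address, the port, and the field order after the counter
are placeholders I'm making up, so sniff a real master with tcpdump
before trusting any of it:

/*
 * Sketch of synthesizing a viewsync-style datagram.  Only "comma-
 * separated decimal fields, counter first" is known; the destination,
 * port, and remaining field order below are placeholders.
 */
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void) {
  int s = socket(AF_INET, SOCK_DGRAM, 0);
  if (s < 0) { perror("socket"); return 1; }

  int yes = 1;
  setsockopt(s, SOL_SOCKET, SO_BROADCAST, &yes, sizeof(yes));

  struct sockaddr_in dst;
  memset(&dst, 0, sizeof(dst));
  dst.sin_family = AF_INET;
  dst.sin_port = htons(21567);                        /* placeholder port */
  inet_pton(AF_INET, "192.168.1.255", &dst.sin_addr); /* placeholder addr */

  long counter = 0;
  double lat = -37.79, lon = 175.28, alt = 2000.0;    /* arbitrary view */

  for (;;) {
    char buf[256];
    /* counter first, then the view fields (the order here is a guess) */
    int n = snprintf(buf, sizeof(buf), "%ld,%f,%f,%f",
                     counter++, lat, lon, alt);
    sendto(s, buf, n, 0, (struct sockaddr *)&dst, sizeof(dst));
    usleep(50000);  /* ~20 updates per second */
  }
}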

How many screens per machine are you using?  4 screens vertically per machine, perhaps?  I'm trying to figure out how you made the tiling work.

touchaddict

Oct 5, 2010, 7:35:55 PM
to liquid-galaxy
hey paulh,

would you want to interface your setup using multi-touch and hover
interaction? i don't have the required hardware setup right now,
however i am sure it'd be much cooler to enable liquid galaxy to
interact using hover and multi-touch gestures, just like on those
iPhone/iPad and Natal setups (http://www.youtube.com/watch?v=AgWUtw0sbro).
the secret lies in using a couple of cheap overlays on top of the
monitors/LCDs.

let me know.

cheers
-anirudh
/ http://www.touchaddict.blogspot.com /

paulh

Oct 5, 2010, 8:39:55 PM
to liquid-galaxy
Right, I was wondering what the first number was. The rest look simple
enough, although documentation would still be appreciated :)

Yes, we use 4 screens per machine, stacked vertically. Each machine
runs at 1680x4560 - the extra height over the 4200 pixels is for a gap
between each monitor to account for bezels. (We do something similar
in our Chromium-like rendering software (http://bit.ly/cUMHyo) to
account for the bezels between columns normally, though I'll need to
do it at the GE level here).

As a result, the LG config is pretty simple - yawOffset just goes
73,36.5,0,-36.5,-73 from left to right. The master GE instance is the
one in the middle.
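
Spelled out, and assuming the offsets live under the same ViewSync/...
keys the QuickStart uses (double-check the exact config file and
section on your install), each column ends up with roughly:

  column 1 (leftmost):   ViewSync/yawOffset = 73
  column 2:              ViewSync/yawOffset = 36.5
  column 3 (master):     ViewSync/yawOffset = 0
  column 4:              ViewSync/yawOffset = -36.5
  column 5 (rightmost):  ViewSync/yawOffset = -73

with horizFov matching the 36.5 degree step on every machine.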

paulh

Oct 5, 2010, 8:58:59 PM
to liquid-galaxy
Yeah, that kind of stuff would be pretty neat. If you look at the
photo of the wall, we have a webcam attached above the screens - in
the past we've done various control mechanisms for apps on the wall
that use marker tracking / blob tracking / face detection. This means
that if I could control GE properly from code, I could stand in front
of it and make gestures to move the viewpoint around. For example, we
prototyped something once where you flew a plane on the wall by
standing in front of it, stretching out your arms horizontally, and
tilting from left to right :)


Jason Holt

Oct 5, 2010, 9:01:25 PM
to liquid...@googlegroups.com
Very cool.  So you've basically unwrapped 146 degrees of a cylinder (73*2) onto a flat surface.   Is the distortion noticeable?  The radius of the cylinder would be whatever makes the width of a screen subtend 36.5 degrees.  (So if you actually mounted them in a cylinder of that diameter and stood in the middle, it'd look "right").  
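
(If you want the actual number: r = (w/2) / tan(horizFov/2). Assuming your 22" panels are the usual 16:10, the visible width w is roughly 0.47 m, so r ≈ 0.235 / tan(18.25°) ≈ 0.71 m - i.e. those settings are "correct" from about 70 cm in front of the wall.)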

You could change the yaw and horizfov to pretend you were mapping onto a much bigger cylinder, so that your wall only covers, say, 70 degrees instead of 146 degrees.  If you really took it to the extreme, pretending you were mapping to, say, a 100m wide cylinder, then your wall would be like looking through a telescope, but the "unwrap" distortion would be very low.  :)

Ultimately, I'm hoping we'll support view matrices, so that you can plug in some magic numbers and have it project skewed frustums over to the corners of your wall and make the projection "correct" from your preferred viewing location.

Jason Holt

Oct 5, 2010, 9:04:50 PM
to liquid...@googlegroups.com
Sweet!  For that kind of interface, I'd probably try to synthesize multi-axis events.  Here's code that parses those events from a space navigator.  It should be reasonably easy to generate events like this instead of parsing them, and write them to a named pipe.  Then point Earth at the pipe instead of an actual spacenav.

// Released into the public domain, 25 June 2009
// Google, inc. Jason E. Holt <jholt [at] google.com>
//
// Documentation and example code on how to read our 3dconnexion space navigator
// using the /dev/input/event* interface.
// Our navigator shows up as:
// Bus 007 Device 004: ID 0510:1004 Sejin Electron, Inc. 

#include <sys/ioctl.h>
#include <error.h>
#include <stdio.h>
#include <sys/types.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <linux/input.h>

int main(int argc, char **argv) {

  int fd;

  if (argc < 2) {
    fprintf(stderr, "usage: %s /dev/input/eventN\n", argv[0]);
    exit(1);
  }

  if ((fd = open(argv[1], O_RDONLY | O_NONBLOCK)) < 0) {
      perror("opening the file you specified");
      exit(1);
  }
  
  printf("ev_rel:%d ev_abs:%d ev_key:%d\n", EV_REL, EV_ABS, EV_KEY);

  float x, y, z, yaw, pitch, roll;
  x = y = z = yaw = pitch = roll = 0.0;

  while(1) {
    struct input_event ev;
    struct input_event *event_data = &ev;

    int num_read = read(fd, event_data, sizeof(ev));

    if (sizeof(ev) != num_read) {
      printf("only read %d\n", num_read);
      usleep(100000);
      continue;
    }

    /*
    printf("Timestamp %d:%d type: %hx %hx %x\n",
      ev.time.tv_sec, ev.time.tv_usec,
      ev.type, ev.code, ev.value);
      */

    if (event_data->type == EV_KEY) {
      printf("button press\n");
    } else if (event_data->type != EV_REL) {
      printf("Unknown event type\n");
    } else {
      int axis = event_data->code;
      int move_amount = event_data->value;
      /* axis codes 0-5 are the device's six degrees of freedom (REL_X .. REL_RZ) */
      switch(axis) {
        case 0:
          x = -move_amount * 0.08;
          break;
        case 1:
          y = -move_amount * 0.08;
          break;
        case 2:
          z = -move_amount * 0.31;
          break;
        case 3:
          pitch = move_amount * 0.0001;
          break;
        case 4:
          roll = move_amount * (1.0 / 24000.0);
          break;
        case 5:
          yaw = -move_amount * 0.0001;
          break;
        default:
          printf("unknown axis event\n");
          break;
      }

      /* write to a temp file, then rename() it into place so a reader
         never sees a partially-written state */
      int state_file = open("/tmp/spacenav.tmp", O_CREAT | O_WRONLY, 0644);
      if (state_file == -1) perror("opening /tmp/spacenav.tmp");

      char state_string[1000];
      sprintf(state_string, "%f,%f,%f,%f,%f,%f",
              x,y,z,yaw,pitch,roll);
      printf("%s\n", state_string);
      write(state_file, state_string, strlen(state_string));
      close(state_file);
      rename("/tmp/spacenav.tmp", "/tmp/spacenav");
    }
  }
}
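
And for generating events instead of parsing them, a minimal sketch is below. I'm assuming Earth will read plain struct input_event records from whatever device path you point it at, and I haven't checked whether it also needs the EV_SYN sync reports a real device emits, so treat it as a starting point rather than something known to work:

// Sketch: make a named pipe, then write synthetic spacenav events into
// it.  Point Earth at /tmp/fake_spacenav instead of /dev/input/event*.
// Untested against Earth itself.
#include <fcntl.h>
#include <linux/input.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/stat.h>
#include <sys/time.h>
#include <sys/types.h>
#include <unistd.h>

static void emit(int fd, int type, int code, int value) {
  struct input_event ev;
  memset(&ev, 0, sizeof(ev));
  gettimeofday(&ev.time, NULL);
  ev.type = type;
  ev.code = code;
  ev.value = value;
  if (write(fd, &ev, sizeof(ev)) != sizeof(ev)) perror("write");
}

int main(void) {
  const char *path = "/tmp/fake_spacenav";  /* hypothetical path */
  mkfifo(path, 0666);                       /* EEXIST on reruns is fine */

  /* open() blocks until the reader (Earth) opens the other end */
  int fd = open(path, O_WRONLY);
  if (fd < 0) { perror(path); exit(1); }

  /* Example motion: a slow constant yaw (axis code 5 in the parser
     above), 20 times a second. */
  for (;;) {
    emit(fd, EV_REL, 5, 50);
    emit(fd, EV_SYN, SYN_REPORT, 0);
    usleep(50000);
  }
}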

paulh

Oct 5, 2010, 11:39:24 PM
to liquid-galaxy
Yeah, I just used the example values from the QuickStart as a way of
getting it going quickly - we had quite a lot of distortion at the
edges, but we were having too much fun flying around and looking at
stuff to put too much thought into fixing it :)

I've now tweaked it to have a FoV of 90 across the wall - that is to
say, a horizFov of 18, and using 36,18,0,-18,-36 from left to right.
It's now much less distorted:

http://picasaweb.google.com/bieh.nz/DropBox#5524770804025281346
http://picasaweb.google.com/bieh.nz/DropBox#5524770654985298866

And thanks for the code, it'll be helpful :)


Andrew Leahy

Oct 11, 2010, 6:43:26 AM
to liquid-galaxy
Paul - just back to your control problem. A suggestion, if you're
happy to add another machine with a controller such as the
SpaceNavigator: have that machine act as the view master and set up
the 5 wall machines as view slaves. I've done something similar with a
test rig, using a laptop as the machine which has 'ViewSync/send=true'.
All the display/render machines are just slaves to that. This also has
the benefit of not 'messing up' one of your main display screens with
search queries, menus, side bars, etc.
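
In config terms that's roughly the following (key names in the
ViewSync/... style from the QuickStart - where exactly they live in
your GE config may differ, so treat it as a sketch):

  laptop (master, has the SpaceNavigator):
    ViewSync/send = true
    ViewSync/receive = false

  each wall machine (slave):
    ViewSync/send = false
    ViewSync/receive = true
    ViewSync/yawOffset = that column's offset (73 ... -73)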

For a few years now I've run GE on our large tiled rear-projection
display wall, just with Chromium+DMX (on Linux), and it works fine. In
that case GE gets run on the mothership (master) with 9 render nodes
each dealing with parts of the display. This works perfectly because
the single GE instance offers up one huge planar view without any
distortion.

How do you account for the imagery that should be "hidden" behind the
horizontal bezels (aka mullions) going down the column?

I'm pretty sure Liquid Galaxy relies on the screen edge/bezel to hide
the fact that GE produces a planar view. Without bezels, the "kinks"
that appear as a straight line is drawn across multiple views would be
annoying!

Even with all that it's still pretty impressive to see!

Cheers, Andrew

PS: on your tiled wall I'd move the webcam down to about eye height -
just hide it between a couple of the bezels!

ki...@endpoint.com

Oct 11, 2010, 2:01:13 PM
to liquid...@googlegroups.com
On Mon, Oct 11, 2010 at 03:43:26AM -0700, Andrew Leahy wrote:
> [...clip...]

Yep, great suggestion.

This is basically how we took care of things for the TV spot taking place at
the Tokyo Google office. The laptop can be the master for controlling the
rest of the Galaxy, or the laptop can be another slave with the same view
as the master to allow synchronized recording of where someone in the Galaxy
is flying with the SpaceNav.

Lots of possibilities.

[...clip...]
--
Kiel Christofferson
1.616.799.7336
End Point Corporation
www.endpoint.com
