Official topic - 360° capture and rendering


COLIBRI VR

Mar 21, 2020, 2:46:57 PM
to COLIBRI VR
Hi,

This is the official topic for discussing 360° capture and rendering.

Feel free to post!

--
Grégoire Dupont de Dinechin

Tianyu Wang

Jun 30, 2020, 3:46:55 AM
to COLIBRI VR
Hi,

I just tested a 360° scene using COLIBRI VR. There were some problems when I tried to output the depth map, even though the depth map shown in Unity looks right. (My computer is a MacBook Pro running macOS Mojave 10.14.6.)

0000.png

DEB10915-685C-4859-823C-222A788B346F.png




Greg de Dinechin

Jun 30, 2020, 4:42:07 AM
to COLIBRI VR
Hi,

The top depth map is more precise (using the RGB channels to precisely encode depth values), while the bottom depth map is useful for visualization (easily interpretable, but would not enable precise decoding). There is a shader in "Shaders/Debugging" called "ConvertPreciseToVisualization" if you want to convert from the first to the second.
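To make the distinction concrete, here is a sketch of a generic 24-bit RGB depth encoding of the kind described above, alongside an 8-bit grayscale version for visualization. This is an illustrative assumption, not COLIBRI VR's actual bit layout or the ConvertPreciseToVisualization shader; the exact packing in the project may differ.

```python
import numpy as np

def encode_depth_rgb(depth, max_depth):
    """Pack metric depth into three 8-bit channels (24 bits of precision).

    Assumes a simple big-endian 24-bit integer code; COLIBRI VR's real
    encoding may use a different layout.
    """
    d = np.clip(depth / max_depth, 0.0, 1.0 - 1e-7)   # normalize to [0, 1)
    v = (d * (256 ** 3 - 1)).astype(np.uint32)        # 24-bit integer code
    r = (v >> 16) & 0xFF
    g = (v >> 8) & 0xFF
    b = v & 0xFF
    return np.stack([r, g, b], axis=-1).astype(np.uint8)

def decode_depth_rgb(rgb, max_depth):
    """Recover metric depth from the packed RGB channels."""
    c = rgb.astype(np.uint32)
    v = (c[..., 0] << 16) | (c[..., 1] << 8) | c[..., 2]
    return v / float(256 ** 3 - 1) * max_depth

def to_visualization(depth, max_depth):
    """Single grayscale channel: easy to interpret, but only 8 bits,
    so it cannot be decoded back to depth precisely."""
    return (np.clip(depth / max_depth, 0.0, 1.0) * 255).astype(np.uint8)
```

With 24 bits over a 20 m range, the quantization step is about 1.2 micrometers, whereas the 8-bit visualization quantizes to roughly 8 cm steps; that is why the precise map looks odd to the eye but round-trips depth values almost exactly.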

I hope this answers the question. If the issue is in fact that the precise depth map is not correct, feel free to give me details so that I can investigate.

Best,

Greg

Daniel Walter

Mar 29, 2021, 2:45:07 PM
to COLIBRI VR
Hi Grégoire,

First, let me thank you for creating Colibri VR, and especially for making it freely available to the public.

I have a couple of questions:
  • How did you create the RGB depth map? Did you use a laser scanner for that? If so, which one?
  • Is Colibri VR performant enough to run on mobile devices such as the Oculus Quest?
My use case requires capturing 360° scenes, such as the garden scene.
As I plan to create a huge library of VR scenes, I'd like to know your process.

I've been experimenting with 360° photography, manual depth reconstruction, and lidar photogrammetry for the past two years.
I would love to be able to use your framework going forward.

Thank you for your time.

Cheers,
Dan

Greg de Dinechin

Feb 1, 2022, 4:32:56 AM
to COLIBRI VR
Hi Dan,

Thanks for your message, and I'm very sorry for not answering sooner; I haven't taken the time to check the forum this past year.
  • Absolutely, I used a FARO Focus laser scanner to acquire the four 360° datasets listed on the datasets page under [de Dinechin and Paljic 2018]. I exported the resulting point cloud as a depth map from within FARO's software (I believe it was FARO Scene), which I then converted to COLIBRI VR's depth encoding using a custom script in Unity.
  • I haven't tested on Oculus Quest so I cannot give you any estimate. In general this highly depends on the scene you are rendering (size of the mesh, number of images...), so you would have to try on your scene to really know.
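As a side note, the point-cloud-to-depth-map step above can be sketched in a few lines. This is a hypothetical reimplementation (not the custom Unity script or FARO Scene's exporter): it projects points from a scanner placed at the origin into an equirectangular 360° depth map, keeping the nearest hit per pixel.

```python
import numpy as np

def points_to_equirect_depth(points, width, height):
    """points: (N, 3) array in the scanner's frame, scanner at the origin.
    Returns a (height, width) equirectangular depth map in meters,
    with np.inf where no point projects."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.sqrt(x * x + y * y + z * z)                  # radial depth
    theta = np.arctan2(x, z)                            # azimuth, [-pi, pi]
    phi = np.arcsin(np.clip(y / np.maximum(r, 1e-9), -1.0, 1.0))  # elevation
    u = (((theta / (2 * np.pi)) + 0.5) * width).astype(int) % width
    v = np.clip(((0.5 - phi / np.pi) * height).astype(int), 0, height - 1)
    depth = np.full((height, width), np.inf)
    np.minimum.at(depth, (v, u), r)                     # keep nearest point
    return depth
```

The resulting metric depth map would then still need to be converted to the precise RGB encoding before use in the renderer.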
I really like your idea of creating a huge library of VR scenes! Have you been able to move forward with this project? I haven't found many publicly available 360° (image + depth) datasets to list on the datasets page, so if you end up creating one and want me to add it to the list I would be happy to do so.

Cheers,
Greg