I'm switching from Premiere to Resolve with Karta VR and I have some questions.

Marc Dantzker

Apr 19, 2024, 5:00:43 PM
to Kartaverse
I'm a conservation bioacoustician using VR techniques to study fish sounds. I've been using Premiere for some time, but I'm running into some limitations and bugs, and I'm looking to see if I can jump to Resolve. That brings me to Karta VR, but I'm not finding some of the tools I need. I'm wondering if someone (Andrew or someone else) can help. Here are a couple of questions I've got going in. 

1) I'm working with Insta360 footage. The GoPro Reframe tool constantly crashes for me, so I have exported everything as equirectangular MP4 files. Because I need the footage to face a particular direction (to have a particular angle at the midline), I used Premiere's built-in VR Projection tool to pan all of them 180°. When I bring these into Resolve, I lose that pan. I can pan using Karta's reframing controls, but I don't know how to pan the EQR when I'm not reframing. Can this be done in Resolve, with or without Karta VR?
 
2) We have data layers that we put over the top of our video, and we need to reframe the layers together while preserving the ability to adjust the overlay layers' opacities. As far as I can tell, Karta is applied to a clip, not a timeline. I can compound the clips and apply the effect to that result, but then I lose the ability to adjust the individual tracks.

Thank you to anyone who can help.

Marc Dantzker

Apr 19, 2024, 8:13:28 PM
to Kartaverse
While I'm at it...  
1) Are there key commands for FOV, Pitch, Yaw, etc.? I'm adjusting them constantly, and the graphical dials are tough to grab.
2) If I have ambisonic audio, can you lock the ambisonic soundfield to the direction of the video so that when you reframe, the sound moves with you, just as it would in a headset? There are all sorts of new ambisonic tools in Resolve 19. I wonder if there is a way to coordinate the moves in Karta with the moves in the soundfield?

Andrew Hazelden

Apr 20, 2024, 10:39:14 AM
to Kartaverse
Hi Marc.

Popular third-party VR workflow tools for Resolve include the BorisFX Continuum VR Unit, Mocha Pro, RE:Vision Effects, and Reframe360XL plugins.

An important thing to clarify is whether you are working with 180VR/360VR content that is 2D monoscopic or stereo 3D. Is your primary deliverable reframed "flat" video output?

1. If you want to perform a horizontal pan on equirectangular footage in the Edit or Color page, a very fast built-in tool for the job is the native ResolveFX "Transform" OpenFX effect. It can be applied to footage directly in the Edit page, and playback speed is very close to real-time. With equirectangular content, set "Advanced Options > Edge Behaviour" to "Wrap-Around" so the footage wraps around at the left/right frame border. The "Position X" control then scrolls the footage horizontally, which lets you change the front viewing angle.
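To build intuition for why the wrap-around pan works: a yaw change on a 360° equirectangular frame is just a circular pixel shift along the width. Here is a minimal NumPy sketch of that idea (the yaw_frame helper and the frame dimensions are hypothetical, not anything Resolve exposes):

```python
import numpy as np

def yaw_frame(eqr: np.ndarray, yaw_degrees: float) -> np.ndarray:
    """Pan an equirectangular frame horizontally by yaw_degrees.

    A full 360 degrees of yaw spans the image width, so a wrap-around
    pan is simply a circular shift along the width axis -- the same
    idea as ResolveFX Transform with Edge Behaviour set to Wrap-Around.
    """
    width = eqr.shape[1]
    shift = int(round(width * (yaw_degrees / 360.0)))
    return np.roll(eqr, shift, axis=1)

# Example: re-centre the view 180 degrees away from the current front.
frame = np.zeros((1080, 2160, 3), dtype=np.uint8)  # placeholder frame
panned = yaw_frame(frame, 180.0)
```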

The Fusion page in Resolve also has a built-in "PanoMap" node that allows for XYZ view rotation. It is technically possible to expose this as an Effects Template on the Edit page via a macro .setting file, but it is much slower to render than the ResolveFX Transform tool.
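Conceptually, the XYZ rotation that a tool like PanoMap provides amounts to remapping the view directions of an equirectangular image. The NumPy sketch below is only an illustration of that idea, not PanoMap's implementation; it uses nearest-neighbour sampling, and the rotation order and sign conventions are arbitrary choices:

```python
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    """Combined rotation matrix from yaw/pitch/roll in degrees (Y-X-Z order)."""
    y, p, r = np.radians([yaw, pitch, roll])
    ry = np.array([[np.cos(y), 0, np.sin(y)], [0, 1, 0], [-np.sin(y), 0, np.cos(y)]])
    rx = np.array([[1, 0, 0], [0, np.cos(p), -np.sin(p)], [0, np.sin(p), np.cos(p)]])
    rz = np.array([[np.cos(r), -np.sin(r), 0], [np.sin(r), np.cos(r), 0], [0, 0, 1]])
    return ry @ rx @ rz

def rotate_equirectangular(eqr, yaw=0.0, pitch=0.0, roll=0.0):
    """Re-orient an equirectangular frame by rotating its view directions."""
    h, w = eqr.shape[:2]
    # Each output pixel corresponds to a longitude/latitude pair.
    lon = (np.arange(w) / w - 0.5) * 2.0 * np.pi       # -pi .. +pi across the width
    lat = (0.5 - np.arange(h) / h) * np.pi             # +pi/2 .. -pi/2 down the height
    lon, lat = np.meshgrid(lon, lat)
    # Longitude/latitude -> unit view vectors (Y up).
    vec = np.stack([np.cos(lat) * np.sin(lon),
                    np.sin(lat),
                    np.cos(lat) * np.cos(lon)], axis=-1)
    vec = vec @ rotation_matrix(yaw, pitch, roll).T
    # Rotated vectors -> source longitude/latitude -> source pixel lookups.
    src_lon = np.arctan2(vec[..., 0], vec[..., 2])
    src_lat = np.arcsin(np.clip(vec[..., 1], -1.0, 1.0))
    src_x = ((src_lon / (2.0 * np.pi) + 0.5) * w).astype(int) % w
    src_y = ((0.5 - src_lat / np.pi) * h).astype(int) % h
    return eqr[src_y, src_x]
```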

It's worth mentioning that if you plan to do any amount of precise keyframe animation, the Fusion page allows for finer spline editing of the keyframes used by Edit page effects templates.

2. There are several ways to apply effects to multiple clips in an Edit page timeline. An attractive option is an adjustment layer on the Edit page, as it lets you instantly apply the same effect to multiple vertically stacked video tracks, and to multiple clips placed sequentially in a timeline. The "Adjustment Clip" feature is found in the Edit page "Effects" tab under "Effects > Adjustment Clip". You can apply OpenFX plugins, effects templates, as well as Fusion page based effects to an adjustment clip.

* * *

If you go the adjustment clip route, you might find the new Fusion page based Kartaverse kvrViewer node an interesting alternative to the earlier kvrReframe360Ultra node. The kvrViewer tool has onscreen controls in the Fusion page for interactively applying reframing effects to 180VR and 360VR video in stereo 3D and 2D mono. The onscreen controls avoid the need to drag sliders to animate the yaw/pitch/roll values.

* * *

If you regularly need to create parametric, data-driven graphics, it is possible to do interesting things in a nodal fashion with CSV (Comma-Separated Value) and JSON records in the Fusion page using the Vonk Ultra data nodes. Vonk Ultra nodes work well inside effects templates. With this approach you can quickly re-apply the same data graphics to new data sources across multiple projects/timelines.
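As a purely hypothetical example of the kind of records such a workflow can consume, here is a short Python sketch that writes per-event annotation data (say, fish call events to overlay as graphics) to both CSV and JSON; the file names and column names are made up for illustration:

```python
import csv
import json

# Hypothetical annotation records -- e.g. fish call events to overlay as graphics.
events = [
    {"time_sec": 12.4, "label": "call A", "level_db": -21.5},
    {"time_sec": 47.9, "label": "call B", "level_db": -18.0},
]

# CSV output (one row per event) for node-based import in a compositing tool.
with open("events.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["time_sec", "label", "level_db"])
    writer.writeheader()
    writer.writerows(events)

# JSON output with the same records.
with open("events.json", "w") as f:
    json.dump(events, f, indent=2)
```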

* * *

As far as ambisonic audio workflows go, the Fairlight page has little connection to the scripting API in Resolve, or to the parameter values of effects in the Fusion page. Hopefully this situation will improve in time. :)
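For reference, the reachable parts of the scripting API look roughly like the sketch below (run from Resolve's built-in console, or externally with the scripting environment configured). It can read the timeline and its audio clips, but as far as I know it does not expose Fairlight panner or ambisonic automation, which is the gap described above:

```python
# Rough sketch of what the Resolve scripting API exposes today.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
timeline = project.GetCurrentTimeline()

# Timeline-level audio info is reachable...
audio_tracks = timeline.GetTrackCount("audio")
print("Timeline:", timeline.GetName(), "audio tracks:", audio_tracks)
for index in range(1, audio_tracks + 1):
    for item in timeline.GetItemListInTrack("audio", index):
        print("  clip:", item.GetName(), item.GetStart(), item.GetEnd())

# ...but Fairlight-specific state such as panner/ambisonic automation is not
# exposed here, which is the limitation described above.
```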

If you are using external DAW tools for ambisonic audio post-production tasks, it might be possible to export the panning values to a format the Vonk Ultra data nodes can import, like JSON/XML/CSV/TSV. Those keyframed values could then drive any attribute in the Fusion page for reframing footage.
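As a rough sketch of that hand-off, assuming the DAW can export its rotation automation as plain "time_sec,yaw_deg" CSV rows with no header (the file names, column layout, and 29.97 fps frame rate below are all assumptions):

```python
import csv

FPS = 29.97  # assumed project frame rate

# DAW automation exported as "time_sec,yaw_deg" rows, no header (hypothetical format).
with open("daw_yaw_automation.csv", newline="") as f:
    rows = [(float(t), float(yaw)) for t, yaw in csv.reader(f)]

def yaw_at(time_sec):
    """Linearly interpolate the exported automation at an arbitrary time."""
    for (t0, y0), (t1, y1) in zip(rows, rows[1:]):
        if t0 <= time_sec <= t1:
            blend = (time_sec - t0) / (t1 - t0) if t1 > t0 else 0.0
            return y0 + blend * (y1 - y0)
    return rows[-1][1] if time_sec > rows[-1][0] else rows[0][1]

# Resample to one value per video frame so the data can drive a reframing control.
last_frame = int(rows[-1][0] * FPS)
with open("yaw_per_frame.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["frame", "yaw_deg"])
    for frame in range(last_frame + 1):
        writer.writerow([frame, round(yaw_at(frame / FPS), 3)])
```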

Regards,
Andrew Hazelden
Kartaverse Developer