Excellent question!
Short answer: latency and callback regularity will remain unchanged.
Long answer:
There is a preceding effort in progress to mojofy audio IPC. It will switch the control path of renderer/browser audio stream interactions to Mojo (mojom::AudioOutputStream is an example), but the real-time rendering mechanism (sync socket + shared memory) will remain untouched. So no extra latency is introduced on the rendering path at this step.
The next step is moving audio streams to be hosted in the audio service.
This will require two hops for stream creation: the renderer requests a stream from the browser, and the browser checks permissions and requests the stream from the audio service/process. However, all further interactions with the AudioOutputStream, including real-time audio rendering, are performed directly between the renderer and the audio service/process, in the same way they are performed between the renderer and the browser now [after step 1].
So from the real-time audio rendering perspective, moving the streams to another process is transparent and should affect neither latency nor callback regularity.