Capturing remote, inbound audio samples on Android


William Seemann

Oct 22, 2023, 10:17:45 AM
to discuss-webrtc
Hello, I'm using this library on Android to implement WebRTC; it's a fork of the Google WebRTC implementation. I'm able to intercept video frames using the addSink method, but I noticed that `AudioTrack` doesn't have an addSink method associated with it. I would like to implement the ability to record both video and audio from my inbound camera stream. Does anyone know of a way to accomplish this? I've searched several forums, and it seems I would have to implement this functionality myself at the native layer.

tales born

Oct 6, 2025, 10:25:58 AM
to discuss-webrtc
On the Web, you can record a WebRTC stream with MediaRecorder or use MediaStreamTrackProcessor to pull PCM and encode (WAV/MP3/AAC).

On Android native (including React Native), those are Web APIs and are not available. The org.webrtc Android SDK doesn’t expose a playout PCM callback, and react-native-webrtc doesn’t implement MediaRecorder/MediaStreamTrackProcessor.

Viable approaches on Android:
  • Fork/patch org.webrtc to add a playout PCM callback (e.g., in JavaAudioDeviceModule/WebRtcAudioTrack). This lets you tap the remote audio frames (similar to a custom RTCAudioDevice on iOS) and feed your encoder/muxer.
  • Use Android 10+ AudioPlaybackCapture via MediaProjection to capture the app’s playback and encode it. This requires a user permission prompt and captures device/app audio rather than just a specific WebRTC track.
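For the second approach, here is a minimal sketch of wiring up AudioPlaybackCapture. It assumes you already hold a `MediaProjection` (obtained via `MediaProjectionManager` after the user consent prompt) and the `RECORD_AUDIO` permission; the `mediaProjection` parameter name is a placeholder:

```kotlin
import android.media.AudioAttributes
import android.media.AudioFormat
import android.media.AudioPlaybackCaptureConfiguration
import android.media.AudioRecord
import android.media.projection.MediaProjection

// Builds an AudioRecord that captures this app's own audio playout
// (Android 10+ / API 29+) instead of the microphone.
fun buildPlaybackCapture(mediaProjection: MediaProjection): AudioRecord {
    val config = AudioPlaybackCaptureConfiguration.Builder(mediaProjection)
        // WebRTC playout is typically tagged USAGE_VOICE_COMMUNICATION;
        // USAGE_MEDIA is added as well in case your audio mode differs.
        .addMatchingUsage(AudioAttributes.USAGE_VOICE_COMMUNICATION)
        .addMatchingUsage(AudioAttributes.USAGE_MEDIA)
        .build()

    val format = AudioFormat.Builder()
        .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
        .setSampleRate(48000)
        .setChannelMask(AudioFormat.CHANNEL_IN_STEREO)
        .build()

    return AudioRecord.Builder()
        .setAudioPlaybackCaptureConfig(config)
        .setAudioFormat(format)
        .build()
}
```

You would then call `startRecording()` and drain the `AudioRecord` on a background thread, feeding the PCM into MediaCodec/MediaMuxer. Note the caveat above: this captures all matching app playback, not one specific WebRTC track, and other apps can opt out of being capturable.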
If you need parity with an iOS custom RTCAudioDevice intercept, the forked org.webrtc approach is the closest match.
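For the fork approach, the shape of the change can be sketched as follows. The callback interface below is hypothetical (it is what you would add to WebRtcAudioTrack in your fork, mirroring the `SamplesReadyCallback` that org.webrtc already exposes for the recording/microphone side); the WAV-header helper is standard 16-bit PCM framing so tapped playout frames can be dumped to a file or handed to an encoder:

```kotlin
import java.io.ByteArrayOutputStream
import java.io.DataOutputStream

// Hypothetical callback to add in a forked WebRtcAudioTrack; the fork would
// invoke it from the playout thread each time a 10 ms buffer is handed to
// the platform AudioTrack.
fun interface PlayoutSamplesReadyCallback {
    fun onPlayoutSamplesReady(pcm: ByteArray, sampleRateHz: Int, channels: Int)
}

// Standard 44-byte RIFF/WAVE header for 16-bit PCM.
fun wavHeader(dataLenBytes: Int, sampleRateHz: Int, channels: Int): ByteArray {
    val bitsPerSample = 16
    val byteRate = sampleRateHz * channels * bitsPerSample / 8
    val out = ByteArrayOutputStream(44)
    DataOutputStream(out).use { d ->
        // WAV fields are little-endian, so write bytes manually.
        fun le32(v: Int) { d.write(v); d.write(v ushr 8); d.write(v ushr 16); d.write(v ushr 24) }
        fun le16(v: Int) { d.write(v); d.write(v ushr 8) }
        d.writeBytes("RIFF"); le32(36 + dataLenBytes); d.writeBytes("WAVE")
        d.writeBytes("fmt "); le32(16); le16(1) /* PCM */; le16(channels)
        le32(sampleRateHz); le32(byteRate)
        le16(channels * bitsPerSample / 8); le16(bitsPerSample)
        d.writeBytes("data"); le32(dataLenBytes)
    }
    return out.toByteArray()
}
```

In the fork, your `PlayoutSamplesReadyCallback` implementation would accumulate the PCM buffers and prepend `wavHeader(totalBytes, sampleRateHz, channels)` when writing the file, or feed the raw frames to a MediaCodec AAC encoder instead.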