Any pointer appreciated, why this is not working in Swift 5

Neil Young

Apr 17, 2019, 9:13:21 AM
to discuss-webrtc
From the AppRTCMobile broadcast extension sample capturer:

https://github.com/WebKit/webkit/blob/9d678322281ccd07909759af90ee010e0c34b1c7/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/ARDExternalSampleCapturer.m


- (void)didCaptureSampleBuffer:(CMSampleBufferRef)sampleBuffer {
  if (CMSampleBufferGetNumSamples(sampleBuffer) != 1 || !CMSampleBufferIsValid(sampleBuffer) ||
      !CMSampleBufferDataIsReady(sampleBuffer)) {
    return;
  }

  CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
  if (pixelBuffer == nil) {
    return;
  }

  RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
  int64_t timeStampNs =
      CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * NSEC_PER_SEC;
  RTCVideoFrame *videoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                                           rotation:RTCVideoRotation_0
                                                        timeStampNs:timeStampNs];
  [self.delegate capturer:self didCaptureVideoFrame:videoFrame];
}

My code:

func startRecording() {
    guard recorder.isAvailable else {
        print("Recording is not available at this time.")
        return
    }

    recorder.isMicrophoneEnabled = false

    if #available(iOS 11.0, *) {
        recorder.startCapture(handler: { (sampleBuffer, bufferType, error) in
            if bufferType == .video {
                // Same validity checks as in the Objective-C capturer above.
                guard CMSampleBufferIsValid(sampleBuffer),
                      CMSampleBufferDataIsReady(sampleBuffer),
                      CMSampleBufferGetNumSamples(sampleBuffer) == 1 else {
                    print("invalid sampleBuffer")
                    return
                }

                guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
                    return
                }

                let rtcPixelBuffer = RTCCVPixelBuffer(pixelBuffer: pixelBuffer)

                // Presentation timestamp converted from seconds to nanoseconds.
                let timeStampNs = Int64(CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1_000_000_000)

                let videoFrame = RTCVideoFrame(buffer: rtcPixelBuffer,
                                               rotation: RTCVideoRotation._0,
                                               timeStampNs: timeStampNs)

                // Hand the frame to the video source via the capturer delegate method.
                self.factory.videoSource().capturer(self.videoCapturer!, didCapture: videoFrame)
            }
        }) { error in
            if let error = error {
                print(error)
            }
        }
    } else {
        // Fallback on earlier versions
    }
}



I mean, the only difference is the delegate method name (didCapture instead of didCaptureVideoFrame), but Xcode claims this has been renamed in Swift 3, and there is obviously no "didCaptureVideoFrame" label available in Swift.
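For reference, this is (as far as I can tell) how the Objective-C delegate method from RTCVideoCapturerDelegate shows up once imported into Swift: the importer drops the redundant "VideoFrame" from the selector, so -capturer:didCaptureVideoFrame: becomes capturer(_:didCapture:). A minimal conformance sketch, with a hypothetical class name and assuming the WebRTC framework module:

import WebRTC

// Hypothetical conformer, only to show the imported Swift signature.
class FrameLogger: NSObject, RTCVideoCapturerDelegate {
    // Objective-C: - (void)capturer:(RTCVideoCapturer *)capturer didCaptureVideoFrame:(RTCVideoFrame *)frame;
    func capturer(_ capturer: RTCVideoCapturer, didCapture frame: RTCVideoFrame) {
        print("frame at \(frame.timeStampNs) ns")
    }
}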


The other difference is that the code above is supposed to work, while mine is not, and I have no idea why. No video is sent.


More details: https://groups.google.com/forum/?utm_medium=email&utm_source=footer#!topic/discuss-webrtc/sr9r6p2OXZY



Neil Young

Apr 17, 2019, 9:31:25 AM
to discuss-webrtc
Oh boy, got it running. It was indeed an issue with the wrong delegate method being called from Swift. Congratulations
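For anyone landing here later, a minimal sketch of how this kind of forwarding is usually wired up (class and property names are illustrative, not taken from my project): create a single RTCVideoSource once, build the RTCVideoTrack from it, and forward every captured frame to that same source via capturer(_:didCapture:). Calling factory.videoSource() inside the capture handler would create a fresh source per frame, and frames sent to a source that no track uses go nowhere.

import WebRTC
import CoreMedia

// Illustrative sketch only: one RTCVideoSource, created once and reused.
final class ScreenFrameForwarder {
    private let factory: RTCPeerConnectionFactory
    private let videoSource: RTCVideoSource
    private let videoCapturer: RTCVideoCapturer
    let videoTrack: RTCVideoTrack   // add this track to the peer connection / stream

    init() {
        factory = RTCPeerConnectionFactory()
        videoSource = factory.videoSource()                     // created exactly once
        videoCapturer = RTCVideoCapturer(delegate: videoSource)
        videoTrack = factory.videoTrack(with: videoSource, trackId: "screen0")
    }

    func forward(_ sampleBuffer: CMSampleBuffer) {
        guard CMSampleBufferIsValid(sampleBuffer),
              CMSampleBufferDataIsReady(sampleBuffer),
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        let rtcPixelBuffer = RTCCVPixelBuffer(pixelBuffer: pixelBuffer)
        let timeStampNs = Int64(CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1_000_000_000)
        let frame = RTCVideoFrame(buffer: rtcPixelBuffer,
                                  rotation: RTCVideoRotation._0,
                                  timeStampNs: timeStampNs)

        // Forward to the stored source (the one the track was built from),
        // not to a freshly created one.
        videoSource.capturer(videoCapturer, didCapture: frame)
    }
}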