VideoCapturer on iOS


Mike Anderson

Oct 4, 2013, 4:17:18 PM10/4/13
to discuss...@googlegroups.com
I've been playing around with the (not officially landed) video capabilities in the iOS API, but I am having trouble getting video to send.
If anyone working on this implementation could shed some light I would be really grateful.

When I create an RTCVideoCapturer object, I create it using device.uniqueId, where device is my front camera, grabbed by iterating through [AVCaptureDevice devicesWithMediaType:] to find a front-facing camera. I then create the RTCVideoSource using that capturer and an empty constraints object.

This seems to create a video capturer that can be fed to the video stream, but no video flows through the video track.

Am I using the correct device name? Should I be using device.localizedName instead?
Does the RTCVideoSource need additional constraints or is an empty media constraint set fine?

Thanks
-Mike


Bridger Maxwell

Oct 6, 2013, 9:48:00 AM10/6/13
to discuss...@googlegroups.com
It looks like the WebRTC folks are very close to supporting video on iOS, but there isn't a way to render the video using the ObjC interface. You need to write that interface yourself. The solution I have made for it is certainly not ideal, or I would share it here. Here is what I did to get the video capturer started.

Yes, the video capturer should be created with the device's localizedName instead of uniqueId. This is sorta odd. Here is the code I run to make the video track.

        NSString *frontCameraID = nil;
        // Fall back to the first capture device, but prefer a front-facing camera.
        for (AVCaptureDevice *captureDevice in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
            if (!frontCameraID || captureDevice.position == AVCaptureDevicePositionFront) {
                frontCameraID = [captureDevice localizedName];
            }
        }

        if (frontCameraID) { // This is nil on the simulator
            [self setVideoSource:[self.peerConnectionFactory videoSourceWithCapturer:[RTCVideoCapturer capturerWithDeviceName:frontCameraID] constraints:nil]];
            [self setLocalVideoTrack:[self.peerConnectionFactory videoTrackWithID:@"Understudyv0" source:_videoSource]];
            [[self delegate] whiteboardConnection:self newLocalVideoTrack:[self localVideoTrack]];
        }


On the receiving side, I used the constraints @"OfferToReceiveVideo".

- Bridger Maxwell

den...@indievid.org

Oct 7, 2013, 6:48:32 AM10/7/13
to discuss...@googlegroups.com
Hi,

Yes, it looks very close, but it's still not there. Hope it's coming very soon!

Does anyone have any info on what versions of iPhone and iPad will be needed? Or is it only bound to iOS?

Thx

Mike Anderson

Oct 7, 2013, 3:28:43 PM10/7/13
to discuss...@googlegroups.com
Thanks!
Have you encountered a problem where the videoCapturer ObjC interface returns an object but does not initialize the native video capturer?
I am getting a segfault at the following line because my native video capturer pointer is null:

void VideoSource::Initialize(
    const webrtc::MediaConstraintsInterface* constraints) {
  std::vector<cricket::VideoFormat> formats;
  // Crashes here when video_capturer_ is null.
  if (video_capturer_->GetSupportedFormats() &&
      video_capturer_->GetSupportedFormats()->size() > 0) {
    formats = *video_capturer_->GetSupportedFormats();
  }

...


-Mike

Bridger Maxwell

Oct 7, 2013, 5:31:28 PM10/7/13
to discuss...@googlegroups.com
It looks like it will at least work on iOS 6 and iOS 7. I don't know if it would be compatible with iOS 5, but there aren't many devices out there still running an OS that old. 

I have it running on an iPad 2, which is the slowest of the iPads with a camera. I haven't tried running the video on any iPhones.


den...@indievid.org

Oct 13, 2013, 12:36:14 PM10/13/13
to discuss...@googlegroups.com
Hi,

What branch and what revision do you run to do this?

/Dennis

den...@indievid.org

Oct 14, 2013, 5:06:39 PM10/14/13
to discuss...@googlegroups.com
Hi Mike,

did you find out why you got this crash? I get a <Warning>: Application 'UIKitApplication:com.google.AppRTCDemo[0xeb9c]' exited abnormally with signal 11: Segmentation fault: 11 when I run it with video.

Dennis

I Ainx

Nov 12, 2013, 5:34:05 AM11/12/13
to discuss...@googlegroups.com
Hi Mike,

I ran into this as well recently. Tracing it back, the default device manager is set to return NULL on iOS. Removing the checks for IOS (so that it initialises a WebRtcVideoCapturer object) stops it from crashing, but it still fails to work because the capabilities detection in WebRtcVideoCapturer::Init cannot find any capabilities on iOS.

I'm still not sure how best to proceed from here; any extra tips would be welcome.

iain

Mike Anderson

Nov 12, 2013, 6:40:52 PM11/12/13
to discuss...@googlegroups.com
Hi Iain, Dennis,
Sorry for not replying earlier. The solution is pretty gross, which I think is why people have not been openly publicizing or advocating their fixes.
So, as Bridger said originally, you must comment out the following lines in talk/media/webrtc/webrtcvideocapturer.cc:

//if (supported.empty()) {
// LOG(LS_ERROR) << "Failed to find usable formats for id: " << device.id;
// return false;
//}

But you also need to remove all the IOS preprocessor statements that were (I speculate) originally included to make sure the code compiled while the video capturer was in a non-building state.

In talk/media/devices/devicemanager.cc:

#if !defined(IOS) <--- THESE NEED TO GO
#if defined(HAVE_WEBRTC_VIDEO)
#include "talk/media/webrtc/webrtcvideocapturer.h"
#endif
#if defined(HAVE_WEBRTC_VIDEO)
#define VIDEO_CAPTURER_NAME WebRtcVideoCapturer
#endif
#endif
 <--- THESE NEED TO GO
...
#if defined(IOS) <--- THESE NEED TO GO
LOG_F(LS_ERROR) << " should never be called!";
return NULL;
#else
...
#endif <--- THESE NEED TO GO

I may have missed some, but you should be able to text-search through the directory.
I'm currently trying to find a way to port the log printing to iOS (which would have made finding this much easier).
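
That text search can be sketched with a recursive grep; here it is demonstrated on a scratch file (in a real checkout you would run it from trunk/ against talk/ instead):

```shell
# Sketch: locate IOS preprocessor guards with grep.
# In an actual checkout, replace "$workdir" with talk/ and run from trunk/.
workdir=$(mktemp -d)
cat > "$workdir/devicemanager.cc" <<'EOF'
#if !defined(IOS)
#define VIDEO_CAPTURER_NAME WebRtcVideoCapturer
#endif
EOF
grep -rn "defined(IOS)" "$workdir"   # lists file:line for every guard
rm -rf "$workdir"
```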

Hope this helps

I Ainx

Nov 13, 2013, 4:53:21 AM11/13/13
to discuss...@googlegroups.com
Thanks Mike,

I think I'd found all these things and commented them out (I even made up my own format in case it needed one). It still crashes shortly after, but it's good to know that I'm on the right track.

As for the logging: I turned it on in talk/base/logging.cc at line 94 by adding #define _DEBUG 1

Thanks again, it is good to have confirmation that my poking around in the dark is getting me somewhere. I'll go back and read all of Bridger's other messages and see what tips they have. If you ever feel like publishing your patch, I'll not judge you for the quality of the solution :)

iain

I Ainx

Nov 13, 2013, 6:24:24 AM11/13/13
to discuss...@googlegroups.com
Oh, ha,

The reason it was crashing after I'd removed the IOS defines was simply because ARC was freeing my RTCVideoCapturer. Doh. Thanks for your help though. Now to work out the VideoRenderer.

iain

Gregg Ganley

Nov 14, 2013, 6:30:31 PM11/14/13
to discuss...@googlegroups.com
With the help of Bridger Maxwell and this thread, I have a working iOS AppRTCDemo with audio and VIDEO at full duplex. Check out the project on GitHub:

Hope this helps folks that are stuck.
--Gregg Ganley 

Ummer farooque M

Nov 19, 2013, 12:55:16 AM11/19/13
to discuss...@googlegroups.com
Hi Gregg Ganley,
I have tried to build and run AppRTCDemo using the link you provided (https://github.com/gandg/webrtc-ios). I can successfully build it, but when I run it on an iPhone, no video (local or remote) is rendered on the view. My question is: do I need to make any changes to your code? You mentioned some instructions for building the libs myself in the README file. Are those instructions required to run the app? Will you please help me run the AppRTCDemo app?
Thanks & regards
Ummer Farooque

Gregg Ganley

Nov 25, 2013, 1:27:28 PM11/25/13
to discuss...@googlegroups.com
Hello Ummer,
 
I have not tested on an iPhone; it could be that the placement of the window is out of view. You should not need to make changes to the code or libraries. No, the library building is for folks who may need to tweak the renderer or make other low-level changes.
 
--Gregg

Ummer farooque M

Nov 26, 2013, 12:33:28 AM11/26/13
to discuss...@googlegroups.com
Hi Gregg ganley,

Thanks for the fast response. Have you successfully connected and video chatted with https://apprtc.appspot.com/? I could not see any video on the iOS device; I got only audio, and only the text messages on the view that the original Google code already showed. If you have video chat working, will you please provide some instructions? Please help me; I hope for a positive response from you.

Thanks& Regards,
Ummer

Mike K

Nov 27, 2013, 3:25:25 PM11/27/13
to discuss...@googlegroups.com
I can confirm that the video works on an iPhone; the problem is that, in my case, the remote video shows only after VAdapt Frame: 0/6000. Does anyone know if it is possible to force a refresh of whatever happens after Frame 0/6000? Maybe a FIR request or something like that.
Is it possible to run in a debug mode to trace what happens inside the libraries?

Reg M

On Monday, 25 November 2013 19:27:28 UTC+1, Gregg Ganley wrote:

Ummer farooque M

Dec 1, 2013, 11:49:10 PM12/1/13
to discuss...@googlegroups.com
Hi Mike,
I can't get video working on an iPhone device. Will you please give some instructions for getting video to work? I get only text messages on the view; no video is showing. Please help me.

Thanks & regards
Ummer

Gregg Ganley

Dec 2, 2013, 1:48:16 PM12/2/13
to discuss...@googlegroups.com
Reg,

A few posts earlier, iain posted this on how to enable debug messages in the libraries:

...As for the logging: I turned it on in talk/base/logging.cc at line 94 by adding #define _DEBUG 1 …

This was very helpful for me.
--Gregg

Gregg Ganley

Dec 2, 2013, 1:50:16 PM12/2/13
to discuss...@googlegroups.com
Ummer,

This may be a problem with your wifi router's firewall blocking traffic. I find that my corporate wifi router allows both audio and video, whereas my home wifi router by default was blocking the video. I have not had time to figure out which home router setting is causing the issue.

--Gregg

Ummer farooque M

Dec 3, 2013, 3:39:11 AM12/3/13
to discuss...@googlegroups.com
Hi Gregg & Mike
Thanks for your help...

Rahul Behera

Dec 12, 2013, 12:50:17 PM12/12/13
to discuss...@googlegroups.com
Gregg & Bridger -- thanks for your support. I can confirm video working with your project (finally!). Also, how did you manage to reduce all the targets to just AppRTCDemo? Any time I have ninja generate an xcodeproj, there are so many targets that my machine hangs periodically when indexing, even with an SSD. With your project, however, I see only one target, which is amazing. Please share how you managed that; I want to make an aggregate target so that we can pull the latest WebRTC and push it into Xcode with the targets set up correctly.

Bridger Maxwell

Dec 12, 2013, 9:38:00 PM12/12/13
to discuss...@googlegroups.com, rbe...@gmail.com
I don't know the correct way to get this working completely in Xcode. Instead, I have decided to let ninja build AppRTCDemo; as a side effect, the libraries one needs to link against get built. Then I set up my Xcode project to link to those libraries. It was quite a process, and it is totally messy and fragile. Here are some of the details if one wants to follow this perilous path:

My directory is set up like this

/iPad/MyProject.xcodeproj
/trunk/out_ios/
/trunk/out_sim/
/trunk/talk/
/trunk/webrtc/
/trunk/***all of those other files that gclient sets up***

I added this script to the build phases of my project (notice that I installed the google build tools to ~/bin/)

------------
# This script runs ninja to build the WEBRTC libraries. It then collects a list of those libraries for linking later
# The ninja files must already be set up at both trunk/out_sim or trunk/out_ios using gclient. See the README for updating those ninja files.

WEBRTC_CONFIGURATION="Release" #${CONFIGURATION}

if [ $PLATFORM_NAME = "iphonesimulator" ];
then
    WEBRTC_BUILD_DIR="${PROJECT_DIR}/../trunk/out_sim/${WEBRTC_CONFIGURATION}/"
    echo "Building for simulator"
else
    WEBRTC_BUILD_DIR="${PROJECT_DIR}/../trunk/out_ios/${WEBRTC_CONFIGURATION}/"
    echo "Building for device"
fi

CURRENT_BUILD_DIR="${PROJECT_DIR}/../trunk/out_current"

# Link the current build dir to point to the correct output folder. The current build dir can be used as a framework search path
rm -f $CURRENT_BUILD_DIR
ln -s $WEBRTC_BUILD_DIR $CURRENT_BUILD_DIR

cd $WEBRTC_BUILD_DIR
~/bin/depot_tools/ninja AppRTCDemo

# Find all libraries in this folder and print out a useful message one can use to set up the "other linker flags" for either ios or sim
WEBRTC_LIBRARIES=`ls $WEBRTC_BUILD_DIR | grep '\.a$' | sed 's/lib/-l/' | sed 's/\.a//'`
echo "Other linker flags should be set to: "
echo $WEBRTC_LIBRARIES
------------

It builds for the current platform and makes a link to the current platform’s build folder. This link is at trunk/out_current. We can use this folder to link to the libraries.

In myProject, I set Library Search Paths (LIBRARY_SEARCH_PATHS) to $(inherited) $(SRCROOT)/../trunk/out_current/

Also in myProject I use the Other Linker Flags (OTHER_LDFLAGS) to pass in the names of the libraries to link. It needs a different setting for the simulator or the device. 

For example, OTHER_LDFLAGS for Debug, Any iOS Simulator SDK is set to:
-lCNG -lG711 -lG722 -lNetEq -lNetEq4 -lPCM16B -lacm2 -laudio_coding_module -laudio_conference_mixer -laudio_device -laudio_processing -laudio_processing_sse2 -lbitrate_controller -lcommon_audio -lcommon_audio_sse2 -lcommon_video -lcrnspr -lcrnss -lcrnssckbi -lcrssl -lexpat -liLBC -liSAC -liSACFix -licudata -licui18n -licuuc -ljingle -ljingle_media -ljingle_p2p -ljingle_peerconnection -ljingle_peerconnection_objc -ljingle_sound -ljsoncpp -lmedia_file -lnss_static -lopus -lpaced_sender -lrbe_components -lremote_bitrate_estimator -lrtp_rtcp -lsqlite_regexp -lsrtp -lsystem_wrappers -lusrsctplib -lvideo_capture_module -lvideo_coding_utility -lvideo_engine_core -lvideo_processing -lvideo_processing_sse2 -lvideo_render_module -lvoice_engine -lvpx -lvpx_asm_offsets_vp8 -lvpx_intrinsics_mmx -lvpx_intrinsics_sse2 -lvpx_intrinsics_ssse3 -lwebrtc_i420 -lwebrtc_opus -lwebrtc_utility -lwebrtc_video_coding -lwebrtc_vp8 -lyuv

OTHER_LDFLAGS for Debug, Any iOS SDK is set to:
-lCNG -lG711 -lG722 -lNetEq -lNetEq4 -lPCM16B -lacm2 -laudio_coding_module -laudio_conference_mixer -laudio_device -laudio_processing -laudio_processing_neon -lbitrate_controller -lcommon_audio -lcommon_audio_neon -lcommon_video -lcrnspr -lcrnss -lcrnssckbi -lcrssl -lexpat -liLBC -liSAC -liSACFix -licudata -licui18n -licuuc -lisac_neon -ljingle -ljingle_media -ljingle_p2p -ljingle_peerconnection -ljingle_peerconnection_objc -ljingle_sound -ljsoncpp -lmedia_file -lnss_static -lopus -lpaced_sender -lrbe_components -lremote_bitrate_estimator -lrtp_rtcp -lsqlite_regexp -lsrtp -lsystem_wrappers -lusrsctplib -lvideo_capture_module -lvideo_coding_utility -lvideo_engine_core -lvideo_processing -lvideo_render_module -lvoice_engine -lvpx -lvpx_asm_offsets_vp8 -lvpx_asm_offsets_vpx_scale -lwebrtc_i420 -lwebrtc_opus -lwebrtc_utility -lwebrtc_video_coding -lwebrtc_vp8 -lyuv

That list is generated by the build script I included above; it requires manual copying and pasting into the Xcode settings.
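
As a sanity check on what the script prints, its sed pipeline simply rewrites each static-library filename into a linker flag, which can be verified on a single name:

```shell
# libfoo.a -> -lfoo, the same transform used in the build script above.
echo "libjingle_peerconnection_objc.a" | sed 's/lib/-l/' | sed 's/\.a//'
# -> -ljingle_peerconnection_objc
```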

I hope this helps! It is totally hacky, but it allows me to stay in Xcode. Ninja is very fast, so it is probably good that it builds the whole of WebRTC leaving only my project (which is much smaller, by comparison) to Xcode.

- Bridger Maxwell

Ken OKABE

Dec 17, 2013, 12:12:33 AM12/17/13
to discuss...@googlegroups.com, rbe...@gmail.com
Thanks a lot Bridger Maxwell and others.
I'm a bit new and confused, so please help me grasp the whole picture of what's going on.
Here is my understanding; please correct me if I'm wrong:

As of Dec 2013,
  • Bridger Maxwell posted ObjC code to fix the issue in Oct.
  • Using the fix, Gregg Ganley published a working Xcode project for iOS AppRTCDemo with audio and VIDEO at full duplex: https://github.com/gandg/webrtc-ios
  • Gregg's Xcode project is only usable on real iOS devices, not on iOS simulators. I suppose the jingle libs are compiled for ARMv7, not for x86, so we need to use ninja and then Xcode as Bridger suggested above.

To build the video and jingle libs yourself, then XCode iOS app:

  • Copy the webrtc_obj files into this dir ...projdir.../trunk/talk/app/webrtc/objc
  • build the AppRTCDemo: wrios && gclient runhooks && ninja -C out_ios/Debug AppRTCDemo
  • copy libs into Xcode build cp ...projdir.../trunk/out_ios/Debug/libvideo_render_module.a ...projdir.../ios_app/webrtc-ios/ios-example/libs
  • I am not using the trunk dir provided in this repo. Using the google one instead.

So, here is my question:

1. Am I correct so far?

2. In Gregg's github project, https://github.com/gandg/webrtc-ios ,

a)  Copy the webrtc_obj files into this dir ...projdir.../trunk/talk/app/webrtc/objc

Where did the webrtc_obj (webRTC_obj) files come from? I guess these files are patched with Bridger's fix. Is there a diff or any instructions?

b)  build the AppRTCDemo: wrios && gclient runhooks && ninja -C out_ios/Debug AppRTCDemo

Is this identical to the whole process in Bridger's post above?

c)  copy libs into Xcode build cp ...projdir.../trunk/out_ios/Debug/libvideo_render_module.a ...projdir.../ios_app/webrtc-ios/ios-example/libs

What does this mean??

copy 

  • libjingle.a
  • libjingle_media.a
  • libjingle_peerconnection_objc.a
  • libvideo_render_module.a

the 4 files under ./trunk/out_ios/Debug/ 

to

../ios_app/webrtc-ios/ios-example/libs ?

d)  I am not using the trunk dir provided in this repo. Using the google one instead.

What does this mean? What is the <google one>?

Thank you.

Ken

Gregg Ganley

Dec 17, 2013, 10:38:43 AM12/17/13
to discuss...@googlegroups.com, rbe...@gmail.com
Ken

Good questions below, I will try and answer in line.


On Tuesday, December 17, 2013 12:12:33 AM UTC-5, Ken OKABE wrote:
Thanks a lot Bridger Maxwell and others.
I'm a bit new and confused, so please give me advice to grab the whole picture of what's going on.
Here is my understanding, and please correct me if I'm wrong:

As of Dec 2013,
>> It appears the comment of "very close" is taken from another Google thread dated around June 2013. So hopefully Google will have an out-of-the-box solution, but in the meantime, please feel free to use my offering.
  • Bridger Maxwell has posted ObjC code to fix the issue on Oct.
  • Gregg's Xcode project is only available for real iOS devices, not for iOS simulators. I suppose the jingle libs are compiled for Arm7 not for x86, so we need to use ninja then Xcode as Bridger has suggested as above.
>> Correct 

To build the video and jingle libs yourself, then XCode iOS app:

  • Copy the webrtc_obj files into this dir ...projdir.../trunk/talk/app/webrtc/objc
  • build the AppRTCDemo: wrios && gclient runhooks && ninja -C out_ios/Debug AppRTCDemo
  • copy libs into Xcode build cp ...projdir.../trunk/out_ios/Debug/libvideo_render_module.a ...projdir.../ios_app/webrtc-ios/ios-example/libs
  • I am not using the trunk dir provided in this repo. Using the google one instead.

So, here is my question:

1. Am I correct so far?

>> Yes 

2. In Gregg's github project, https://github.com/gandg/webrtc-ios ,

a)  Copy the webrtc_obj files into this dir ...projdir.../trunk/talk/app/webrtc/objc

Where did the webrtc_obj (webRTC_obj) files come from? I guess these files are patched with Bridger's fix. Is there a diff or any instructions?

>> The webrtc_obj directory is in my GitHub repo, and it contains the source files for the needed jingle and other libs. The source code is mostly from Bridger, but with some crucial tweaks from this thread. It would be great if Google would pull them into their main trunk.
 

b)  build the AppRTCDemo: wrios && gclient runhooks && ninja -C out_ios/Debug AppRTCDemo

Is this identical to the whole process in Bridger's post above?

>> Should be. 

c)  copy libs into Xcode build cp ...projdir.../trunk/out_ios/Debug/libvideo_render_module.a ...projdir.../ios_app/webrtc-ios/ios-example/libs

What does this mean??

>> When the ninja build completes without error, copy the resultant object libraries back into the Xcode project so that your changes can be linked in and run as part of the app. In general, most folks will not need to change the jingle lib source code; I wanted more debug output, so I turned on various verbose output flags.

copy 

  • libjingle.a
  • libjingle_media.a
  • libjingle_peerconnection_objc.a
  • libvideo_render_module.a

the 4 files under ./trunk/out_ios/Debug/ 

to

../ios_app/webrtc-ios/ios-example/libs ?

>> Yes, you got it!
 

d)  I am not using the trunk dir provided in this repo. Using the google one instead.

What does this mean? What is the <google one>?
>> I forked this repo originally, so it comes (when you clone it) with the Google trunk source code, which is a few months old and possibly out of date. Based on this, I would not use this trunk code, but instead recommend pulling the latest from Google.

I have updated my repo README to be a little clearer based on these questions, thanks Ken!

Ken OKABE

Dec 17, 2013, 9:13:39 PM12/17/13
to discuss...@googlegroups.com, rbe...@gmail.com
Gregg,

Thanks a lot for your input and your time.

What confused me, due to my lack of knowledge, is that this example iOS app has a core library located in another place: ../trunk/talk/app/webrtc/objc.

So, I now clearly understand that this is the WebRTC library, and that the example also shows how to bundle the (jingle) library.

https://code.google.com/p/webrtc/source/browse/trunk/talk/app/webrtc/objc/README?r=5302

This README makes sense to me what's going on.

So, here's what happened when I tried (continued in the next post to keep the topic clear).

Ken OKABE

Dec 17, 2013, 10:42:40 PM12/17/13
to discuss...@googlegroups.com, rbe...@gmail.com
Basically, I want to run the demo in the simulator.

In the simulator,
I succeeded in running the default repo from

https://code.google.com/p/webrtc/source/browse/trunk/talk/app/webrtc/objc/?r=5302
and
https://code.google.com/p/webrtc/source/browse/trunk/?r=5302#trunk%2Ftalk%2Fexamples%2Fios%2FAppRTCDemo

However, the RTC connection fails somehow.

Then, I replaced
.../trunk/talk/app/webrtc/objc
with
the webrtc_obj dir in your project ( https://github.com/gandg/webrtc-ios )

I got this error:

$ ninja -C out_sim/Debug iossim AppRTCDemo
ninja: Entering directory `out_sim/Debug'
[452/2099] ACTION(host) Generating header
2013-12-18 10:45:10.714 class-dump[12893:d07] Unknown load command: 0x00000024
2013-12-18 10:45:10.734 class-dump[12893:d07] Unknown load command: 0x0000002a
2013-12-18 10:45:10.735 class-dump[12893:d07] Unknown load command: 0x00000026
2013-12-18 10:45:10.736 class-dump[12893:d07] Unknown load command: 0x00000029
2013-12-18 10:45:10.737 class-dump[12893:d07] Unknown load command: 0x0000002b
[552/2099] OBJCXX obj.host/testing/iossim/iossim.iossim.o
In file included from ../../testing/iossim/iossim.mm:40:
obj.host/testing/iossim/iossim.gen/iossim/iPhoneSimulatorRemoteClient.h:37:1: warning: no 'assign', 'retain', or 'copy' attribute is specified - 'assign' is assumed [-Wobjc-property-no-attribute]
@property(nonatomic) id <DTiPhoneSimulatorSessionDelegate> delegate; // @synthesize delegate=_delegate;
^
obj.host/testing/iossim/iossim.gen/iossim/iPhoneSimulatorRemoteClient.h:37:1: warning: default property attribute 'assign' not appropriate for non-GC object [-Wobjc-property-no-attribute]
2 warnings generated.
[2097/2099] OBJC obj/talk/examples/ios/AppRTCDemo/AppRTCDemo.APPRTCAppDelegate.o
../../talk/examples/ios/AppRTCDemo/APPRTCAppDelegate.m:47:17: warning: method 'peerConnection:addedDataChannel:' in protocol not implemented [-Wprotocol]
@implementation PCObserver {
                ^
../../talk/app/webrtc/objc/public/RTCPeerConnectionDelegate.h:56:1: note: method 'peerConnection:addedDataChannel:' declared here
- (void)peerConnection:(RTCPeerConnection *)peerConnection
^
../../talk/examples/ios/AppRTCDemo/APPRTCAppDelegate.m:41:12: note: required for direct or indirect protocol 'RTCPeerConnectionDelegate'
@interface PCObserver : NSObject<RTCPeerConnectionDelegate>
           ^
1 warning generated.
[2098/2099] LINK AppRTCDemo.app/AppRTCDemo, POSTBUILDS
FAILED: ../../third_party/llvm-build/Release+Asserts/bin/clang -framework Foundation -framework UIKit -framework IOKit -framework Security -framework SystemConfiguration -framework AVFoundation -framework CoreMedia -framework CoreVideo -framework OpenGLES -framework QuartzCore -Wl,-search_paths_first -Wl,-ObjC -mios-simulator-version-min=6.0 -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator7.0.sdk -arch i386 -L. -o AppRTCDemo.app/AppRTCDemo obj/talk/examples/ios/AppRTCDemo/AppRTCDemo.APPRTCAppClient.o obj/talk/examples/ios/AppRTCDemo/AppRTCDemo.APPRTCAppDelegate.o obj/talk/examples/ios/AppRTCDemo/AppRTCDemo.APPRTCViewController.o obj/talk/examples/ios/AppRTCDemo/AppRTCDemo.GAEChannelClient.o obj/talk/examples/ios/AppRTCDemo/AppRTCDemo.main.o libjingle_peerconnection_objc.a libjingle_peerconnection.a libjingle.a libexpat.a libjsoncpp.a libcrnss.a libnss_static.a libcrnspr.a libsqlite_regexp.a libicui18n.a libicuuc.a libicudata.a libcrnssckbi.a libcrssl.a libjingle_media.a libyuv.a libvideo_capture_module.a libwebrtc_utility.a libaudio_coding_module.a libCNG.a libcommon_audio.a libsystem_wrappers.a libcommon_audio_sse2.a libG711.a libG722.a libiLBC.a libiSAC.a libiSACFix.a libPCM16B.a libNetEq.a libwebrtc_opus.a libopus.a libacm2.a libNetEq4.a libmedia_file.a libwebrtc_video_coding.a libwebrtc_i420.a libcommon_video.a libvideo_coding_utility.a libwebrtc_vp8.a libvpx.a libvpx_asm_offsets_vp8.a libvpx_intrinsics_mmx.a libvpx_intrinsics_sse2.a libvpx_intrinsics_ssse3.a libvideo_render_module.a libvideo_engine_core.a librtp_rtcp.a libpaced_sender.a libremote_bitrate_estimator.a librbe_components.a libbitrate_controller.a libvideo_processing.a libvideo_processing_sse2.a libvoice_engine.a libaudio_conference_mixer.a libaudio_processing.a libaudio_processing_sse2.a libaudio_device.a libjingle_sound.a libjingle_p2p.a libsrtp.a  -framework Foundation -lstdc++ 
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator7.0.sdk/usr/lib/libsqlite3.dylib -framework AudioToolbox -framework CoreAudio
Undefined symbols for architecture i386:
  "_OBJC_CLASS_$_RTCDataChannel", referenced from:
      objc-class-ref in libjingle_peerconnection_objc.a(libjingle_peerconnection_objc.RTCPeerConnection.o)
      objc-class-ref in libjingle_peerconnection_objc.a(libjingle_peerconnection_objc.RTCPeerConnectionObserver.o)
ld: symbol(s) not found for architecture i386
clang: error: linker command failed with exit code 1 (use -v to see invocation)
[2098/2099] MACTOOL copy-bundle-resource ../../t...ios/AppRTCDemo/en.lproj/APPRTCViewController.xib
2013-12-18 10:51:05.658 Interface Builder Cocoa Touch Tool[16264:303] CFPreferences: user home directory at file:///Users/ken/Library/Application%20Support/iPhone%20Simulator/User/ is unavailable. User domains will be volatile.
ninja: build stopped: subcommand failed.

So, I also want to replace
.../trunk/talk/examples/android
with the corresponding directory in your project ( https://github.com/gandg/webrtc-ios ).

Here, I start from scratch to avoid a potential mess.

cd projectRootDir
Delete trunk
gclient sync
cd projectRootDir/trunk
export GYP_DEFINES="build_with_libjingle=1 build_with_chromium=0 libjingle_objc=1"
export GYP_GENERATORS="ninja"
export GYP_DEFINES="$GYP_DEFINES OS=ios target_arch=ia32"
export GYP_GENERATOR_FLAGS="$GYP_GENERATOR_FLAGS output_dir=out_sim"
export GYP_CROSSCOMPILE=1
gclient runhooks

//Here replace libjingle and exampleApp code
replace
projectRootDir/trunk/talk/app/webrtc/objc
with
https://github.com/gandg/webrtc-ios/tree/master/webRTC_obj

replace
projectRootDir/trunk/talk/examples/android
 with
https://github.com/gandg/webrtc-ios/tree/master/trunk/talk/examples/android
//------------------------------------------

Finally, compile

$ ninja -C out_sim/Debug iossim AppRTCDemo

ninja: Entering directory `out_sim/Debug'
[362/2099] ACTION(host) Generating header
2013-12-18 12:33:16.150 class-dump[19444:d07] Unknown load command: 0x00000024
2013-12-18 12:33:16.166 class-dump[19444:d07] Unknown load command: 0x0000002a
2013-12-18 12:33:16.167 class-dump[19444:d07] Unknown load command: 0x00000026
2013-12-18 12:33:16.168 class-dump[19444:d07] Unknown load command: 0x00000029
2013-12-18 12:33:16.169 class-dump[19444:d07] Unknown load command: 0x0000002b
[500/2099] OBJCXX obj.host/testing/iossim/iossim.iossim.o
In file included from ../../testing/iossim/iossim.mm:40:
obj.host/testing/iossim/iossim.gen/iossim/iPhoneSimulatorRemoteClient.h:37:1: warning: no 'assign', 'retain', or 'copy' attribute is specified - 'assign' is assumed [-Wobjc-property-no-attribute]
@property(nonatomic) id <DTiPhoneSimulatorSessionDelegate> delegate; // @synthesize delegate=_delegate;
^
obj.host/testing/iossim/iossim.gen/iossim/iPhoneSimulatorRemoteClient.h:37:1: warning: default property attribute 'assign' not appropriate for non-GC object [-Wobjc-property-no-attribute]
2 warnings generated.
[2096/2099] OBJC obj/talk/examples/ios/AppRTCDemo/AppRTCDemo.APPRTCAppDelegate.o
../../talk/examples/ios/AppRTCDemo/APPRTCAppDelegate.m:47:17: warning: method 'peerConnection:addedDataChannel:' in protocol not implemented [-Wprotocol]
@implementation PCObserver {
                ^
../../talk/app/webrtc/objc/public/RTCPeerConnectionDelegate.h:56:1: note: method 'peerConnection:addedDataChannel:' declared here
- (void)peerConnection:(RTCPeerConnection *)peerConnection
^
../../talk/examples/ios/AppRTCDemo/APPRTCAppDelegate.m:41:12: note: required for direct or indirect protocol 'RTCPeerConnectionDelegate'
@interface PCObserver : NSObject<RTCPeerConnectionDelegate>
           ^
1 warning generated.
[2098/2099] LINK AppRTCDemo.app/AppRTCDemo, POSTBUILDS
FAILED: ../../third_party/llvm-build/Release+Asserts/bin/clang -framework Foundation -framework UIKit -framework IOKit -framework Security -framework SystemConfiguration -framework AVFoundation -framework CoreMedia -framework CoreVideo -framework OpenGLES -framework QuartzCore -Wl,-search_paths_first -Wl,-ObjC -mios-simulator-version-min=6.0 -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator7.0.sdk -arch i386 -L. -o AppRTCDemo.app/AppRTCDemo obj/talk/examples/ios/AppRTCDemo/AppRTCDemo.APPRTCAppClient.o obj/talk/examples/ios/AppRTCDemo/AppRTCDemo.APPRTCAppDelegate.o obj/talk/examples/ios/AppRTCDemo/AppRTCDemo.APPRTCViewController.o obj/talk/examples/ios/AppRTCDemo/AppRTCDemo.GAEChannelClient.o obj/talk/examples/ios/AppRTCDemo/AppRTCDemo.main.o libjingle_peerconnection_objc.a libjingle_peerconnection.a libjingle.a libexpat.a libjsoncpp.a libcrnss.a libnss_static.a libcrnspr.a libsqlite_regexp.a libicui18n.a libicuuc.a libicudata.a libcrnssckbi.a libcrssl.a libjingle_media.a libyuv.a libvideo_capture_module.a libwebrtc_utility.a libaudio_coding_module.a libCNG.a libcommon_audio.a libsystem_wrappers.a libcommon_audio_sse2.a libG711.a libG722.a libiLBC.a libiSAC.a libiSACFix.a libPCM16B.a libNetEq.a libwebrtc_opus.a libopus.a libacm2.a libNetEq4.a libmedia_file.a libwebrtc_video_coding.a libwebrtc_i420.a libcommon_video.a libvideo_coding_utility.a libwebrtc_vp8.a libvpx.a libvpx_asm_offsets_vp8.a libvpx_intrinsics_mmx.a libvpx_intrinsics_sse2.a libvpx_intrinsics_ssse3.a libvideo_render_module.a libvideo_engine_core.a librtp_rtcp.a libpaced_sender.a libremote_bitrate_estimator.a librbe_components.a libbitrate_controller.a libvideo_processing.a libvideo_processing_sse2.a libvoice_engine.a libaudio_conference_mixer.a libaudio_processing.a libaudio_processing_sse2.a libaudio_device.a libjingle_sound.a libjingle_p2p.a libsrtp.a  -framework Foundation -lstdc++ 
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator7.0.sdk/usr/lib/libsqlite3.dylib -framework AudioToolbox -framework CoreAudio
Undefined symbols for architecture i386:
  "_OBJC_CLASS_$_RTCDataChannel", referenced from:
      objc-class-ref in libjingle_peerconnection_objc.a(libjingle_peerconnection_objc.RTCPeerConnection.o)
      objc-class-ref in libjingle_peerconnection_objc.a(libjingle_peerconnection_objc.RTCPeerConnectionObserver.o)
ld: symbol(s) not found for architecture i386
clang: error: linker command failed with exit code 1 (use -v to see invocation)
[2098/2099] MACTOOL copy-bundle-resource ../../talk/examples/ios/AppRTCDemo/en.lproj/APPRTCViewController.xib
2013-12-18 12:38:57.560 Interface Builder Cocoa Touch Tool[23007:303] CFPreferences: user home directory at file:///Users/ken/Library/Application%20Support/iPhone%20Simulator/User/ is unavailable. User domains will be volatile.
ninja: build stopped: subcommand failed.
--------------------------------------------------------------------

I'm not sure why, but the build fails at the link step for the iOS Simulator (i386): the RTCDataChannel Objective-C class is an undefined symbol.
Any thoughts?

Ken

Rahul Behera

unread,
Dec 18, 2013, 2:28:39 PM12/18/13
to Ken OKABE, discuss...@googlegroups.com
Out of curiosity, how do you plan on actually testing video on the iOS Simulator? As far as I know, you can't get a video stream from the camera on the simulator.

Ken OKABE

unread,
Dec 18, 2013, 3:27:52 PM12/18/13
to discuss...@googlegroups.com, Ken OKABE
Oh, the iOS Simulator can't get a video stream from the camera?

I use an Android emulator (Genymotion), and it connects to the camera and mic of the host machine (a MacBook Air).

jordi domenech flores

unread,
Dec 19, 2013, 3:38:22 PM12/19/13
to discuss...@googlegroups.com, rbe...@gmail.com

Gregg,

Thanks a lot for your ongoing work on video for iOS (and to everyone here digging into this!). Rather than just making Google's AppRTCDemo work with video, in my case I'm more interested in understanding what you did and what you added or changed. So I've started going through your GitHub repo history, since I think code history (= evolution) is great for learning and understanding. However, there's no history under the webRTC_obj folder (https://github.com/gandg/webrtc-ios/tree/master/webRTC_obj), and I guess the point of it all is inside that folder, right? I'd really appreciate it if you could explain your work there, e.g. which files you modified, or whether you ran into any issues along the way. I understand that you didn't make any changes outside the demo, in the core, did you?

Well, that's all. Again, thanks for your time and good job!

Ken OKABE

unread,
Dec 19, 2013, 4:38:33 PM12/19/13
to discuss...@googlegroups.com, rbe...@gmail.com
jordi,

Did you try to build his code for iOS devices and simulators?
What was the result?

Ken

jordi domenech flores

unread,
Dec 19, 2013, 5:25:14 PM12/19/13
to discuss...@googlegroups.com
hi Ken,

Actually, I started from scratch and managed to understand the peer connection flow. I got it working successfully from device to simulator, but only voice, with custom signaling. In the beginning I ran the demo (audio only) with no problems. BTW, video should work from device to simulator.

Ken OKABE

unread,
Dec 19, 2013, 5:38:24 PM12/19/13
to discuss...@googlegroups.com
jordi,

Sounds great.
As I mentioned before your first post, I tried to run the demo with Gregg's project with webRTC_obj (https://github.com/gandg/webrtc-ios/tree/master/webRTC_obj), but failed to build on simulators.

I also want to grasp the whole process of making it work.

Do you mean that your working project (built from scratch following the info in this thread), running on device and simulator, differs from Gregg's project?
If so, would it be possible to share your knowledge of building a working project from scratch, and perhaps the result, on GitHub?

Thanks.

Ken

jordi domenech flores

unread,
Dec 19, 2013, 7:20:10 PM12/19/13
to discuss...@googlegroups.com


Yes, it's a project written from scratch, nothing to do with the demo.
Actually, sharing the code is on my todo list... but I would first like to understand and complete the video part.

Sajid Hussain

unread,
Jan 2, 2014, 4:56:50 AM1/2/14
to discuss...@googlegroups.com, rbe...@gmail.com
Hi Greg,

From this discussion and from your README file, I understand that you made some changes and wrote some code to achieve the video capturing and rendering functionality.

I understand that the underlying code changes are compiled up in the form of these static libraries:

    - ios-example/libs/libjingle.a
    - ios-example/libs/libjingle_media.a
    - ios-example/libs/libjingle_peerconnection_objc.a
    - ios-example/libs/libvideo_render_module.a

In your repository: https://github.com/gandg/webrtc-ios, there are 2 projects: ios-example/AppRTCDemo.xcodeproj & trunk/all.xcodeproj

'AppRTCDemo' is linked against 63 static libraries, including the 4 above. 'all' is a comprehensive project with targets that build 6 libjingle static libraries, libxmpphelp, and AppRTCDemo. But the committed code from which these libraries are built is based on an older version of the Google WebRTC implementation.

I have applied the tweaks discussed in this thread to the original implementation, but it didn't work as is. There must be more tweaks that you made to get it working.

It would be very helpful if you could share the tweaked code base.

Thanks

SAJID HUSSAIN

Kimberley Hansen

unread,
May 18, 2014, 12:28:35 AM5/18/14
to discuss...@googlegroups.com, rbe...@gmail.com
Hi there Greg,

First of all, thanks for your awesome work on webrtc-ios! Second of all, I'm a little stuck :( I can successfully build/run, but:

1) Trying to connect to the webrtc page results in a black screen with a white bar at the bottom. Refreshing a few times results in a "this room is full" message. This sounds similar to what was mentioned above, but I've tried on multiple networks and nothing seems to work.
2) Running the app on my phone results in a grey screen with a white bar at the bottom, and a "Loading... (tap to dismiss)" message in the middle of the screen.

This is my first Xcode project, so I'm feeling pretty out of my depth :( Any help/thoughts you could shoot my way would be very much appreciated.

Setup: MacBook Air/Mavericks, iPhone 5, latest iOS, Xcode 5.11

Best regards,

Kim Hansen

G-Pacer SW

unread,
May 19, 2014, 1:34:09 PM5/19/14
to discuss...@googlegroups.com
Kim,

This project is very challenging, and I do not recommend it for someone who is just starting out in iOS.
A few items that might help:

- The web browser and the iPhone app can only try to connect once. If it doesn't work (because of the network, an app crash, etc.), you must refresh the web browser to obtain a new room number. You must also stop and restart the iOS app each time.
- The log is filled with helpful debug output; be sure to look through it and use it to help debug your issues.

Good luck,
—Gregg


Nagendra Mahto

unread,
Sep 27, 2014, 4:02:22 AM9/27/14
to discuss...@googlegroups.com
Hi Mike,

I wrote some code for getting the video source, and it's working fine for me.
See the code below.


    self.videodevice = [self frontFacingCameraIfAvailable];
    NSLog(@"***DebugLine5");

    NSString *cameraID = nil;
    if (self.videodevice) {
        cameraID = [self.videodevice localizedName];
    }
    NSLog(@"***DebugLine6");

    self.videocapturer = [RTCVideoCapturer capturerWithDeviceName:cameraID];
    NSLog(@"***DebugLine7");

    self.videoSource = [self.peerConnectionFactory videoSourceWithCapturer:self.videocapturer constraints:nil];
    NSLog(@"***DebugLine8");

    self.localVideoTrack = [self.peerConnectionFactory videoTrackWithID:vieoid source:self.videoSource];
    NSLog(@"***DebugLine9");

    if (self.localVideoTrack) {
        [self.localmediaStream addVideoTrack:self.localVideoTrack];
    }

    if (shouldReceiveVideo) {
        [self.peerConnection addStream:self.localmediaStream constraints:constraints];
    }

On Saturday, October 5, 2013 1:47:18 AM UTC+5:30, Mike Anderson wrote:
I've been playing around with the (not officially landed) video capabilities in the iOS API but I am having trouble engaging sending video.
If anyone working on this implementation could shed some light I would be really grateful.

When I create a RTCVideoCapturer object I create it using device.uniqueId where device is my front camera grabbed by iterating through [AVCaptureDevice devicesWithMediaType:] to get a front facing camera. I then create the RTCVideoSource using that capturer and an empty constraints object.

This seems to create a video capturer that can be fed to the video stream, but no video flows through the video track.

Am I using the correct device name? should I be using device.localizedName instead?
Does the RTCVideoSource need additional constraints or is an empty media constraint set fine?

Thanks
-Mike

