void VideoSource::Initialize(
    const webrtc::MediaConstraintsInterface* constraints) {
  std::vector<cricket::VideoFormat> formats;
  // Start from whatever formats the capturer reports, if any.
  if (video_capturer_->GetSupportedFormats() &&
      video_capturer_->GetSupportedFormats()->size() > 0) {
    formats = *video_capturer_->GetSupportedFormats();
  }
  ...
-Mike
I'm a bit new and confused, so please give me some advice so I can get the whole picture of what's going on.
Here is my understanding, and please correct me if I'm wrong:
As of Dec 2013,
webrtc - Revision 5293: /trunk/talk/examples/ios/AppRTCDemo
http://webrtc.googlecode.com/svn/trunk/talk/examples/ios/AppRTCDemo/
is very close to supporting video on iOS, but there isn't a way to render the video using the ObjC interface.
- Bridger Maxwell posted ObjC code to fix the issue in October.
- Using that fix, Gregg Ganley has published a working Xcode project for the iOS AppRTCDemo with full-duplex audio and VIDEO: https://github.com/gandg/webrtc-ios
- Gregg's Xcode project only runs on real iOS devices, not on the iOS simulator. I suppose the jingle libs are compiled for ARMv7, not for x86, so we need to use ninja and then Xcode, as Bridger suggested above.
To build the video and jingle libs yourself, then the Xcode iOS app:
- Copy the webrtc_obj files into this dir ...projdir.../trunk/talk/app/webrtc/objc
- build the AppRTCDemo: wrios && gclient runhooks && ninja -C out_ios/Debug AppRTCDemo
- copy libs into Xcode build cp ...projdir.../trunk/out_ios/Debug/libvideo_render_module.a ...projdir.../ios_app/webrtc-ios/ios-example/libs
- I am not using the trunk dir provided in this repo. Using the google one instead.
So, here is my question:
1. Am I correct so far?
2. In Gregg's github project, https://github.com/gandg/webrtc-ios ,
a) Copy the webrtc_obj files into this dir ...projdir.../trunk/talk/app/webrtc/objc
Where do <the webrtc_obj (webRTC_obj) files> come from? I guess these files are patched with Bridger's fix. Is there any diff or instructions?
b) build the AppRTCDemo: wrios && gclient runhooks && ninja -C out_ios/Debug AppRTCDemo
Is this identical to the whole process in Bridger's post above?
c) copy libs into Xcode build cp ...projdir.../trunk/out_ios/Debug/libvideo_render_module.a ...projdir.../ios_app/webrtc-ios/ios-example/libs
What does this mean? Copy the 4 files under ./trunk/out_ios/Debug/
- libjingle.a
- libjingle_media.a
- libjingle_peerconnection_objc.a
- libvideo_render_module.a
to ../ios_app/webrtc-ios/ios-example/libs ?
d) I am not using the trunk dir provided in this repo. Using the google one instead.
What does this mean? What is the <google one>? Thanks a lot, Bridger Maxwell and others.
So, I now clearly understand that talk/app/webrtc/objc is the WebRTC (ObjC) library, and that the example also shows how to bundle the (jingle) library.
https://code.google.com/p/webrtc/source/browse/trunk/talk/app/webrtc/objc/README?r=5302
This README makes it clear to me what's going on.
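To make that concrete, here is roughly the kind of API that README describes. This is only a minimal sketch restricted to the factory calls quoted elsewhere in this thread; the alloc/init of the factory, the cameraName string, and the track ID are my own illustrative assumptions, not taken from the post.

// Minimal sketch of the ObjC wrapper API, limited to calls that appear
// elsewhere in this thread. Factory construction, cameraName, and the
// @"video0" track ID are illustrative assumptions.
RTCPeerConnectionFactory *factory = [[RTCPeerConnectionFactory alloc] init];
NSString *cameraName = @"Front Camera";  // normally taken from an AVCaptureDevice
RTCVideoCapturer *capturer =
    [RTCVideoCapturer capturerWithDeviceName:cameraName];
RTCVideoSource *source =
    [factory videoSourceWithCapturer:capturer constraints:nil];
RTCVideoTrack *localVideoTrack =
    [factory videoTrackWithID:@"video0" source:source];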
So, here's what happened when I tried. (next post to clarify the topic)
ninja: Entering directory `out_sim/Debug'
[452/2099] ACTION(host) Generating header
2013-12-18 10:45:10.714 class-dump[12893:d07] Unknown load command: 0x00000024
2013-12-18 10:45:10.734 class-dump[12893:d07] Unknown load command: 0x0000002a
2013-12-18 10:45:10.735 class-dump[12893:d07] Unknown load command: 0x00000026
2013-12-18 10:45:10.736 class-dump[12893:d07] Unknown load command: 0x00000029
2013-12-18 10:45:10.737 class-dump[12893:d07] Unknown load command: 0x0000002b
[552/2099] OBJCXX obj.host/testing/iossim/iossim.iossim.o
In file included from ../../testing/iossim/iossim.mm:40:
obj.host/testing/iossim/iossim.gen/iossim/iPhoneSimulatorRemoteClient.h:37:1: warning: no 'assign', 'retain', or 'copy' attribute is specified - 'assign' is assumed [-Wobjc-property-no-attribute]
@property(nonatomic) id <DTiPhoneSimulatorSessionDelegate> delegate; // @synthesize delegate=_delegate;
^
obj.host/testing/iossim/iossim.gen/iossim/iPhoneSimulatorRemoteClient.h:37:1: warning: default property attribute 'assign' not appropriate for non-GC object [-Wobjc-property-no-attribute]
2 warnings generated.
[2097/2099] OBJC obj/talk/examples/ios/AppRTCDemo/AppRTCDemo.APPRTCAppDelegate.o
../../talk/examples/ios/AppRTCDemo/APPRTCAppDelegate.m:47:17: warning: method 'peerConnection:addedDataChannel:' in protocol not implemented [-Wprotocol]
@implementation PCObserver {
^
../../talk/app/webrtc/objc/public/RTCPeerConnectionDelegate.h:56:1: note: method 'peerConnection:addedDataChannel:' declared here
- (void)peerConnection:(RTCPeerConnection *)peerConnection
^
../../talk/examples/ios/AppRTCDemo/APPRTCAppDelegate.m:41:12: note: required for direct or indirect protocol 'RTCPeerConnectionDelegate'
@interface PCObserver : NSObject<RTCPeerConnectionDelegate>
^
1 warning generated.
[2098/2099] LINK AppRTCDemo.app/AppRTCDemo, POSTBUILDS
FAILED: ../../third_party/llvm-build/Release+Asserts/bin/clang -framework Foundation -framework UIKit -framework IOKit -framework Security -framework SystemConfiguration -framework AVFoundation -framework CoreMedia -framework CoreVideo -framework OpenGLES -framework QuartzCore -Wl,-search_paths_first -Wl,-ObjC -mios-simulator-version-min=6.0 -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator7.0.sdk -arch i386 -L. -o AppRTCDemo.app/AppRTCDemo obj/talk/examples/ios/AppRTCDemo/AppRTCDemo.APPRTCAppClient.o obj/talk/examples/ios/AppRTCDemo/AppRTCDemo.APPRTCAppDelegate.o obj/talk/examples/ios/AppRTCDemo/AppRTCDemo.APPRTCViewController.o obj/talk/examples/ios/AppRTCDemo/AppRTCDemo.GAEChannelClient.o obj/talk/examples/ios/AppRTCDemo/AppRTCDemo.main.o libjingle_peerconnection_objc.a libjingle_peerconnection.a libjingle.a libexpat.a libjsoncpp.a libcrnss.a libnss_static.a libcrnspr.a libsqlite_regexp.a libicui18n.a libicuuc.a libicudata.a libcrnssckbi.a libcrssl.a libjingle_media.a libyuv.a libvideo_capture_module.a libwebrtc_utility.a libaudio_coding_module.a libCNG.a libcommon_audio.a libsystem_wrappers.a libcommon_audio_sse2.a libG711.a libG722.a libiLBC.a libiSAC.a libiSACFix.a libPCM16B.a libNetEq.a libwebrtc_opus.a libopus.a libacm2.a libNetEq4.a libmedia_file.a libwebrtc_video_coding.a libwebrtc_i420.a libcommon_video.a libvideo_coding_utility.a libwebrtc_vp8.a libvpx.a libvpx_asm_offsets_vp8.a libvpx_intrinsics_mmx.a libvpx_intrinsics_sse2.a libvpx_intrinsics_ssse3.a libvideo_render_module.a libvideo_engine_core.a librtp_rtcp.a libpaced_sender.a libremote_bitrate_estimator.a librbe_components.a libbitrate_controller.a libvideo_processing.a libvideo_processing_sse2.a libvoice_engine.a libaudio_conference_mixer.a libaudio_processing.a libaudio_processing_sse2.a libaudio_device.a libjingle_sound.a libjingle_p2p.a libsrtp.a -framework Foundation -lstdc++ /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator7.0.sdk/usr/lib/libsqlite3.dylib -framework AudioToolbox -framework CoreAudio
Undefined symbols for architecture i386:
"_OBJC_CLASS_$_RTCDataChannel", referenced from:
objc-class-ref in libjingle_peerconnection_objc.a(libjingle_peerconnection_objc.RTCPeerConnection.o)
objc-class-ref in libjingle_peerconnection_objc.a(libjingle_peerconnection_objc.RTCPeerConnectionObserver.o)
ld: symbol(s) not found for architecture i386
clang: error: linker command failed with exit code 1 (use -v to see invocation)
[2098/2099] MACTOOL copy-bundle-resource ../../t...ios/AppRTCDemo/en.lproj/APPRTCViewController.xib
2013-12-18 10:51:05.658 Interface Builder Cocoa Touch Tool[16264:303] CFPreferences: user home directory at file:///Users/ken/Library/Application%20Support/iPhone%20Simulator/User/ is unavailable. User domains will be volatile.
ninja: build stopped: subcommand failed.
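A side note on the -Wprotocol warning above: PCObserver declares RTCPeerConnectionDelegate but does not implement peerConnection:addedDataChannel:. A minimal, hypothetical stub like the following should silence that warning (the RTCDataChannel parameter type is my assumption, based on the selector and the class named in the link error); it does not address the i386 link failure itself.

// Hypothetical empty stub for the unimplemented delegate method flagged above.
// The demo does not use data channels, so doing nothing here is harmless.
- (void)peerConnection:(RTCPeerConnection *)peerConnection
      addedDataChannel:(RTCDataChannel *)dataChannel {
  // Intentionally left empty; only here to satisfy RTCPeerConnectionDelegate.
}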
cd projectRootDir/trunk
export GYP_DEFINES="build_with_libjingle=1 build_with_chromium=0 libjingle_objc=1"
export GYP_GENERATORS="ninja"
export GYP_DEFINES="$GYP_DEFINES OS=ios target_arch=ia32"
export GYP_GENERATOR_FLAGS="$GYP_GENERATOR_FLAGS output_dir=out_sim"
export GYP_CROSSCOMPILE=1
gclient runhooks
// Here, replace the libjingle and example app code:
replace
projectRootDir/trunk/talk/app/webrtc/objc
with
https://github.com/gandg/webrtc-ios/tree/master/webRTC_obj
replace
projectRootDir/trunk/talk/examples/android
with
https://github.com/gandg/webrtc-ios/tree/master/trunk/talk/examples/android
//------------------------------------------
Finally, compile
$ ninja -C out_sim/Debug iossim AppRTCDemo
self.videodevice = [self frontFacingCameraIfAvailable];
NSLog(@"***DebugLine5");
NSString *cameraID = nil;
if (self.videodevice) {
  cameraID = [self.videodevice localizedName];
}
NSLog(@"***DebugLine6");
self.videocapturer = [RTCVideoCapturer capturerWithDeviceName:cameraID];
NSLog(@"***DebugLine7");
self.videoSource = [self.peerConnectionFactory videoSourceWithCapturer:self.videocapturer constraints:nil];
NSLog(@"***DebugLine8");
self.localVideoTrack = [self.peerConnectionFactory videoTrackWithID:vieoid source:self.videoSource];
NSLog(@"***DebugLine9");
if (self.localVideoTrack) {
  [self.localmediaStream addVideoTrack:self.localVideoTrack];
}
if (shouldReceiveVideo) {
  [self.peerConnection addStream:self.localmediaStream constraints:constraints];
}
I've been playing around with the (not officially landed) video capabilities in the iOS API, but I'm having trouble getting video to send. If anyone working on this implementation could shed some light, I would be really grateful.
When I create an RTCVideoCapturer object, I create it using device.uniqueId, where device is my front camera, found by iterating through [AVCaptureDevice devicesWithMediaType:] to get a front-facing camera. I then create the RTCVideoSource using that capturer and an empty constraints object. This seems to create a video capturer that can be fed to the video stream, but no video flows through the video track. Am I using the correct device name? Should I be using device.localizedName instead?
Does the RTCVideoSource need additional constraints, or is an empty media constraint set fine?
Thanks,
-Mike
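For reference, the two naming options being compared would look like this, using only the capturerWithDeviceName: call already quoted in this thread. Which of the two strings the method actually expects is exactly the open question here; note that the AVFoundation property is spelled uniqueID.

// device is an AVCaptureDevice, e.g. from a helper like the one sketched above.
// Option A: localized name, as in the earlier snippet in this thread.
RTCVideoCapturer *capturerByName =
    [RTCVideoCapturer capturerWithDeviceName:device.localizedName];
// Option B: unique ID, as described in the question.
RTCVideoCapturer *capturerById =
    [RTCVideoCapturer capturerWithDeviceName:device.uniqueID];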