[QLab] live camera from iPod in QLab


Frank Leclerc

Mar 31, 2011, 12:47:45 AM
to ql...@lists.figure53.com
Hello,

I recently installed an app on my iPod Touch 4th gen called iWebcamera
(4.99 CA$).

http://www.drahtwerk.biz/EN/Products/iPhone/iWebcamera.aspx

Basically, the app turns the iPhone or iPod into a real wireless
webcamera. So with iWebcamera on the iPod (and the appropriate driver
installed on a Mac), you can have a wireless webcamera feed that can
be routed to video applications such as QuickTime, Modul8,
MaxMSP/Jitter, and so on. I've tried it with my MacBook and it works
flawlessly.

Here is my question: is there any way to use this set-up in QLab as a
live wireless camera? I did route iWebcamera into all those apps I
mentioned (QuickTime, Modul8, MaxMSP/Jitter), but I couldn't find how
to "see" the iWebcamera input (or driver) in the QLab preferences
pane. Instead, the only option I have is the built-in iSight camera.
Does anybody know if there's a way to get QLab and iWebcamera working
together? Or maybe there is a better solution for using an iPod/iPhone
as a wireless camera routed into QLab?

Many thanks,

François
________________________________________________________
WHEN REPLYING, PLEASE QUOTE ONLY WHAT YOU NEED. Thanks!
Change your preferences or unsubscribe here:
http://lists.figure53.com/listinfo.cgi/qlab-figure53.com
Follow Figure 53 on Twitter here: http://twitter.com/Figure53

Chris Mower

Mar 31, 2011, 3:12:54 AM
to Discussion and support for QLab users.

Can I second this? I have several video input sources which I can use with QuickTime, but there doesn't seem to be a way to get QLab to recognise them.

regards,
Chris

luckydave

Mar 31, 2011, 7:16:01 AM
to Discussion and support for QLab users.
On Mar 31, 2011, at 3:12 AM, Chris Mower wrote:

> Can I second this. I have several video input sources which i can use with Quicktime, but there doesn't seem to be a way to get QLab to recognise them.

Some people have had strange successes with other cameras, but as a general rule, the set of cameras that work with QLab's Camera cue is what I call "FireWire DV" cameras. However, with a little Quartz Composer trickery, many other cameras can be used. In the QLab application package, in the Resources folder (control-click on the QLab application icon, and select "Show Package Contents"), you'll find a file called "video.qtz".

Copy that file somewhere else, and open it up in Quartz Composer. In Quartz, find the Core Image Filter patch, and disconnect its output. Leave it there, conveniently forgetting about it. Add a Video Input patch and connect it to the Image input where the CI Filter was connected. In the Settings of the Video Input patch, select your camera.

Create a movie file with pixel dimensions that match your camera's output, with a duration as long as you'll need to see the camera. This is a dummy movie file, so I tend to make a black image that lasts an hour, and encode with a temporal codec (I use H.264). It shouldn't be a very large file, and it doesn't matter what it looks like, since its output will be thrown away in Quartz, being replaced by the camera's output.
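
One convenient way to generate such a dummy file is with ffmpeg; this is a sketch, not part of the original instructions, and the size, duration, and file name below are just example values (assumes ffmpeg is installed):

```python
import shutil
import subprocess

def dummy_movie_cmd(width=640, height=480, seconds=3600, out="dummy_black.mp4"):
    """ffmpeg argv for a solid-black H.264 movie of the given size and length."""
    return [
        "ffmpeg", "-y",
        # lavfi's "color" source synthesizes black frames; match your camera's
        # pixel dimensions with s= and set the duration in seconds with d=.
        "-f", "lavfi", "-i", f"color=c=black:s={width}x{height}:d={seconds}",
        "-c:v", "libx264", "-pix_fmt", "yuv420p",
        out,
    ]

# Use seconds=3600 for the hour-long file described above; 10 s is plenty
# for a quick check that the pipeline works.
cmd = dummy_movie_cmd(seconds=10)
if shutil.which("ffmpeg"):
    subprocess.run(cmd, check=True)   # writes dummy_black.mp4
else:
    print(" ".join(cmd))              # no ffmpeg available; just show the command
```

Since the movie's content is discarded by the Quartz file anyway, the cheapest possible black clip is all you need.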

In QLab, add that dummy movie file as a Video cue, and apply your new camera quartz file as its custom renderer. Ta-da! Now you have a workaround camera cue that will work with cameras other than those the Camera cue can recognize.

luckydave
luck...@figure53.com

*

Mar 31, 2011, 7:46:14 AM
to Discussion and support for QLab users.
Wow!

And how do we do the same thing for live audio? :)

*

On Thu, March 31, 2011 6:16 am, luckydave wrote:
> In QLab, add that dummy movie file as a Video cue, and apply your new
> camera quartz file as its custom renderer. Ta-da! Now you have a
> workaround camera cue that will work with other cameras than what the
> Camera cue can recognize.

Chris Mower

Mar 31, 2011, 7:51:01 AM
to Discussion and support for QLab users.
WOW! Didn't understand a word of that. :-)
However, I'm sure that if I just do what luckydave has said then this will work. Many thanks for the info.

Regards,
Chris

*

Mar 31, 2011, 8:23:21 AM
to Discussion and support for QLab users.
I know it would be a very specific cue, but since a Metric Halo box has FW
& DAW channels that can be routed all over OS X, I wonder if there is a way
to configure an audio cue to take advantage of this.

For example, I can route live audio into Spectra Foo with FW channels.

Any audio passing thru a MH box can be captured, routed & recorded (even
internally to the Mio Console). Maybe this same function, but without the
recording part, could be patched into QLab via FW channels.

Maybe there is no "input" framework & so it's not possible.

A step in this direction might be a FW / DAW cue that lets another app
route audio thru Qlab.

Something along the lines of a Soundflower cue, but adjusted to allow
for live input.

Maybe this is all way off & maybe there is already something in the works...

*

Christopher Ashworth

Mar 31, 2011, 8:26:25 AM
to Discussion and support for QLab users.
On Mar 31, 2011, at 3:12 AM, Chris Mower wrote:
>
> Can I second this. I have several video input sources which i can use with Quicktime, but there doesn't seem to be a way to get QLab to recognise them.

To expand on luckydave's discussion, a bit more background on why you see this:

Apple has two video input frameworks.

#1 is old, very hard to use, and has been deprecated by Apple. (Which means they can remove it from the operating system at any time they want.)

#2 is new, easier to use, and is what Apple tells developers to use from here on out.

#1 works with all cameras.

#2 does not.

Welcome to the joys of QuickTime.

-C

Christopher Ashworth

Mar 31, 2011, 8:27:05 AM
to Discussion and support for QLab users.

On Mar 31, 2011, at 8:23 AM, * wrote:

> maybe there is already something in the works...

This.

-C

Geoff Hollingshead

Mar 31, 2011, 3:00:38 PM
to Discussion and support for QLab users.
I know we've all said it before, but just to say it again: QLab and their support are AMAZING! How many software companies do you know that will have you go into the package contents and advise you to change things they created, just so you can make the software work for your purposes?

Thanks Guys, keep up the fantastic work!

Cheers

Geoff Hollingshead
Head of Sound
Arts Club Theatre

Sent from my iPhone


luckydave

Mar 31, 2011, 3:07:31 PM
to Discussion and support for QLab users.
> I know we've all said it before but just to say it again, QLAB and their support are AMAZING! How many software companies do you know that will get you to go into the package contents and advise you to change things that they have created in their programming to allow you to do what you need to make the software work for your purposes!

For the sake of reiterating, I don't advise actually changing the video.qtz file. If it were to be done with the camera input that I described, that would mean *every* video cue would now be a camera cue. Make a copy of the video.qtz file somewhere else, and then make changes. You shouldn't change anything in the actual QLab application package. The good thing is, if you do make that mistake, it's easily fixed by downloading the app again from our website and replacing the "broken" one.

Oh, and thank you for the kind words! :-)

Frank Leclerc

Mar 31, 2011, 5:23:55 PM
to Discussion and support for QLab users.
I agree, the support is completely amazing! I'll end up the day with
two or three solutions to my problem.

Really, a big thanks to all of you who helped me today!

Cheers,

François

Keith Smith

Apr 2, 2011, 7:01:30 AM
to Discussion and support for QLab users.
On 31 Mar 2011, at 12:16, luckydave wrote:

> [an excellent tutorial on creating a custom camera source]


Dave, this is golden. I have something coming up in a few weeks where I think I will need this. You're a star.

Regards,
Keith.


mic

Feb 13, 2012, 4:55:53 AM
to luckydave, ql...@googlegroups.com
Hello,

I followed these instructions step by step, but when I run the cue,
QLab hangs, the spinning ball appears, and I have to force quit. And my
camera input is not showing in the video. I'm using iWebcamera (Frank,
did you succeed in making it work?), but it doesn't work even when
selecting iSight in the Video Input patch.

Any ideas?
mic


luckydave Memory

Feb 13, 2012, 8:50:10 AM
to ql...@googlegroups.com
Have you looked at the example file on the wiki? That follows these steps for you, so hopefully will work better. When you have it open in Quartz, give values to Width and Height, and see if it shows up in Quartz's Viewer window. If it's not there, it's not working.

http://wiki.figure53.com/QLab+Hints+and+Tips#x-Use Quartz Composer to access non-DV cameras (USB etc.)

Thanks,
luckydave

Aaron Quick

Apr 30, 2014, 12:04:42 AM
to ql...@googlegroups.com
Dredging up an old topic because I was searching. A show I'm designing is looking for a way to emulate a live Instagram recording onstage. Is there anything new in this process for QLab 3?

Thanks,

Aaron Quick
Resident Sound and Projection Designer- Black Ensemble Theater

Douglas Heriot

May 1, 2014, 6:14:30 AM
to ql...@googlegroups.com
I’ve successfully used an app called 'AirBeam Pro' a few times.

It streams video from an iOS device to a Mac. The trick is the Mac software outputs the video via Syphon, so you can get it into a QLab camera cue! You could then use QLab’s video effects to simulate Instagram filters.

I’ve been very impressed with the quality and low-latency of AirBeam, and highly recommend it.
Things to watch out for:
• The Syphon feed to QLab stop if you hide or minimise the AirBeam Mac App – make sure you just leave it open in the background.
• It streams over WiFi, so usual cautions regarding WiFi apply

Sam Kusnetz

May 1, 2014, 2:04:56 PM
to ql...@googlegroups.com
On May 1, 2014 at 6:14:32 AM, Douglas Heriot (dougla...@gmail.com) wrote:
> I’ve successfully used an app called 'AirBeam Pro' a few times.
> http://appologics.com/airbeam

I’d like to add a second vote for this app. It’s pretty amazing, actually. I used it for a three week run on a very low budget show. There was a WiFi network set up just for this, with nothing but the QLab machine and the iPhone on it, and it worked perfectly every single night.

Pretty amazing.

Cheerio
Sam

Sam Kusnetz
QLab Field Operative
s...@figure53.com

Aaron Quick

May 1, 2014, 2:14:50 PM
to ql...@googlegroups.com
Thanks all, just tested that app and it's awesome and easy to set up.

Sent from my iPhone

Greg Ott

Sep 1, 2014, 3:34:42 PM
to ql...@googlegroups.com
Hello everybody!

I'm actually trying to get this same setup to work for an upcoming show at the Annoyance Theater. None of us have worked with camera settings in QLab before, and we're having some trouble figuring out how to get this to work. All we are trying to do is display live video output from an iPad or iPhone through a projector.

Right now, here's what we're running:

- AirBeam on iPad
- AirBeam Pro on Mac
- QLab 3

In QLab, I've added a camera cue with the following settings:

- Camera: 2 - QLab - Surface 1
- Video Surface: Surface 1
-- Screens assigned: Syphon

Am I missing a step? From everything I've read -- and believe me, I already feel way in over my head -- the AirBeam Mac app (which isn't minimized) should output via Syphon and it should be displaying on the Video Surface with Syphon assigned.

If anyone has any insight into why I can't get this to work, I would really appreciate it. Frankly, we didn't know all of this was possible with this software and we're really hoping to get it off the ground for a show we have opening soon.

Thanks for any and all help you can provide!

Best,

Greg

Dave "luckydave" Memory

Sep 1, 2014, 4:09:36 PM
to ql...@googlegroups.com
On Monday, September 1, 2014 at 12:34 PM, Greg Ott wrote:
In Qlab, I've added a camera cue with the following settings:

- Camera: 2 - QLab - Surface 1
- Video Surface: Surface 1
-- Screens assigned: Syphon

When you assign Syphon to a surface, you're sending QLab's output to Syphon, not bringing it in. Add the Syphon input from AirBeam to the camera cue, and add your projector to the surface.

-- 

Alec Sparks

Sep 1, 2014, 9:26:34 PM
to ql...@googlegroups.com, ql...@lists.figure53.com, lecler...@gmail.com
Super simple solution: the AirBeam app. $4. Works with QLab's camera cue straight out of the box; no hacking about required.

Greg Ott

Sep 2, 2014, 11:48:20 AM
to ql...@googlegroups.com
Thanks for the quick reply! Am I missing a step in getting QLab to recognize AirBeam as an input? When the AirBeam Pro Mac app and the iOS app are running, the only camera option I see is the FaceTime HD Camera.

Greg

Daniel Richert

Sep 2, 2014, 3:14:00 PM
to ql...@googlegroups.com, ql...@lists.figure53.com, lecler...@gmail.com
Hi.

There are free alternatives to these webcam apps.

On your iOS device you need TLRemote Camera.

And on your computer you need TCPSyphon (be sure to use the server if you want to receive, or the client if you want to send).

It works nicely with QLab, and there is even a Syphon viewer for iOS, so you can use your iOS device as a Syphon destination as well.

All the best.

Alec Sparks

Sep 2, 2014, 4:06:03 PM
to ql...@googlegroups.com
Sometimes restarting QLab will make the added camera show up. If the AirBeam OS X app is started after QLab, QLab may not detect it.

Sam Kusnetz

Sep 3, 2014, 11:57:34 PM
to ql...@googlegroups.com
Alec Sparks wrote:
> Sometimes restarting QLab will show the added camera. If the AirBeam
> OSX app is started after QLab, it may not detect it.
Also, it's not immediately obvious, but you need to open the
camera in the AirBeam Pro app on the Mac, and have its window showing
the live display, in order for it to appear as a Syphon server.

Cheerio
Sam

--

Greg Ott

Sep 4, 2014, 10:26:21 PM
to ql...@googlegroups.com
Again, thanks for trying to help me figure this out! For the life of me, I still can't understand why I can't get any of this to work. Between two different MacBooks running 10.8, 10.9, and the 10.10 beta, and two different wireless networks, I keep running into the same issues:

- AirBeam Mac displays the video from my iPad in a preview, but when I click the play button, all that shows up is a black window. I am able to view the stream in a web browser just fine; viewing the full video on the Mac just isn't working. In QLab, no Syphon input is detected, just the built-in FaceTime camera as Input 1. (I have also tried many combinations of launching and quitting QLab while AirBeam is and is not running. Nothing seems to change.)

- TLRemote Camera on the iPad does somehow get recognized as Input 2 when run through TCPSyphon Server, but again, no video is displayed.

- My only thought now is to stream the video from AirBeam to a web browser and somehow display that window in QLab, but that seems like a complicated workaround. If this is what I have to resort to, does anyone know if there is a way to Syphon the video from the web browser?

Does anyone have any other thoughts, apps, or steps that I might be missing in getting this to display?

Greg

Lucas Krech

Sep 4, 2014, 10:41:03 PM
to ql...@googlegroups.com
Works for me right out of the box on 10.8. Tried adding it live as a cue and it showed up; restarted QLab and it was still there.

Are both versions of AirBeam current? What iOS are you on? Current release? Beta? Is your 10.8 and 10.9 up to date?

-L

*insert witty mobile device advertising here*
--

micpool

Sep 5, 2014, 4:27:17 AM
to ql...@googlegroups.com, des...@lucaskrech.com
Never had a problem with it working first time. Try it on an ad hoc network:
• Quit AirBeam on both devices.
• Create a network on your Mac.
• Log on to your ad hoc network with your iOS device.
• Start AirBeam on both.
• Start QLab.
• Put in a camera cue.
• Select the AirBeam camera and an output surface.
• Play.


Mic

bamt...@gmail.com

Feb 2, 2016, 1:25:15 PM
to QLab, des...@lucaskrech.com
This seems like an ideal solution to my situation: using AirBeam as camera inputs. My question: will this work with six iPads (six AirBeam inputs)?


CHNL

Feb 4, 2016, 11:00:56 AM
to QLab, des...@lucaskrech.com
So is that really all there is to it?

  • Install AirBeam on the iPad
  • Install AirBeam Pro on the MBP
  • Create a WiFi network
  • Start AirBeam on the iPad
  • Start AirBeam Pro on the MBP
  • Start QLab
  • Add a camera cue with input from AirBeam
  • And be impressed

Wow

CH

Lucas Krech

Feb 4, 2016, 12:15:04 PM
to bamt...@gmail.com, QLab
I think you should try it out and report back! :)

bamt...@gmail.com

Feb 4, 2016, 12:27:01 PM
to QLab, bamt...@gmail.com
I will see if I can try. Multiple sources (including QLab tech support) seem to doubt a WiFi network's ability to handle six streams reliably. Though I would like to see if I could set up a fully functioning closed 802.11ac network, and what kind of difference that would make. If I ever get round to testing this system I will report back.
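
As a back-of-the-envelope sanity check on whether one network can carry six feeds, a sketch like the following may help; every number here (per-stream bitrate, usable throughput) is an illustrative assumption, not a measured AirBeam or 802.11 figure, so substitute your own measurements:

```python
# Rough capacity check for multiple wireless camera feeds.
# The per-stream bitrate and usable-throughput values are assumptions
# for illustration only; measure your own streams and network.

def aggregate_mbps(streams: int, per_stream_mbps: float) -> float:
    """Total bandwidth needed for `streams` simultaneous camera feeds."""
    return streams * per_stream_mbps

def headroom(usable_wifi_mbps: float, needed_mbps: float) -> float:
    """Fraction of usable throughput left over (negative means over capacity)."""
    return (usable_wifi_mbps - needed_mbps) / usable_wifi_mbps

needed = aggregate_mbps(6, 4.0)  # six SD feeds at an assumed ~4 Mbps each
print(needed, "Mbps needed")
print(round(headroom(100.0, needed), 2),
      "fractional headroom at an assumed 100 Mbps usable")
```

If the headroom comes out small or negative under pessimistic assumptions, a closed 5 GHz network (or fewer/lower-bitrate streams) is the obvious lever to pull.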


sam kusnetz

Feb 4, 2016, 11:07:04 PM
to ql...@googlegroups.com, bamt...@gmail.com
I can confirm success with:

- a closed 802.11n WiFi network, with the access point about 30' maximum from the iPhones being used
- 3 iPhones of assorted vintage broadcasting in SD, black and white, no audio, 30 fps
- MacBook Pro connected to the network via Ethernet.

I'd love to hear about your setup!

Sam
--
Sam Kusnetz | Figure 53
(mobile)

bamt...@gmail.com

Feb 5, 2016, 8:58:35 AM
to QLab, bamt...@gmail.com
Hi Sam,

thanks for the info. I would also be very interested to know the specs of your MacBook. Mine are:

MacBook Pro Retina (mid-2012) with OS X El Capitan (10.11.3).

2.6 GHz Intel Core i7 Processor

8 GB 1600 MHz DDR3 Memory

NVIDIA GeForce GT 650M 1024 MB Videocard


grz.



bamt...@gmail.com

Mar 31, 2016, 4:58:22 PM
to QLab, bamt...@gmail.com
So I promised I would report back, and I can report that I have been successful!

Six iPads all streamed a live camera feed (front camera) to one router (802.11n, 5 GHz) via AirBeam Pro. My MacBook (see specs below) was also connected to the network. Using AirBeam as six individual live inputs for QLab, I was able to use one projector and map six different surfaces. During the play we switched from colored backgrounds to live feeds to video, on one, a few, or all of the different surfaces and all of the different live feeds simultaneously, without incident.

The only tricky part was getting the six live feeds from AirBeam running at the same time (before even opening QLab): sometimes AirBeam would freeze, or would refuse to recognize one of the iPads. It took some patience and some retries, but after a few tries they were all up and running. And once all six feeds were working, there didn't appear to be any danger of AirBeam freezing or crashing again.

It might be worth mentioning that we also tested this on another Mac: even though it was a newer generation, it apparently didn't have the same video or processor chip, because it wasn't able to run all six feeds from AirBeam at the same time.

So be sure the specs of your MacBook match or beat the ones below if you want to try this. If anyone tries this with more than six live feeds, I would be really interested to hear the results!

thanks for all the help!

MacBook specs:

15-inch MacBook Pro Retina (mid-2012) with OS X El Capitan (10.11.3).

2.6 GHz Intel Core i7 Processor

8 GB 1600 MHz DDR3 Memory

NVIDIA GeForce GT 650M 1024 MB Videocard

On Thursday, February 4, 2016 at 18:15:04 UTC+1, Lucas Krech wrote:
I think you should try it out and report back! :)

CHNL

Apr 14, 2016, 4:30:26 AM
to QLab, bamt...@gmail.com
Cool. Just curious: which router did you use and what was the approximate range before the signal dropped?
CH

bamt...@gmail.com

Apr 15, 2016, 11:44:47 AM
to QLab, bamt...@gmail.com
It was an ASUS RT-N66U dual-band router. I didn't test the reach of the signal, since the iPads in our production had fixed positions. They were within 10 meters of the router, but I suspect a larger range would be possible.



Jason Alan

Sep 10, 2017, 7:10:14 PM
to QLab
Hello... it seems someone else has run into the same problem as me. I am able to connect EpocCam on my desktop, but when linking to a QLab camera cue, my Mac isn't recognizing any option besides the FaceTime camera. Any suggestions would be greatly appreciated!

Jason 

Felix Dietlinger

Sep 11, 2018, 6:29:10 AM
to QLab
Hey everybody!

I used the AirBeam solution six years ago, and it worked great. Is it still the best way to stream from an iPhone into QLab, or has something better come along? Thanks for all the help.

cheers,
Felix

James Dethlefson

Sep 11, 2018, 1:30:03 PM
to QLab
Hi, Felix,

You might want to look at this discussion:


The App Store has two NDI camera apps available: one for iPhone ($10) and one for iPhone/iPad ($20).

Might be worth a shot.

Regards,

James

Felix Dietlinger

Sep 13, 2018, 3:25:03 AM
to QLab
Thanks, the app looks good. It was last updated in 2016, though, so I'm not sure they still maintain it properly. I might still try it; being able to prevent standby on the iPhone seems to be the killer feature that AirBeam lacks.

James Dethlefson

Sep 13, 2018, 12:22:11 PM
to QLab
Good catch. The NewTek ($20) app is not much better, having been updated a year ago. For what it's worth, I did play around with the cheaper NDICamera app and the NDISyphon app luckydave linked to in the other thread, using my iPhone 7 as a hotspot. While the video latency seemed OK, once I activated the audio stream, latency was clearly perceptible.