josephstaffa
New Member
Hi Guys!
So I got a request from my creative team asking if it would be possible to have a video feed coming out of a device onstage projected on screens SL & SR. I've been working on it and have had a decent amount of success, but I figured I'd ask the Hive Mind and see if anybody else has done this, wants to try it themselves, or has a better way to do it, because my current method is still not exactly reliable.
Basically the premise is that I have an actor onstage who pulls a device (e.g. iPhone, iPad, something with a camera) out of her purse and starts recording a live video that's streaming to something like Facebook or YouTube.
Things to note for my scenario:
- It doesn't HAVE to be wireless. That being said, the idea of an actor running around onstage tethered to a laptop sounds like a terrible idea to me.
- As I said, it can be any device; in practice it's dictated by what we have on hand. The device would probably be an iPad Mini we have running iOS 10.1.1.
- Ideally this feed goes into Qlab to be projected.
Okay! Details done. Here's what I have come up with so far.
The way I've gotten this to work is by running software called AirServer on my Sound/Video machine and using the iPad's built-in AirPlay mirroring to cast its screen to the computer. This runs over a dedicated network (a Linksys EA3500 router) with nothing else on it. So basically I cast, open the Camera app, and bam: live video. The latency is about 50-80 milliseconds, which is more than acceptable. What's also nice is that Qlab sees AirServer directly as a camera input, so there's no need to route through something like Syphon.
Now for the issues...
Once the AirPlay connection is open it's great! However, if the iPad goes to sleep, the connection naturally closes after about 30 seconds (I'm still running tests to pin down the exact time), and then you have to re-cast to the server and refresh the camera cue in Qlab (I rigged a hotkey cue list to do this). Worth noting: when the connection drops, AirServer keeps displaying the last frame received. In theory the iPad could just stay awake, which is how it's set up right now, and then the connection is stable. I could rig something inside her purse to keep the power button from getting hit, but that's a bad idea for many obvious reasons.
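For anyone who wants to script that refresh instead of triggering the hotkey cue list by hand: Qlab listens for OSC on UDP port 53000, and a cue can be fired with a `/cue/{number}/start` message. Below is a minimal sketch that builds the OSC packet by hand with only the standard library (no third-party OSC module needed). The cue number "refresh" is a placeholder for whatever you named your camera-refresh cue.

```python
import socket

def osc_message(address):
    """Build a minimal argument-free OSC message. OSC strings are
    null-terminated and padded out to a 4-byte boundary."""
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)
    # address pattern, then an empty type-tag string (just ",")
    return pad(address.encode("ascii")) + pad(b",")

def fire_cue(cue_number, host="127.0.0.1", port=53000):
    """Fire a Qlab cue by number over UDP (Qlab's default OSC port)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(osc_message(f"/cue/{cue_number}/start"), (host, port))
    sock.close()

# e.g. fire_cue("refresh")  # "refresh" is a hypothetical cue number
```

This is just a sketch under the assumption that OSC is enabled in your Qlab workspace settings; check the Qlab OSC dictionary for the exact commands your version supports.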
Essentially what I need is a way to keep the AirPlay connection open while the device is asleep, which is most likely impossible or beyond my technical savvy... or a better way to do this entirely. The other benefit of using AirPlay is the flexibility of using almost any Apple device.
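Short of keeping the connection alive, one half-step I've been considering is automating the recovery: since the iPad is on a dedicated network, the video machine could ping it and fire the refresh cue chain the moment it drops off. A rough sketch, where the IP and the drop handler are both placeholders for your own setup:

```python
import subprocess
import time

def ipad_reachable(ip):
    """One ping to the iPad (IP is whatever the EA3500 hands out)."""
    result = subprocess.run(["ping", "-c", "1", ip], capture_output=True)
    return result.returncode == 0

def watch(check, on_drop, interval=2.0, iterations=None):
    """Poll `check`; call `on_drop` once each time the device flips
    from reachable to unreachable. `iterations=None` runs forever."""
    was_up = True
    count = 0
    while iterations is None or count < iterations:
        up = check()
        if was_up and not up:
            on_drop()  # e.g. trigger the hotkey cue list that refreshes the camera cue
        was_up = up
        count += 1
        if iterations is None or count < iterations:
            time.sleep(interval)

# e.g. watch(lambda: ipad_reachable("192.168.1.50"), my_refresh_handler)
# where "192.168.1.50" and my_refresh_handler are hypothetical
```

The check and handler are passed in as callables so you can swap the ping for anything else (an ARP lookup, an AirServer log tail) without touching the loop. It won't stop the drop, but it would at least take the "notice it and mash the hotkey" step off the operator.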
Anyways, that's my crazy plan for now! Any and all advice welcome...
Thanks,
Joe