Suggestions for Multimedia Revamp?

Our small school theater has an aging low-resolution projector that I'm looking to replace in the next year or so. As part of the process, I'd like to future-proof our input and distribution system as much as possible. Currently we have VGA inputs and left-and-right XLR sound inputs both at FOH and in the booth. I use a collection of adapters, cables, DACs, and direct boxes to connect various devices to the system, which is a real pain. So here are my questions:

1) What inputs should I have at FOH?
2) What inputs should I have in the booth?
3) What's the best way to connect the whole system together?
4) Is there anything I'm probably not thinking of that I should be?
 
I have stated this in other posts, but whenever possible, run fiber. I hope that your current cables are in conduit, or that you plan on running some conduit for this upgrade. The best way to future-proof something is to make it easier to pull new cable as the needs change.

The nice thing about fiber is that the bandwidth is so large that just about anything coming along in the near future can be handled. Also, the adapter boxes are easy to come by, and if you get a one-off that needs a weird adapter box you don't have, most video rental houses will have the adapters for rent. You can even get wall plates that do the adapting, so you can have HDMI on the wall but still a fiber run, and upgrading later is just changing out the wall plate. Though if you do that, I still recommend leaving access to some tails; you never know when having a few extra runs from stage to booth can come in handy. (Fiber can be used for audio as well as lighting and standard networking, without interference with what is on the other pairs.)

As for your inputs, look at what your space is used for. Do you run a lot of presentations (from the stage) where you need computer inputs? Do you run movies and such and want to be able to have it on stage? Mainly look at what you are currently using the most and make sure you have that.

For stage and booth I would definitely recommend HDMI right now; just make sure it is HDCP compliant (or, even better, that it is compliant but also has a way to turn HDCP off), or that Blu-ray player a client brings in won't work.

For connecting it all together, I personally love video over IP; that way you can use the cables as a network, and get control as well as routing configured from a standard computer. You can also use video matrix systems to route the signal wherever needed.
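To give a sense of what "routing configured from a standard computer" can look like, here's a minimal sketch in Python; the controller address, endpoint, and payload fields are all hypothetical stand-ins, since every AV-over-IP vendor exposes its own control API:

```python
import requests

# Hypothetical AV-over-IP controller on a dedicated AV network.
# The address, endpoint, and field names are invented for illustration;
# check your vendor's API documentation for the real ones.
CONTROLLER = "http://192.168.10.2"

def route(source_id: int, display_id: int) -> None:
    """Ask the controller to patch one encoder to one decoder."""
    resp = requests.post(
        f"{CONTROLLER}/api/route",
        json={"source": source_id, "destination": display_id},
        timeout=5,
    )
    resp.raise_for_status()

# e.g., send the booth Blu-ray player (source 2) to the projector (display 1)
route(source_id=2, display_id=1)
```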
 
The primary uses right now are presentations (generally run from guest laptops at FOH) and movies and such (generally run from the booth). If I understand your suggestion, it's essentially to use the LAN to transmit video from and to HDMI ports anywhere I need inputs or outputs. For future proofing, I should use fiber instead of CAT6 for its increased bandwidth. I can then route inputs and outputs to where they are needed over the LAN.

I have a follow-up question: if I have a gigabit CAT6 network already in place, I should easily be able to handle HD video and even 4K over my existing network without the fiber upgrade, right?
 
You are pretty much there, but yeah, using the LAN to transmit video (and I always recommend a dedicated network; don't piggyback off the rest of the building) will allow you a lot of freedom. Though it does introduce some lag. For presentations and movies that's probably not a problem, but if you're doing IMag it could potentially be an issue. If possible I would run a few pairs of cable down to the stage, and you can use those as a last-minute "Oh my God, we have to do IMag!" solution. Adapters are available for both IP and non-IP over CAT cable.

Depending on what else is on your network, CAT6 should work for current technology, though for future-proofing fiber is probably the way to go. Making sure that the CAT6 network you have is isolated, or at least that a couple of runs are, will help a lot if you run into bandwidth issues. At this point, any time I see someone running new cable I usually just say run the fiber; it's not that much more expensive, and you can get multiple pairs in a single cable if needed, all in the same volume as a current CAT cable.
 
So the question I guess becomes whether to go with a routable or a non-routable solution. The routable solution will be more future-proof, but a non-routable one will probably be less expensive to implement. At a minimum, I could go with HDBaseT and a matrix switcher with audio de-embedding, with HDMI inputs at FOH and in the booth, HDBaseT running to my projector, and de-embedded audio running to my sound board through a DI box. I could do this pretty easily by pulling a couple of CAT6 cables through existing conduit using my current VGA runs. At the other end of the spectrum, I could go with a full SMPTE 2022-compatible AV-over-IP solution on a fiber-optic network.
 
yep
 
Fibre might be cheap, but in general the endpoint devices and their transceivers have not yet reached the price point of copper...
Fibre standards have also evolved over time, multimode has been through OM1, OM2, OM3 and now OM4, and single mode's up to OS2. If you're not installing the equipment now, you might be in the position of having to pull different fibre when you do go to light it up. Single mode is more future resistant, but also more spendy on the transceivers...

In general, in 2016, I'd be going with HDBaseT as a default position. If you're in the situation to, I'd always advocate pulling spare conduit with a minimum bend radius to accept Cat6A S/FTP or Fibre, and leave a draw string in there. That's the only true "future proof" solution...

Despite what some manufacturers want me to believe, gigabit networks cannot transmit HD video without some compression. A 1080p60 signal has a data rate of 3-and-something Gbps, and that can't fit down a 1 Gbps link. The compression may be visually lossless and very light, but it is compression, and it does add latency. I'd also check the HDCP capabilities of any IP solution carefully; many of them can't pass encrypted content, and a small handful can (but they are more expensive, of course)...
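If you want to check that arithmetic yourself, here's the back-of-the-envelope version (uncompressed 8-bit 4:4:4; the actual on-the-wire HDMI rate is higher still once 8b/10b TMDS encoding is included):

```python
# Uncompressed data rate for 1080p60, 8 bits per channel, 4:4:4.
width, height, fps, bits_per_pixel = 1920, 1080, 60, 24

active_gbps = width * height * fps * bits_per_pixel / 1e9
print(f"active video:  {active_gbps:.2f} Gbps")   # ~2.99 Gbps

# Including the full 2200x1125 timing (blanking intervals) that
# actually goes down an HDMI cable:
total_gbps = 2200 * 1125 * fps * bits_per_pixel / 1e9
print(f"with blanking: {total_gbps:.2f} Gbps")    # ~3.56 Gbps

# Either way, far more than a 1 Gbps link can carry uncompressed.
```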
 
Yeah, I'm leaning toward HDBaseT right now.

As far as projectors go, what do people think of the Epson Pro G7500U? It's about twice as bright as our ancient SVGA projector (which is admittedly not bright enough for the space), has good resolution for the price range, and has great connectivity.
 
A question that came up in my research: how important is HDCP 2.2?

In looking into implementing an HDBaseT infrastructure, I was pleased to find hardware capable of video up to 4096x2160 (with some frequency/chroma-subsampling limits due to the reduced bandwidth vs. HDMI) and VESA formats up to 2560x2048. Even though we only plan to upgrade to a WUXGA (1920x1200) projector at this time, I had hoped that by installing a 4K-capable network we could future-proof for the day when we might want to upgrade to a 4K projector. Unfortunately, I've just discovered that while this equipment is capable of routing 4K video, it only supports HDCP 1.4 (which can be turned off).
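For anyone curious where those frequency/chroma limits come from, here's a simplified comparison of raw video data rates against the roughly 10.2 Gbps that HDMI 1.4-class HDBaseT links carry. The numbers count active pixels only at 8 bits per sample and ignore blanking and encoding overhead, so treat them as ballpark:

```python
# Rough check of which formats fit an HDMI 1.4-class HDBaseT link.
LINK_GBPS = 10.2  # approximate HDBaseT / HDMI 1.4 ceiling

formats = [
    # (label, width, height, fps, bits per pixel after chroma subsampling)
    ("1080p60 4:4:4", 1920, 1080, 60, 24),
    ("4K30 4:4:4",    3840, 2160, 30, 24),
    ("4K60 4:2:0",    3840, 2160, 60, 12),
    ("4K60 4:4:4",    3840, 2160, 60, 24),
]

for label, w, h, fps, bpp in formats:
    gbps = w * h * fps * bpp / 1e9
    verdict = "fits" if gbps < LINK_GBPS else "too fast"
    print(f"{label:15s} {gbps:5.2f} Gbps -> {verdict}")
```

That's why the spec sheets allow 4K only at reduced frame rates or with chroma subsampling: 4K60 at full 4:4:4 simply doesn't fit.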

So, just how limiting is this likely to be? Would this only prevent us from showing a 4K Blu-ray, or would we run into problems with PowerPoint presentations and the like?
 
Today there is not that much HDCP 2.2 content.
But you say that you're only sticking with HD in this generation, with a possible move to 4K at some stage in the future.
In a few years' time, when you might be in the position to be looking at this upgrade, it's likely that a more diverse range of content will demand HDCP 2.2, or HDCP 3, or whatever the technology of choice is then.

Digital video continues to evolve rapidly, so "future resistant" is a moving target. Your most adaptable solution is likely to be the best shielded cat cable you can afford, and a spare conduit for whatever the next technology evolution brings.
Sorry that's not the news you hoped for...
 
That's what I was afraid of, yes. So I guess I won't sell it as being upgradeable to 4K with just the purchase of a projector.

Your most adaptable solution is likely to be the best shielded cat cable you can afford, and a spare conduit for whatever the next technology evolution brings.

Is there any reason to go for Cat6a in this situation? My longest runs will be around 100' including patches, so I should be able to get reliable 10 Gbps speeds on properly installed Cat6. And it seems likely that any 4K upgrade we might do down the road would require more than 10 Gbps, which would require new cable pulls anyway.
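For what it's worth, here's the rough check behind that claim; the reach figures are the commonly cited 10GBASE-T limits (100 m on Cat6a, but only about 37-55 m on plain Cat6, with the low end applying to bundled runs with significant alien crosstalk):

```python
# Rough check: does a ~100 ft run fit within 10GBASE-T reach on plain Cat6?
FT_TO_M = 0.3048

run_m = 100 * FT_TO_M
print(f"run length: {run_m:.1f} m")   # ~30.5 m

# Commonly cited 10GBASE-T reach: 100 m on Cat6a, but only about
# 37-55 m on unshielded Cat6 (37 m in high alien-crosstalk bundles).
print(run_m <= 37)   # True -- inside even the conservative Cat6 limit
print(run_m <= 55)   # True
```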
 
