... But I have to wonder WHY the WiFi systems would get flaky, as I sit here and type this on my laptop while sitting on my front porch. I do occasionally get a dropped WiFi connection here when the phone rings, and that's a 5.8GHz phone. Go figure, and perhaps Kirk at ETC could enlighten us as to why not WiFi. ...
Why not WiFi? There are a couple of reasons.
First, as you have experienced when your phone rings, the connection can be interrupted (or, phrased better for your scenario, the signal overpowered) by other devices, because WiFi relies on a wider slice of the frequency spectrum. That wider spectrum makes it easier for other devices to step on part of the bandwidth needed. WiFi falls in the 2.4GHz frequency range for the most widely used standards (802.11 B/G/N) and in the 5.8GHz range for the other commercially available routers (802.11 A/N). (Note that the 802.11N standard allows a choice between 2.4GHz, 5.8GHz, or both, depending on hardware.)
What other devices also operate in the 2.4GHz band? Some quick Google searching reveals that these devices all operate there:
Cordless Phones
Bluetooth Devices
Wireless USB
Zigbee Devices
Car Alarms
Microwave ovens also tend to cause a lot of electromagnetic interference in the 2.4GHz band.
A pretty good article on avoiding interference in this range can be found at:
Avoiding Interference in the 2.4-GHz ISM Band. While it is more focused on the development side (it is an electrical engineers' magazine), it does have a few useful graphics that explain some of the reasons interference happens.
But back to topic...
The second, and more important, reason we do not use WiFi, especially for show-critical applications, is that the packet stream we rely on leaves no time to recover dropped packets.
Think about how you use your computer to access the web. Generally you open your favorite browser, enter a URL, and wait for that website to load. If you have a blazing fast connection, it loads pretty quickly. If you aren't so lucky, you may see parts of the page load while other parts wait to download images or other sections. Under the hood, the computer is loading everything needed to display that page and saving it to a temporary location. If it misses something, it can simply ask for that packet again, and it will get it. Once the page has loaded, you aren't actively using the network connection to stream that page to you; it is stored on the computer. That process can take a few seconds, but then you can scroll through the page because nothing on it is still trying to load.
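That "ask again until it arrives" behavior can be sketched in a few lines of Python. This is a toy simulation, not real networking code: `fetch_packet`, `load_page`, and the 30% loss rate are all invented for illustration.

```python
import random

def fetch_packet(packet_id):
    """Toy network call: roughly 30% of requests 'drop' (made-up loss rate)."""
    return None if random.random() < 0.3 else f"data-{packet_id}"

def load_page(num_packets=10):
    """Like a browser: keep re-requesting each missing packet until it arrives."""
    page = {}
    while len(page) < num_packets:
        for pid in range(num_packets):
            if pid not in page:
                chunk = fetch_packet(pid)
                if chunk is not None:
                    page[pid] = chunk          # got it on a retry; store it
    return [page[pid] for pid in range(num_packets)]

print(load_page())  # always arrives complete, just slower on a lossy link
```

Because the page's content doesn't change while it loads, retries cost only time, never correctness.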
In lighting, we are constantly changing that information. Levels change, parameters change, settings change, outputs change, and so on. Instead of the "page" being loaded once and stored on the computer, it is constantly refreshing as that data changes. There is not enough time to ask the console to resend missed information, because by then the entire "page" has already changed. So if you miss information, you can fall out of sync and not have the correct display. This is why, if you run an application such as the Eos/Ion Client wirelessly, you may see it synchronize the show file several times: it doesn't know what it missed, so it has to assume it needs everything.
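One way to picture that resynchronization is a toy receiver that, on seeing a gap in sequence numbers, reloads a full snapshot rather than asking for the stale packet. This is purely illustrative; the names and the snapshot mechanism here are assumptions, not the actual Eos/Ion protocol.

```python
def receive_stream(packets, snapshot):
    """Toy client for a constantly refreshing stream.

    packets:  list of (seq, channel, level) updates, in arrival order
    snapshot: the console's full current state, fetched on resync

    On a sequence-number gap the client does NOT re-request the missed
    packet (it would describe a state that no longer exists); it reloads
    the whole snapshot, like the Eos/Ion Client re-syncing the show file.
    """
    levels = {}
    expected = 0
    for seq, channel, level in packets:
        if seq != expected:            # gap: at least one update was lost
            levels = dict(snapshot)    # full resynchronization
        levels[channel] = level
        expected = seq + 1
    return levels
```

For example, if updates 0, 1, and 3 arrive but 2 was lost, the client falls back to the snapshot at the gap and then applies update 3 on top of it.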
So, in order to avoid those issues, the Net3 RFR uses a radio and receiver based on the Zigbee protocol. But Kirk, isn't that in the 2.4GHz band you were describing earlier? Yes, it is. Because the Zigbee radio has a narrower bandwidth per channel than WiFi, it is less prone to interference. (See the image from the article linked above.)
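The difference in channel width can be made concrete with the published channel plans. This is a rough sketch using commonly cited figures (2.4GHz WiFi channels are about 22MHz wide, 802.15.4/Zigbee channels about 2MHz wide, both on 5MHz centers); the function names are mine.

```python
def wifi_span(ch):
    """Approximate occupied span of an 802.11b/g channel (1-13), in MHz."""
    center = 2412 + 5 * (ch - 1)
    return (center - 11, center + 11)   # ~22 MHz wide

def zigbee_span(ch):
    """Approximate occupied span of an 802.15.4 channel (11-26), in MHz."""
    center = 2405 + 5 * (ch - 11)
    return (center - 1, center + 1)     # ~2 MHz wide

# Zigbee channels that sit clear of the common Wi-Fi channels 1, 6, and 11:
busy = [wifi_span(c) for c in (1, 6, 11)]
clear = [z for z in range(11, 27)
         if all(zigbee_span(z)[1] <= lo or zigbee_span(z)[0] >= hi
                for lo, hi in busy)]
print(clear)   # -> [15, 20, 25, 26]
```

A Zigbee channel is narrow enough to fit in the gaps between WiFi channels, which is exactly why a narrower-bandwidth radio sees less interference in the same band.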
We also employ frequency-hopping technology: while the main signal is centered on one frequency, it hops around the entire 2.4GHz band in millisecond bursts to further avoid interference.
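Frequency hopping can be sketched as two radios sharing a seeded pseudo-random channel schedule, so a jammed frequency is only occupied for one short dwell. This is purely illustrative and is not ETC's actual hopping algorithm.

```python
import random

def hop_sequence(seed, channels=range(11, 27), hops=8):
    """Toy hopping schedule (illustrative only): two radios seeded alike
    visit the 2.4GHz channels in the same pseudo-random order."""
    rng = random.Random(seed)
    chans = list(channels)
    return [rng.choice(chans) for _ in range(hops)]

tx = hop_sequence(seed=42)
rx = hop_sequence(seed=42)
assert tx == rx   # transmitter and receiver stay in lockstep
```

If one channel is swamped by a nearby WiFi network, only the bursts that land on that channel are at risk; the rest of the schedule goes around it.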
The iRFR app does rely on WiFi technology, and is thus more prone to interference and disconnects than the dedicated Net3 RFR.