I don't know how HDMI and audio over these cables work though - so I'd love to hear from anyone who does. Are we still dealing with packets and error correction, or does the quality of the cable have a direct effect on the quality of the signal?
Digital is digital. Depending on the protocol (HDBaseT, HDMI, Dante, AVB, CobraNet) it's not always Ethernet packets, but it still comes down to 1's and 0's. The problem is you can only do so much to make errors graceful in real-time streaming multimedia, so instead of degrading to a lower bandwidth, it's more likely to work right up until it doesn't work at all.
In TCP/IP networking there are UDP (User Datagram Protocol) packets and TCP (Transmission Control Protocol) packets. UDP just constantly streams new packets to the receiver whether they get them or not, and whether the data is corrupted or not. TCP announces that a packet is about to be sent along with some info on what the packet contains, the receiver acknowledges, the transmitter sends the packet, and the receiver confirms it got the packet uncorrupted. If the packet fails, it tries again. Understandably, UDP is faster because it has a lot less overhead, but it's more likely to receive corrupted data that it cannot error-correct.
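To make the difference concrete, here's a minimal Python sketch over loopback. The UDP sender fires the datagram and returns immediately with no idea whether anyone got it; the TCP side does a connection handshake first, and the kernel handles acknowledgements and retransmission behind `sendall()`. (Names like `udp_tx`/`tcp_srv` are just illustrative.)

```python
import socket

# UDP: fire-and-forget. sendto() returns immediately; there is no
# acknowledgement, and a lost datagram is simply gone.
udp_rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp_rx.bind(("127.0.0.1", 0))            # let the OS pick a free port
udp_rx.settimeout(2)                     # don't hang forever if it's lost
udp_port = udp_rx.getsockname()[1]

udp_tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp_tx.sendto(b"frame-0001", ("127.0.0.1", udp_port))
data, _ = udp_rx.recvfrom(1024)          # arrived this time; no guarantee

# TCP: connection setup, ACKs, and retransmission are handled by the
# kernel. create_connection() performs the three-way handshake.
tcp_srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tcp_srv.bind(("127.0.0.1", 0))
tcp_srv.listen(1)
tcp_port = tcp_srv.getsockname()[1]

tcp_tx = socket.create_connection(("127.0.0.1", tcp_port))
conn, _ = tcp_srv.accept()
tcp_tx.sendall(b"file-chunk-0001")       # queued reliably before returning
chunk = conn.recv(1024)
```

On loopback both always succeed, of course; the point is that only the TCP path would notice and retry if they didn't.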
You see TCP most of the time in conventional networking. If you're copying a 250 MB file across your network, making sure the destination received the entire file correctly is important, or the whole file may become unreadable. Whereas if you're streaming real-time audio/video, or sACN out to your lighting system, time is of the essence. You don't care so much whether the last packet was successful or not. If it wasn't, you throw it away and move on to the next one. Time sensitivity is prioritized over accuracy. This is where you get into that realm where it usually either works or it doesn't. Generally not a lot of grey area in between.
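That "throw it away and move on" logic can be sketched as a toy receiver loop. Packets carry a sequence number, and anything older than what we've already played out is discarded rather than retried. (This is just an illustration of the idea; real protocols like sACN/E1.31 define their own packet headers and sequencing rules.)

```python
def handle_stream(packets):
    """packets: iterable of (seq, payload) pairs, possibly reordered.
    Returns the payloads actually played, newest-wins, no resends."""
    latest = -1
    played = []
    for seq, payload in packets:
        if seq <= latest:      # late or duplicate packet: throw it away
            continue
        latest = seq           # gaps are fine; we never ask for a resend
        played.append(payload)
    return played

# Packet 2 arrives after packet 3 (reordered in transit), so it's dropped.
result = handle_stream([(1, "a"), (3, "c"), (2, "b"), (4, "d")])
# result == ['a', 'c', 'd']
```

A file transfer would instead block and re-request packet 2, which is exactly the trade-off between TCP-style reliability and real-time streaming.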
Naturally, this leads to some really terrible nerd humor.