Sound fail....

...Can somebody tell me what is up with these awards shows? Are the companies hired not adequate to meet the needs of the show? Is getting good sounding audio harder than good quality picture or lighting? ...
Considering all the issues and obstacles, I think ATK Audiotek (who does the majority of the large events) does a pretty good job.

And yes, getting good sounding audio on TV IS harder than good quality picture or lighting. Audio has a minimum of three consoles: FOH, Monitors, and Broadcast. The three engineers may not be in communication with one another and are definitely not in the same space. What you hear in your living room may not be anything close to what the broadcast A1 intended.
 
Obviously, determining who's at fault is the key thing in deciding how you would fix the problem in the future. We have no idea who was at fault because we weren't there. It seems like a lot of things went wrong in that video.

Disagree. Only one thing went wrong, thereby causing many problems. The feedback, the clicks/popping, and the backstage voices all happened because the wrong microphone went on stage instead of the correct one.


Can somebody tell me what is up with these awards shows? Are the companies hired not adequate to meet the needs of the show? Is getting good sounding audio harder than good quality picture or lighting? Seriously I'd love to know the reason, and I don't.

I would venture to say The Grammys or The Tonys would rather rent RCMH or wherever for another day of tech rehearsals than be embarrassed by terrible audio. It seems like every review of these shows I've read makes some mention of something like "aside from the obvious audio issues"... What gives?

A designer I work for designed Bernstein's Mass at Carnegie Hall, where a fuming audience member ran up to FOH and proceeded to scream at him that he couldn't "understand an effing word they are saying." To which the designer replied, "You know that they are speaking in Latin, right?"

We are in a very small demographic that notices little things like that. I mean, every single movie I've seen has feedback in it as the character approaches a microphone. Why is that an accepted occurrence? We anticipate and accept bad sound as a cost of doing business. Hell, how can people still listen to 128kbps and lower bit rates?

But watch one of the award shows with a lighting designer or video director, and it makes audio look like angels.:rolleyes:
 
Because your ear can't differentiate with anything higher. There is no point in going higher than 128 bit rate because your ear can only hear so many different sounds, and frankly why use more than 6mb per song?

Bull, I'm a lighting guy and even I can tell the difference! (V0 or death!)

For all the issues thrown at the guys, all the different acts, and the limited amount of rehearsal time, I think they do an excellent job. Getting every bit right for one performer in a night is a difficult enough task, but dozens? I'm willing to cut them some slack. I worked with a guy who ran monitors; one night the cowboy's IEM pack failed mid-show and he was fired for it that night. Mistakes happen, some you can't control, and anyone who suggests you should be perfect or be replaced has never worked in the real world.
 
Because your ear can't differentiate with anything higher. There is no point in going higher than 128 bit rate because your ear can only hear so many different sounds, and frankly why use more than 6mb per song?

While 128kbps is smaller in file size, there is certainly a noticeable difference in audio quality due to the heavy compression. I refuse to create/buy MP3s under 192kbps and prefer something higher, such as a lossless format or uncompressed audio. In a reasonably controlled environment with a decent system or good headphones/amp, do a comparison of the same song as a 128kbps MP3 vs. maybe a stereo SACD, and I think even an "untrained" ear could tell you that there is a big difference... but to each his own. Judging by the billions of 128kbps MP3 downloads, most people don't care.

I think most people are satisfied because they may not know what to listen for, just like what has been mentioned earlier in this thread. People like the award shows even when they have "issues" because, most of the time, that's not what they care about until it interferes with seeing and hearing intelligibly what they tuned in for. I have taken many volunteers (church environment) who had no clue about audio and ruined their listening experience just by teaching them more about the subject. Now they can point out most errors, see them in almost all productions, and have a difficult time just being spectators... I suppose that's our curse as "sound guys".

While there were some major audio issues during that performance, most people on this forum know that there were tons of people and tasks involved that went off flawlessly. People like these shows because of what guys/girls (like those on this forum) can do with their amazing talent. I feel for the production crew, because mistakes happen and this one just happened not to occur in the shadows.
 
Because your ear can't differentiate with anything higher. There is no point in going higher than 128 bit rate because your ear can only hear so many different sounds, and frankly why use more than 6mb per song?
You don't hear digitally; human hearing is analog, and that is infinite in the potential range of sounds. Digital media or data offers numerous advantages in distribution, copying, manipulation, etc., but for analog sources it is still an approximation of the infinite resolution possible with analog signals.

There may also be some confusion between a 128-bit bit depth and a 128kbps bit rate. Standard CD audio is a 44.1kHz sample rate and a 16-bit bit depth for two channels, which equates to a 1411.2kbps bit rate. That is quite a bit greater than 128kbps, and that kind of reduction is made through the use of lossy compression algorithms, essentially selectively throwing out data so as to deal with only the data determined to be most relevant. On the other hand, a 128-bit bit depth with a 44.1kHz sample rate equates to an 11289.6kbps bit rate for a stereo signal, which might sound great but is 88 times the bit rate of a 128kbps stream.
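For anyone who wants to check that arithmetic, here's a quick back-of-the-envelope sketch in Python. It's just illustrative; the helper function and the "4-minute song" figure are my own assumptions, not anything from a spec.

# Rough sanity check of the bit depth vs. bit rate numbers above.

def pcm_bitrate_kbps(sample_rate_hz, bit_depth, channels):
    """Uncompressed PCM bit rate in kilobits per second."""
    return sample_rate_hz * bit_depth * channels / 1000

cd = pcm_bitrate_kbps(44_100, 16, 2)      # standard CD audio
huge = pcm_bitrate_kbps(44_100, 128, 2)   # a hypothetical 128-bit bit depth
mp3 = 128                                 # a 128kbps bit rate after lossy compression

print(f"CD audio:       {cd:.1f} kbps")       # 1411.2 kbps
print(f"128-bit depth:  {huge:.1f} kbps")     # 11289.6 kbps
print(f"vs 128kbps MP3: {huge / mp3:.1f}x")   # ~88.2x

# And the "6 MB per song" idea: a 4-minute song at 128kbps works out to
# 128,000 bits/s * 240 s / 8 / 1,000,000 ≈ 3.8 MB.
print(f"4-min song at 128kbps ≈ {128_000 * 240 / 8 / 1e6:.1f} MB")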
 
You don't hear digitally; human hearing is analog, and that is infinite in the potential range of sounds.
There are some myths that just don't die the way they should. Study how the stereocilia and their tip links work. Human hearing IS digital, the same way that seeing is with the cones and rods. However, it is not as intuitive as seeing.

In both cases the final perception is the result of processing by the brain.

Andre
 
There are some myths that just don't die the way they should. Study how the stereocilia and their tip links work. Human hearing IS digital, the same way that seeing is with the cones and rods. However, it is not as intuitive as seeing.

In both cases the final perception is the result of processing by the brain.
I guess it may depend on what you define as "hearing," but if one looks at the entire hearing process from input to processing, then human hearing is both analog and digital. The stimulus is analog, as is the conversion of the fluid (air) pressure to mechanical motion, that mechanical motion to fluid pressure in the inner ear, and the fluid pressure to electrical potential by the stereocilia. I believe that from there it is essentially a bit like a transistor or gate: the movement of the stereocilia generates an electrical potential (the microphonic potential) that is analog and that at some threshold level triggers a neuron discharge, at which point it becomes binary. So unless you somehow bypass the actual hearing mechanism and go right to the synapses firing, which is a binary or single-bit event, the hearing system is responding to an analog stimulus and the hearing process includes analog elements.

Not my area of expertise, but my understanding is that if one does look at the 'digital' aspect, then there are something like 15,000 nerve endings associated with the hearing process, so that's apparently comparable to a 15,000-bit bit depth. With neurons able to fire from around 1 to 1,000 times per second, using an average of 100 times per second, that represents 1,500,000 bps or 1,500kbps. So neither a 128-bit bit depth nor a 128kbps bit rate seems to come near the potential of human hearing.

However, in the context being addressed my point was really that the stimuli to which human hearing responds are analog.
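To put rough numbers on that estimate, here's the same arithmetic as a short Python sketch. The figures are just the assumptions stated above (15,000 nerve endings, ~100 firings per second on average), not physiological facts.

# Back-of-the-envelope version of the estimate above (assumed figures, not facts).
nerve_endings = 15_000       # auditory nerve endings, per the estimate above
avg_firing_rate_hz = 100     # assumed average firing rate (range ~1 to 1,000 per second)

bits_per_second = nerve_endings * avg_firing_rate_hz
print(f"{bits_per_second:,} bps = {bits_per_second / 1000:.0f} kbps")  # 1,500,000 bps = 1500 kbps

# Compared with a 128kbps MP3 stream and 1411.2kbps CD audio:
print(f"vs 128kbps MP3: {bits_per_second / 1000 / 128:.1f}x")       # ~11.7x
print(f"vs CD audio:    {bits_per_second / 1000 / 1411.2:.2f}x")    # ~1.06x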
 
Because your ear can't differentiate with anything higher. There is no point in going higher than 128 bit rate because your ear can only hear so many different sounds, and frankly why use more than 6mb per song?

Ever since I heard Julian Treasure's TED talk about audio affecting our mental health, I've become pretty curious about the effects of digital quality. Do you have peer-reviewed articles or studies supporting that? I'm interested in finding out where the threshold lies for the human brain.

While I agree that I can't tell the difference between 128kbps and CD quality on a Nickelback song, I can absolutely tell the difference between a 128kbps and a CD-quality version of an Andrea Bocelli performance.;)
 
Ever since I heard Julian Treasure's TED talk about audio affecting our mental health, I've become pretty curious about the effects of digital quality. Do you have peer-reviewed articles or studies supporting that? I'm interested in finding out where the threshold lies for the human brain.

While I agree that I can't tell the difference between 128kbps and CD quality on a Nickelback song, I can absolutely tell the difference between a 128kbps and a CD-quality version of an Andrea Bocelli performance.;)


I'll have to pull out my archived files, but I did do an entrance research paper to get into a study program, and there was an article mentioned in there about it. I would also like to note that I did mistake a 128-bit depth for a 128kbps bit rate. It has also come up many times when I was doing some online DJ work and deciding whether the bandwidth required by 192kbps was worth it compared to 128kbps.

Also, to note: even though you have 15,000 receptors, it all still has to go through a "digital system." Just because you have 10 mice hooked up to your computer doesn't mean you get 10 cursors...

It's similar to your taste buds. The strongest taste is going to be the most prevalent. Of course you'll taste a few less intense flavors, but you won't really notice them.
 
Though I have never worked an event of the magnitude of the Grammys or Tonys, I have worked many events that feature 15+ acts in very quick succession, using 12 wireless handhelds, with every act having multiple instruments or accompaniment music at the very least. I blame the backstage hands. If I cannot depend on my guys and girls backstage to get the right mic to the right person, then it's out of my hands. I'm not backstage to verify that that has occurred. I can always use the ClearCom to do a check, but there isn't always time for that when you are rushed. A good sound check and good documentation of what goes where are what can really make or break you during a quick-change show. If the worst happens and it's clear the mics have gotten mixed up, you can always PFL or send the individual mics to your console monitor headphones to try to sort it out on the fly.
 
