Ever noticed that if you keep turning up the brightness on a TV, the blacks go kind of grey? Looks shit, doesn't it? What's happening is that you're reducing the contrast range available.
If that doesn't please your eyes, imagine the same situation with audio. This is basically what producers have been doing for the past 20 years. You mad?
Basically, to make the music sound louder, they compress the dynamic range: they take all the bits that were quiet and push them up until they're nearly as loud as the loud bits.
You might have heard this on a classical album: all the instruments seem to be the same volume, and you'll hear a pronounced hiss before a quiet instrument plays (a triangle, say). That's because a previously quiet part of the track has been amplified, noise floor and all.
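To make that concrete, here's a toy sketch in Python (using numpy; the squash() function and the 0.3 "amount" are made up for illustration, not anything a real mastering engineer uses) of what heavy compression does to a track with a quiet passage and a loud one:

# Toy sketch of heavy dynamic range compression: quiet samples get
# boosted far more than loud ones, so everything ends up at roughly
# the same level. Not a real mastering chain.
import numpy as np

def squash(signal, amount=0.3):
    # Raise each sample's magnitude to a power < 1, then re-normalise
    # to full scale. amount=1.0 leaves the signal alone; smaller
    # values squash the dynamic range harder.
    squashed = np.sign(signal) * np.abs(signal) ** amount
    return squashed / np.max(np.abs(squashed))

# A "track" with a quiet passage (plus a little hiss) followed by a loud one.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 44100)
quiet = 0.05 * np.sin(2 * np.pi * 440 * t) + 0.005 * rng.standard_normal(t.size)
loud = 0.9 * np.sin(2 * np.pi * 440 * t)
track = np.concatenate([quiet, loud])

loudened = squash(track)
print("before: quiet peak %.2f, loud peak %.2f"
      % (np.abs(track[:44100]).max(), np.abs(track[44100:]).max()))
print("after:  quiet peak %.2f, loud peak %.2f"
      % (np.abs(loudened[:44100]).max(), np.abs(loudened[44100:]).max()))

A real limiter works on short-term loudness envelopes with attack and release times rather than a per-sample curve like this, but the end result is the same: the quiet bit, hiss included, comes out nearly as loud as the loud bit.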
What's wrong with that, you ask? I like hearing everything. Well, to begin with, the hiss is annoying. More importantly, it gives the recording a distinctly artificial sound. If an instrument is barely perceptible, it's probably supposed to be.
What good is a crescendo if the harp is as loud as the brass?
This isn't just something that affects stuffy classical albums; it's actually far worse with rock.
Apparently Bob Dylan has criticised increasing loudness levels: "You listen to these modern records, they're atrocious, they have sound all over them. There's no definition of nothing, no vocal, no nothing, just like—static." But Bob Dylan is a dick, and his latest albums are horrendously loud anyway.
As for modern electronic music, it's hard to say which distortion is deliberate, but a dubstep drop works much better if there's actually some silence before it! Luckily for contemporary listeners, most electronic artists take an active interest in the production of their music and devote considerable energy to balancing loudness levels. Whether or not this survives the inevitable remixes and radio edits is another matter entirely.
The loudness war is described in this article.