Bit rate variations

Posted by: noeluk on 15 September 2013

Hi 

Apologies if this is a really dumb question but it is puzzling me. I am re-ripping many of my CDs to Lossless files. I have noticed that different versions of the same song (e.g. the album version, a single version and a version from a compilation) all have different Bit Rates. 

 

Why is this, and is the better version the one with the higher or the lower bit rate?

I guess it is due to each version being mastered at different times? I have read that some 'Remastered' re-releases (which tend to have higher bit rates than the originals) use a lot more compression and so can restrict the quality... is this true?

 

Or does a higher bit rate represent a broader dynamic range, and therefore indicate the better version of the two?

 

Can you straighten out my confusion? Thanks!  

Posted on: 15 September 2013 by mutterback

What software are you using to do the ripping? CD Rips should all be 16/44.1 for the file. The bitrate should always be the same.

 

Some software gives you a dynamic range meter (roughly, the difference between the loudest peaks and the average level). Is that what you're looking at? I believe foobar2000 has this.
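
For what it's worth, here is a very rough sketch of that "loudest vs. average" idea in Python, purely as an illustration (foobar2000's DR meter uses its own, more involved algorithm):

# Rough stand-in for what a dynamic range meter reports:
# the gap in dB between the loudest peak and the average (RMS) level.
import math

def crest_factor_db(samples):
    peak = max(abs(s) for s in samples)
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(peak / rms)

# A pure sine wave: its peak is only ~3 dB above its average level,
# i.e. almost no dynamic range.
sine = [math.sin(2 * math.pi * 440 * n / 44100) for n in range(44100)]
print(round(crest_factor_db(sine), 1))  # ~3.0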

Posted on: 15 September 2013 by Aleg

Bit rate is not relevant when ripping to lossless; it can vary with compression levels.

 

When ripping lossless, only sample rate and bit depth are relevant, and they are constant, determined by the original medium: 44.1 kHz/16-bit for CDs, while DVD-A, DVD-V and BD can also store other combinations.

Bit rate is only an indicator of quality in lossy formats like MP3, AAC, etc.
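
To put some numbers on it, here is a rough back-of-the-envelope sketch (Python, purely illustrative): the uncompressed PCM rate is fixed by the disc, and a lossless encoder only ever comes in somewhere below it, by an amount that depends on the track.

# Uncompressed PCM bit rate is fixed by the medium.
def pcm_bitrate_kbps(sample_rate_hz, bit_depth, channels=2):
    return sample_rate_hz * bit_depth * channels / 1000

print(pcm_bitrate_kbps(44100, 16))  # 1411.2 kbps, the same for every CD track

# A lossless rip of that track will report something below 1411.2 kbps
# (often roughly 600-1000 kbps), depending only on how compressible the music is.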

 

cheers

 

Aleg

Posted on: 15 September 2013 by noeluk
Originally Posted by mutterback:

What software are you using to do the ripping? CD Rips should all be 16/44.1 for the file. The bitrate should always be the same.

 

Hi there mutterback & Aleg, thanks for replying.

 

I use iTunes mainly for convenience, but also Max and XLD, ripping to Apple Lossless format. Yes, this is 16/44.1 for standard audio CDs. In iTunes, each ripped track has a Bit Rate value, and it varies from track to track for Apple Lossless.

 

 

Posted on: 15 September 2013 by mutterback

Puzzled, but a quick Google search revealed that the Apple Lossless bit rate varies with the amount of data on the track.

 

I would stick to FLAC or WAV for your original rips, but I guess Apple Lossless is the way to go if you are using iTunes exclusively.
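
To see where that figure comes from, here is the sort of arithmetic a player does (illustrative only; the file sizes and durations below are made-up numbers):

# The "bit rate" shown for a lossless file is just
# compressed file size divided by track length.
def effective_bitrate_kbps(file_size_bytes, duration_seconds):
    return file_size_bytes * 8 / duration_seconds / 1000

# Two rips of the "same" song from different masterings: the more
# compressible one ends up smaller, so it reports a lower bit rate.
album_version  = effective_bitrate_kbps(34_000_000, 240)   # ~1133 kbps
single_version = effective_bitrate_kbps(26_000_000, 240)   # ~867 kbps
print(round(album_version), round(single_version))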

Posted on: 15 September 2013 by Aleg

As I said, bit rate is meaningless for lossless files, because the outcome depends on the compression levels applied by the ALAC or FLAC compression algorithms, and in lossless audio it is no indicator of quality whatsoever.

 

Bit rate is only useful when looking at lossy formats.

 

If you can't grasp this, please ignore the bit rate indicator.

 

cheers

 

aleg