Lossless? Really?

Posted by: madgerald on 29 April 2014

Not sure if this is the right place to ask this Q but pretty sure someone will be able to help...

 

Following the principle that the original is best (I've been brainwashed by vinylheads), and that if you mess with something you make it worse, it follows that if you are going to listen to digital music then CD must be the best format (unless you can get your hands on the original uncompressed file).  

 

A good friend of mine disagrees (yes, he is in IT) and says that a ripped "lossless" copy will be as good as the original CD since it's all just 1s and 0s anyway.  The only way to settle the argument would be to do a blind test streaming a ripped "lossless" CD against the original played on my CDX2, through the same DAC, amp and speakers, to see if we can hear the difference.  Trouble is I don't have a separate DAC and am not about to buy one just to prove him wrong.

 

Has anyone conducted such a test and if so what were the results?  Feel free to point me at a previous post if this has been discussed before. 

 

Thanks if you can prove me righteous  

 

Bill 

Posted on: 06 May 2014 by Marky Mark

The usual obfuscation going on here. I think it is worthwhile looking at the basic facts rather than dusty academic papers. At the risk of generalisation and repetition, I might describe the 'challenge' as:

1) FLAC file (a track of 5 mins length) = 40MB
2) Throughput on home network = 10MB per second
3) Network cards / hubs throughput = 10MB per second

Given the above, I might venture the following high-level solutions:

1) transfer entire file to DAC memory in 4 seconds then play in entirety
2) smooth stream over 5 mins at 0.13 MB per second (approx 1/80th of operating throughput) adding a bit on for buffering.
3) some hybrid of the above

 

Whichever I do, decoding is a very, very, very small job - an in-memory operation that could be completed in milliseconds, whether all at once or in drip-feed fashion, for a complete FLAC file with negligible processing. Sure, there may be a processing side-effect - if you're using a ZX81.
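
To make the arithmetic concrete, here is a minimal sketch using the example figures above (40MB file, 5-minute track, 10MB/s link). The numbers are illustrative, not measurements of anyone's actual network:

```python
# Back-of-envelope check of the figures quoted above (illustrative values,
# not measurements of a real network).

FILE_SIZE_MB = 40          # 5-minute FLAC track
TRACK_SECONDS = 5 * 60     # 300 s
LINK_MB_PER_S = 10         # assumed home-network throughput

# Option 1: pull the whole file into the renderer's memory up front.
bulk_transfer_s = FILE_SIZE_MB / LINK_MB_PER_S            # ~4 s

# Option 2: stream it smoothly over the length of the track.
stream_rate_mb_s = FILE_SIZE_MB / TRACK_SECONDS           # ~0.13 MB/s
headroom = LINK_MB_PER_S / stream_rate_mb_s               # ~75x spare capacity

print(f"bulk transfer: {bulk_transfer_s:.1f} s")
print(f"smooth stream: {stream_rate_mb_s:.2f} MB/s "
      f"({headroom:.0f}x below the link's throughput)")
```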

 

As mentioned on here a long time ago, a Raspberry Pi could be adapted for streaming duties. Certainly it could hold the track above in memory. As could many smartphones and even some wrist-watches.

Posted on: 06 May 2014 by Simon-in-Suffolk

Hi all - for the edification of those who are interested, here is a book, "An Introduction to Information Theory - Symbols, Signals and Noise" (the intro is free to read). It is an introduction to signal theory, communication theory and entropy, and shows how you can compress without loss of information - or convey information efficiently - the cornerstone of many concepts in ICT. Additionally it shows why digital transmission using analogue means has a statistical outcome rather than being certain.

This text is not particularly mathematical:

 

http://books.google.co.uk/books?hl=en&lr=&id=eKvhiI2ogwEC&oi=fnd&pg=PR2&dq=signal+theory+entropy&ots=bJmJJovuLs&sig=0Oc9o3HsH35bi9CkbLlTmosk_7M#v=onepage&q=signal%20theory%20entropy&f=false

 

I find this branch of physics truly fascinating - alas it is an area that marketing overviews don't really go into, as it shows the real world is not straightforward - but it is what allows FLAC/ALAC to accurately contain PCM in a compressed format with no loss of information, and why data networks need reliable transport protocols so as not to lose information.
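
If anyone wants to see the "no loss of information" part for themselves, here is a minimal sketch - zlib is just a convenient stand-in for FLAC/ALAC, and the "PCM" is made-up data, but the principle of a bit-perfect round trip is the same:

```python
# Lossless round-trip sketch: zlib stands in for FLAC/ALAC here, and the
# "PCM" is synthetic, but the point is identical - a lossless codec gives
# back exactly the bits it was given.
import zlib

# Pretend this is a block of samples from a slowly varying signal; real audio
# has similar sample-to-sample redundancy, which is what FLAC exploits.
pcm = bytes((i // 64) % 256 for i in range(1 << 16))

compressed = zlib.compress(pcm, level=9)
restored = zlib.decompress(compressed)

assert restored == pcm          # bit-identical: no information was lost
print(f"{len(pcm)} bytes -> {len(compressed)} bytes, recovered exactly")
```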

 

Simon

Posted on: 06 May 2014 by Big Bill

Simon, yes, an interesting introductory text, but it had the feeling of being quite old; certainly there wasn't much mention of digital.  There was an interesting graph of how a square wave (which can never exist as a perfect square, btw) becomes degraded as it passes down a cable.  That is the point I was making earlier about triggering: as long as it's not degraded below a certain trigger voltage it will be read correctly.
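
Just to make the triggering point concrete, here is a toy sketch - the voltages and noise figures are invented purely for illustration. The received waveform is nothing like a clean square wave, yet every bit is still read back correctly because nothing crosses the decision threshold:

```python
# Toy illustration of the trigger-voltage point: the waveform on the wire is
# degraded, but the bits are read correctly as long as each level stays on
# the right side of the decision threshold. All values are invented.
import random

bits = [1, 0, 1, 1, 0, 0, 1, 0]
HIGH, LOW, THRESHOLD = 3.3, 0.0, 1.65      # hypothetical logic levels (volts)

# Simulate attenuation and noise on the cable.
received = [(HIGH if b else LOW) * 0.7 + random.uniform(-0.3, 0.3) for b in bits]

recovered = [1 if v > THRESHOLD else 0 for v in received]
print(recovered == bits)   # True, even though the wave is far from square
```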

I could not see anything about compression, and it is not a free text btw - you just see a bit of it when you follow your link.

One other thing to note is that when we talk about networks they are not necessarily the same networks he is talking about.

 

To Jude & Jota:  The only thing we can do is to trust our ears, even if they and our brains are far from ideal.  We have no other choice, though it might help a bit if manufacturers started quoting figures that are (a) consistent and (b) actually mean something.

Posted on: 06 May 2014 by Simon-in-Suffolk

Bill - cheers - yes, only the intro is free - I have corrected my post. There seem to be several newer papers from the IEEE on this topic, but clearly these are more involved and are not free either.

But the principles of entropy were developed in the 40s, I believe, so it's not a particularly new science - it's just that there are new applications using it. Contrary to perhaps many people's views on this, a lot of these concepts are pretty well established - don't you agree?

 

BTW encoding and binary digits are covered in chapter IV, and efficient encoding (compression) in chapter VII. The book does address concepts pertinent to digital communication networks as well as other electrical networks.

Simon

Posted on: 06 May 2014 by Dozey

The principles of entropy were developed in the 1870s by Boltzmann, Maxwell and Gibbs.

 

Information entropy (aka Shannon entropy) was developed by Claude Shannon in 1948 in his paper on communication theory.

Posted on: 06 May 2014 by Simon-in-Suffolk

Dozey - thanks, it was Shannon entropy I was referring to, as opposed to the earlier thermodynamic entropy.

Posted on: 06 May 2014 by Jude2012
Originally Posted by Big Bill:

Simon, yes, an interesting introductory text, but it had the feeling of being quite old; certainly there wasn't much mention of digital.  There was an interesting graph of how a square wave (which can never exist as a perfect square, btw) becomes degraded as it passes down a cable.  That is the point I was making earlier about triggering: as long as it's not degraded below a certain trigger voltage it will be read correctly.

I could not see anything about compression, and it is not a free text btw - you just see a bit of it when you follow your link.

One other thing to note is that when we talk about networks they are not necessarily the same networks he is talking about.

 

To Jude & Jota:  The only thing we can do is to trust our ears, even if they and our brains are far from ideal.  We have no other choice, though it might help a bit if manufacturers started quoting figures that are (a) consistent and (b) actually mean something.

Agreed.  This is where people who advocate scientific tests as the ultimate proof may have an issue.  In reality it is a combination of science and judgement.

Posted on: 06 May 2014 by Jude2012
Originally Posted by Marky Mark:

The usual obfuscation going on here. I think it is worthwhile looking at the basic facts rather than dusty academic papers. At the risk of generalisation and repetition, I might describe the 'challenge' as:

1) FLAC file (a track of 5 mins length) = 40MB
2) Throughput on home network = 10MB per second
3) Network cards / hubs throughput = 10MB per second

Given the above, I might venture the following high-level solutions:

1) transfer entire file to DAC memory in 4 seconds then play in entirety
2) smooth stream over 5 mins at 0.13 MB per second (approx 1/80th of operating throughput) adding a bit on for buffering.
3) some hybrid of the above

 

Whichever I do, decoding is a very, very, very small job - an in-memory operation that could be completed in milliseconds, whether all at once or in drip-feed fashion, for a complete FLAC file with negligible processing. Sure, there may be a processing side-effect - if you're using a ZX81.

 

As mentioned on here a long time ago, a Raspberry Pi could be adapted for streaming duties. Certainly it could hold the track above in memory. As could many smartphones and even some wrist-watches.

Perhaps you are right, perhaps you are wrong.  There is no one solution for any one set-up or any one set of ears.  As for obfuscation of facts .......

Posted on: 06 May 2014 by andarkian

Time for a little more heresy, as this is all getting too obscure for me. I have no idea who Sharron Entropy is, but I'm sure she's a fine lass. ATM I am listening to Electric Ladyland, downloaded to my MacBook Air in AAC from iTunes, being fed via WiFi to my B&W A7, which was all I was prepared to put into my conservatory. The MacBook Air is all SSD, so no pesky disks to worry about. I have to say I am very happy with the clarity, quality, precision and all-round depth delivered by the A7. And as I look around my little A7, not a bit dropped out anywhere.

Posted on: 06 May 2014 by Marky Mark
Originally Posted by Jude2012:
Originally Posted by Marky Mark:

The usual obfuscation going on here. I think it is worthwhile looking at the basic facts rather than dusty academic papers. At the risk of generalisation and repetition, I might describe the 'challenge' as:

1) FLAC file (a track of 5 mins length) = 40MB
2) Throughput on home network = 10MB per second
3) Network cards / hubs throughput = 10MB per second

Given the above, I might venture the following high-level solutions:

1) transfer entire file to DAC memory in 4 seconds then play in entirety
2) smooth stream over 5 mins at 0.13 MB per second (approx 1/80th of operating throughput) adding a bit on for buffering.
3) some hybrid of the above

 

Whichever I do, decoding is a very, very, very small job - an in-memory operation that could be completed in milliseconds, whether all at once or in drip-feed fashion, for a complete FLAC file with negligible processing. Sure, there may be a processing side-effect - if you're using a ZX81.

 

As mentioned on here a long time ago, a Raspberry Pi could be adapted for streaming duties. Certainly it could hold the track above in memory. As could many smartphones and even some wrist-watches.

Perhaps you are right, perhaps you are wrong.  There is no one solution for any one set-up or any one set of ears.  As for obfuscation of facts .......

There is no perhaps: the above simplistic measures of capacity and throughput are a perfectly sound example. You might work out what the specific parameters are for your specific situation if you so desired, but I doubt it makes much odds. This is about focusing on real issues rather than making a song and dance about non-issues.

 

Regarding the right solution for anyone's ears, that isn't easy to compare notes on without everyone having the device(s) in question available to listen to and all being in the same shared space. That is not going to happen on a forum.

Posted on: 06 May 2014 by Simon-in-Suffolk

 

>>

 I am listening to Electric Ladyland, downloaded to my MacBook Air in AAC from iTunes, being fed via WiFi to my B&W A7, which was all I was prepared to put into my conservatory.

>>

You are sorted then  

Posted on: 06 May 2014 by Big Bill
Originally Posted by Simon-in-Suffolk:

Bill - cheers - yes, only the intro is free - I have corrected my post. There seem to be several newer papers from the IEEE on this topic, but clearly these are more involved and are not free either.

But the principles of entropy were developed in the 40s, I believe, so it's not a particularly new science - it's just that there are new applications using it. Contrary to perhaps many people's views on this, a lot of these concepts are pretty well established - don't you agree?

 

BTW encoding and binary digits are covered in chapter IV, and efficient encoding (compression) in chapter VII. The book does address concepts pertinent to digital communication networks as well as other electrical networks.

Simon

Hey, I just lost my post - what happened to it?  I was typing it and then scrolled up the page and it went!

 

Probably just as well because I was just informing Simon that the concept of entropy dates back to the 19th century and noticed that Dozey had already chimed in.

 

btw Simon, remember I am a chemistry graduate, so I first learnt about Gibbs Free Energy and Entropy in the 1970s.  You know, back when there were still Rock Bands in the UK.

 

Entropy (Time's Arrow) is an underlying part of the nature of our universe; it is fundamental to Life, the Universe and Everything.  It is also a bit of a bind, because for one thing it predicts the Heat Death of our Universe - yes, The New Den, everything.

 

I cannot understand how new applications use it, though; I think it would have to be that new applications come to terms with it - rather like GPS has to come to terms with the Einstein view of time.

 

Perhaps you could describe some of these applications Simon and explain how they use it?

 

WAT: Have to say I've never been a big Moody's fan, although I still play a lot of Prog - Camel, Caravan, Soft Machine etc - thank you, Canterbury.  But you mentioned someone magical: Sandy Denny. That girl had a voice crafted in heaven.  I saw her twice in Fairport and once in Fotheringay; she was simply magical.  A very sad loss, and her death was such a terrible and avoidable tragedy.  Actually guys, I am starting to fill up here - speak to you later.

Posted on: 06 May 2014 by alan33

Chiming in with some trepidation...

 

My comment is to remind folks that the entire notion and existence of in-built error correction / data recovery schemes for digital encoding (of anything) is all the evidence we need that digital data storage and transmission is not error free... There are lots of ways things can go wrong, and they're generally different depending on where and when in the process they happen. Corrected digital errors cause no impact (on reading stored data, on computer program execution, on telecommunication fidelity, on audio quality): if they're corrected, they aren't errors any longer, at least as far as the next action is concerned. Uncorrected digital errors may have large or small impact - the apocryphal "single bit flipped" problem gets managed in a variety of ways (sometimes you have enough information to know there's an error, but not enough information to fix it). Most of the information theory that's being introduced conceptually in this thread has to do with how errors creep in, how one can know that, how they can be corrected, etc. This stuff is nice to know, of course, but it may not be very productive to dive into challenges beyond the task of gaining familiarity. 
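
To make the detect-versus-correct distinction concrete, here is a toy sketch. This is emphatically not what CDs or networks actually use (they rely on far cleverer codes such as Reed-Solomon); it is just the simplest possible illustration of having enough information to detect an error versus enough to fix it:

```python
# Toy contrast between "detect" and "correct". Real systems use codes like
# Reed-Solomon; this is only the simplest illustration of the idea.

def parity_bit(bits):
    """One extra check bit: detects any single flipped bit, but can't say which."""
    return sum(bits) % 2

def encode_rep3(bits):
    """Send every bit three times; a single flip per group can be voted out."""
    return [b for b in bits for _ in range(3)]

def decode_rep3(coded):
    return [1 if sum(coded[i:i + 3]) >= 2 else 0 for i in range(0, len(coded), 3)]

data = [1, 0, 1, 1]

# Detection only: we learn that *something* changed, not which bit.
check = parity_bit(data)
corrupted = data[:]
corrupted[2] ^= 1
print(parity_bit(corrupted) != check)     # True: an error was detected

# Detection and correction: the redundancy lets us repair the flip.
sent = encode_rep3(data)
sent[7] ^= 1                              # one bit flipped in transit
print(decode_rep3(sent) == data)          # True: majority vote fixed it
```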

 

That said, we have high confidence that the analog conversion process generally begins with a (near) perfect digital stream, thanks to the redundancy and correction. As such, it feels to me like we're having two (or more?!) conversations at the same time as if there is only one topic. Not all the audio degradation is due to failures in digital fidelity, and extraction, conversion, and processing challenges are likely responsible for the way different end-to-end systems sound and, to some extent, for the different preferences expressed by different listeners. 

 

One common thread of experience seems to be that audio quality is improved by segregating effort in different system elements (famously, transcoding before streaming rather than in the renderer) and minimizing known contamination configurations (famously, ensuring good cabling and network setup). That these may not address digital errors directly is neither here nor there I think!

 

Thanks to all contributors for a stimulating thread, alan

 

 

Posted on: 06 May 2014 by Simon-in-Suffolk

 

Hi Bill - regarding examples of applications using information theory entropy:

 

Wikipedia is our friend.

 

Information Theory Entropy:

 

which allows lossless compression:

 

which includes applications such as FLAC, ALAC, DST, WMA Lossless etc

 

Further, when the data rate is above the 'Shannon capacity' for the information being carried, and the data is structured in a particular way, random errors can be corrected - such as

 

Reed-Solomon error correction:

 

which is used to encode PCM for recoverability on CDs, for example, and to recover data framing errors on ADSL accesses.
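
For the curious, here is a minimal sketch of what "entropy" means in practice - the average number of bits per symbol any lossless code needs, which is the quantity FLAC/ALAC work against. The example data is made up:

```python
# Minimal Shannon-entropy sketch: the average bits-per-symbol floor for any
# lossless code. Example data is made up.
import os
from collections import Counter
from math import log2

def shannon_entropy(data: bytes) -> float:
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A repetitive "signal" has low entropy and so compresses well;
# random bytes sit near 8 bits/byte and barely compress at all.
print(shannon_entropy(b"aaaaabbbcc" * 100))   # ~1.49 bits per byte
print(shannon_entropy(os.urandom(10_000)))    # ~8.0 bits per byte
```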

 

Simon

Posted on: 06 May 2014 by Big Bill
Originally Posted by Simon-in-Suffolk:

 

Hi Bill - regarding examples of applications using information theory entropy:

 

Wikipedia is our friend.

 

Information Theory Entropy:

 

which allows lossless compression:

 

which includes applications such as FLAC, ALAC, DST, WMA Lossless etc

 

Further, when the data rate is above the 'Shannon capacity' for the information being carried, and the data is structured in a particular way, random errors can be corrected - such as

 

Reed-Solomon error correction:

 

which is used to encode PCM for recoverability on CDs, for example, and to recover data framing errors on ADSL accesses.

 

Simon

Simon

 

But is that making use of Entropy?  Or is it working with it?  Not sure to be honest.

 

Anyway, as the post above your last one states, there are two discussions going on, and you keep quoting stuff on one which I don't believe is relevant to the discussion I started.

 

If you can't accept that digital data can flow from a NAS, or USB HD, or whatever, into the RAM of a streamer (its buffer) 100% error free, then I can't see that you really do understand digital transmissions.  I am not worried about compression or FLAC v WAV v ALAC etc - they are side shows.  But if you believe that the data that flows into the buffers of our streamers is generally corrupted to such an extent as to degrade the audio output, then you need to answer the question I posted right at the beginning:  Why do our computers work?  How is it that time and time again we can load and run a program from disk?  How can we open an Excel spreadsheet that is the same as the one we saved?  etc etc.
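
To put that concretely, here is a minimal sketch (the file names are made up for illustration): hashing a file and its copy and comparing the digests is exactly how we routinely prove that two copies are bit-identical.

```python
# Sketch of how bit-perfect copies are verified in practice: hash both files
# and compare digests. The file names here are purely hypothetical.
import hashlib

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Any single flipped bit anywhere in either file would change its digest.
print(sha256_of("original_rip.flac") == sha256_of("copy_on_nas.flac"))
```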

Now if you are going to answer then I really don't want a load of invented techno-babble.  I want an answer to a simple question.

Posted on: 06 May 2014 by Simon-in-Suffolk

What?? Why the insults and the defensive position? I very much understand this, to the point that I make a very good living from it and have even been an ITU rapporteur on aspects of this with respect to digital communication standards in the past - what is your beef? Many on this forum like my explanations; if you don't, please, please don't invite me to discuss with you only to insult me. It really is quite disgraceful and completely unnecessary, not to mention rude.

 

If you don't understand information theory and data networks, then I have no issue with that, as some of the concepts can be counter-intuitive, and I certainly won't insult you about it - but hopefully I have helped the OP with his original query.

 

This forum is about discussion, advice and informed debate - and since you joined it on the 1st of May you have taken a hostile, trolling-like attitude to much of what you have asked me to comment on. There are other places on the web for that, but not here.

Posted on: 06 May 2014 by Big Bill
Originally Posted by Simon-in-Suffolk:

What?? Why the insults and the defensive position? I very much understand this, to the point that I make a very good living from it and have even been an ITU rapporteur on aspects of this with respect to digital communication standards in the past - what is your beef? Many on this forum like my explanations; if you don't, please, please don't invite me to discuss with you only to insult me. It really is quite disgraceful and completely unnecessary, not to mention rude.

 

If you don't understand information theory and data networks, then I have no issue with that, as some of the concepts can be counter-intuitive, and I certainly won't insult you about it - but hopefully I have helped the OP with his original query.

 

This forum is about discussion, advice and informed debate - and since you joined it on the 1st of May you have taken a hostile, trolling-like attitude to much of what you have asked me to comment on. There are other places on the web for that, but not here.

So you refuse to answer my question and just resort to bombast!  I have been trying to get an answer to this question right from the start, and I feel sure that a leading expert in information theory, like yourself, must be able to tell me how a comms method works perfectly well in one application but becomes problematical when we apply it to another application, like feeding bits to a DAC.

But I suspect you will duck and dive as before.

Posted on: 06 May 2014 by Simon-in-Suffolk

Sorry - I don't understand what you are referring to. I suggest you refrain from inviting me to comment if you can't be civil.

Thank you.

Posted on: 06 May 2014 by Big Bill
Originally Posted by Simon-in-Suffolk:

Sorry - I don't understand what you are referring to. I suggest you refrain from inviting me to comment if you can't be civil.

Thank you.

So I take that as a NO then?

Posted on: 06 May 2014 by Simon-in-Suffolk
Originally Posted by alan33

My comment is to remind folks that the entire notion and existence of in-built error correction / data recovery schemes for digital encoding (of anything) is all the evidence we need that digital data storage and transmission is not error free... There are lots of ways things can go wrong, and they're generally different depending on where and when in the process they happen. Corrected digital errors cause no impact (on reading stored data, on computer program execution, on telecommunication fidelity, on audio quality): if they're corrected, they aren't errors any longer, at least as far as the next action is concerned. Uncorrected digital errors may have large or small impact - the apocryphal "single bit flipped" problem gets managed in a variety of ways (sometimes you have enough information to know there's an error, but not enough information to fix it). Most of the information theory that's being introduced conceptually in this thread has to do with how errors creep in, how one can know that, how they can be corrected, etc. This stuff is nice to know, of course, but it may not be very productive to dive into challenges beyond the task of gaining familiarity. 

 

That said, we have high confidence that the analog conversion process generally begins with a (near) perfect digital stream, thanks to the redundancy and correction. As such, it feels to me like we're having two (or more?!) conversations at the same time as if there is only one topic. Not all the audio degradation is due to failures in digital fidelity, and extraction, conversion, and processing challenges are likely responsible for the way different end-to-end systems sound and, to some extent, for the different preferences expressed by different listeners. 

 

Alan - good summary in my opinion.

 

Simon

 

Posted on: 06 May 2014 by Marky Mark
Originally Posted by Simon-in-Suffolk:

What?? Why the insults and the defensive position? I very much understand this, to the point that I make a very good living from it and have even been an ITU rapporteur on aspects of this with respect to digital communication standards in the past - what is your beef? Many on this forum like my explanations; if you don't, please, please don't invite me to discuss with you only to insult me. It really is quite disgraceful and completely unnecessary, not to mention rude.

 

If you don't understand information theory and data networks, then I have no issue with that, as some of the concepts can be counter-intuitive, and I certainly won't insult you about it - but hopefully I have helped the OP with his original query.

 

This forum is about discussion, advice and informed debate - and since you joined it on the 1st of May you have taken a hostile, trolling-like attitude to much of what you have asked me to comment on. There are other places on the web for that, but not here.

He has a point though Simon. Sorry to say this but your posts are often full of meaningless babble. This may impress some but the point, if there is one, is often lost.

 

Often your posts seem to follow the path of badly describing something very abstract which we need to be saved from. Then not saving us from it.

Posted on: 06 May 2014 by Big Bill
Originally Posted by Marky Mark:
Originally Posted by Simon-in-Suffolk:

What?? Why the insults and the defensive position? I very much understand this, to the point that I make a very good living from it and have even been an ITU rapporteur on aspects of this with respect to digital communication standards in the past - what is your beef? Many on this forum like my explanations; if you don't, please, please don't invite me to discuss with you only to insult me. It really is quite disgraceful and completely unnecessary, not to mention rude.

 

If you don't understand information theory and data networks, then I have no issue with that, as some of the concepts can be counter-intuitive, and I certainly won't insult you about it - but hopefully I have helped the OP with his original query.

 

This forum is about discussion, advice and informed debate - and since you joined it on the 1st of May you have taken a hostile, trolling-like attitude to much of what you have asked me to comment on. There are other places on the web for that, but not here.

He has a point though Simon. Sorry to say this but your posts are often full of meaningless babble. This may impress some but the point, if there is one, is often lost.

 

Often your posts seem to follow the path of badly describing something very abstract which we need to be saved from. Then not saving us from it.

Thank you Mark, I was beginning to think it was just me.

 

btw Simon, I take serious exception to being called a troll.  You are the one who always disputes everything I say - not a discussion but out-and-out criticism - and you call me a troll.

 

For example, to my first post you replied "Good post, unfortunately some aspects not quite right in there (IMO)", but didn't really go on to say where I was wrong.  You just started on about the inevitable errors that occur in digital transmission.  I know about that, and I know about the mechanisms that make such transmissions 100% reliable, and I thought that was obvious in my post.

 

How was that post trolling?

Posted on: 06 May 2014 by Simon-in-Suffolk

Marky - sorry to hear that - it's not meaningless to me, but then I have worked with this stuff for most of my career. Sometimes I perhaps assume it's easier for others who have not studied or worked with this fascinating area to pick up on this - and yes, sometimes my phrasing could be better.

 

However I do get feedback from some, on and off this forum, who do find it useful, and that motivates me to participate - so if you can't make head nor tail of it and it irritates you, could I respectfully ask you to ignore it. But I will never knowingly insult.

 

 

Posted on: 06 May 2014 by alan33
Originally Posted by Big Bill:
... if you believe that the data that flows into the buffers of our streamers is generally corrupted to such an extent as to degrade the audio output, then ...


Hi Big Bill,

I'm saddened by the spiralling tone in this thread. In quoting your statement, I'm admitting that my post above was a failure: I was trying to say that nobody is holding this extreme position so we should acknowledge that and move on.

 

I'd prefer the dialog to focus on contributing to understanding, rather than challenging each other to prove things or defend positions that are not being held. Simon has shared several interesting experimental observations using networking tools related to how data transfer can introduce unique noise sources that, for example, make clocking or RF rejection a real engineering challenge. This is not the same as saying data corruption is the killer problem... and it's counterproductive to insist that someone else is saying that.

Regards, alan

Posted on: 06 May 2014 by Big Bill
Originally Posted by alan33:
Originally Posted by Big Bill:
... if you believe that the data that flows into the buffers of our streamers is generally corrupted to such an extent as to degrade the audio output, then ...


Hi Big Bill,

I'm saddened by the spiralling tone in this thread. In quoting your statement, I'm admitting that my post above was a failure: I was trying to say that nobody is holding this extreme position so we should acknowledge that and move on.

 

I'd prefer the dialog to focus on contributing to understanding, rather than challenging each other to prove things or defend positions that are not being held. Simon has shared several interesting experimental observations using networking tools related to how data transfer can introduce unique noise sources that, for example, make clocking or RF rejection a real engineering challenge. This is not the same as saying data corruption is the killer problem... and it's counterproductive to insist that someone else is saying that.

Regards, alan

But Alan, I thought that was exactly what he was saying!  He certainly disagreed with me when I took the counter-argument.

But I do take your well-made point. It's just that seeing someone obviously cutting and pasting instead of using their own voice - and it is noticeable when all of a sudden someone's grammar becomes perfect - shouldn't annoy me, but it does.