Why is Wi-Fi degrading the audio signal? A physical explanation?
Posted by: Djef on 25 September 2016
Hi everybody,
I'm appealing to this forum to find an explanation for this simple fact:
If I run a simple test with A (Wi-Fi) and B (wired Ethernet), both playing without interruptions, 20 people out of 20 will think B (Ethernet) sounds better... and they are right... but why?
If I send a copy of a Word document over Wi-Fi or Ethernet I expect the copies to be identical, and they usually are ;-)
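One way to check that last claim is to compare checksums of the two copies; a minimal sketch in Python (the filenames are placeholders, not real files from this thread):

```python
# Minimal sketch: verify that two transferred copies are bit-identical.
# "copy_over_wifi.doc" and "copy_over_ethernet.doc" are hypothetical names.
import hashlib

def sha256(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

print(sha256("copy_over_wifi.doc") == sha256("copy_over_ethernet.doc"))
```

If this prints True, the two network paths delivered exactly the same bits.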
I have a small but much-loved system composed of a UnitiQute and a pair of KEF LS50s, driven by a computer running Foobar2000.
I also have a degree in physics and I work as a computer programmer.
If any techies stumble upon this thread, please speak your mind.
I will start with a possible explanation:
Since power amps are sensitive to modulation of their electrical supply, possibly the Wi-Fi module draws more electrical power than the Ethernet card.
If you have any idea please respond,
and if you know the answer please do.
As a newbie on this forum I cannot expect much... but thanks anyway.
likesmusic posted: banzai posted: I think what likesmusic meant is that exactly the same information is delivered to the DAC regardless of whether it is wireless or wired. However, he misses a crucial point: the timing between data frames in a wireless transmission is not as consistent as in a wired one, and that could cause some degradation in music reproduction.
Sure, timing between frames "could" cause some degradation in the colours of an image sent to a printer, or in the sound of music sent to a DAC. But should it? Is it necessarily so? How exactly do these timing changes do that? Wi-Fi and Ethernet both send packets at inconsistent intervals, and both have vastly more capacity than is needed for music. Surely they should produce indistinguishable results, and if they don't, someone has got something badly wrong. How do they send pictures all those thousands of miles from the space station? And then around the world? Do you think that a scientist in Oxford gets a differently coloured picture from a scientist in Houston? Is Mars maybe not really red because someone used Wi-Fi to print the picture?
And if your argument is that Wi-Fi frames are inconsistent and this causes an audible difference, then it follows that every time you listen to a piece of music via Wi-Fi it will sound different. Does it? If it did, I would just regard my hi-fi as broken.
Hi Likesmusic,
Your statement is that we can all see the same printed picture. I fully agree with that. But this picture is flat, has no natural colours, no 3D, and gives us only limited information about the real thing, although we might recognise it.
But IMO our goal is that we want to see the real thing, in 3D.
As soon as two detection units (ears/eyes) are involved it becomes very complicated and delicate. Our brain processes the signals from these two detectors. You cannot hear a 3D image with one ear, see a 3D image with one eye, or reproduce a 3D image with one loudspeaker.
Our brain has gigantic abilities in processing very, very low-level signal differences. You just have to listen and trust your own brain; it is the thing that made us humans survive in nature. Do not underestimate it just because we think we can explain it all with signal and information theory. Your brain cannot be fooled; better stop trying.
likesmusic posted: banzai posted: I think what likesmusic meant is that exactly the same information is delivered to the DAC regardless of whether it is wireless or wired. However, he misses a crucial point: the timing between data frames in a wireless transmission is not as consistent as in a wired one, and that could cause some degradation in music reproduction.
Sure, timing between frames "could" cause some degradation in the colours of an image sent to a printer, or in the sound of music sent to a DAC. But should it? Is it necessarily so? How exactly do these timing changes do that? Wi-Fi and Ethernet both send packets at inconsistent intervals, and both have vastly more capacity than is needed for music. Surely they should produce indistinguishable results, and if they don't, someone has got something badly wrong. How do they send pictures all those thousands of miles from the space station? And then around the world? Do you think that a scientist in Oxford gets a differently coloured picture from a scientist in Houston? Is Mars maybe not really red because someone used Wi-Fi to print the picture?
And if your argument is that Wi-Fi frames are inconsistent and this causes an audible difference, then it follows that every time you listen to a piece of music via Wi-Fi it will sound different. Does it? If it did, I would just regard my hi-fi as broken.
The analogy between a printer rendering an image and a DAC reproducing music is very interesting, bearing in mind that there are three fundamental technical differences:
1) Buffering: a printer only prints once it has received the full image, or at least a complete page, whereas a DAC plays music on the fly (see the buffer sketch below)
2) The haphazard nature of a Wi-Fi transmission can cause more packet drops, retransmissions and timeouts, thus causing some degradation in the music reproduction.
3) The same piece of music can sound different each time you listen to it via Wi-Fi, especially in the evening when there is a lot of network activity.
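Whether timing variation alone can matter comes down to buffering, and a toy simulation makes the trade-off concrete. This is a minimal sketch with illustrative numbers only (the packet interval and buffer depth are assumptions, not measurements of any actual streamer):

```python
# Toy model: a playout buffer fed by jittery packet arrivals and drained
# at a constant audio rate. Dropouts occur only if the buffer runs dry.
import random

PACKET_MS = 8.3    # one 1460-byte frame of CD audio roughly every 8.3 ms
BUFFER_MS = 500.0  # assumed buffer depth in ms of audio (illustrative)

def simulate(jitter_ms, n_packets=10_000, seed=1):
    random.seed(seed)
    level = BUFFER_MS        # ms of audio currently buffered
    prev_t = 0.0
    underruns = 0
    for i in range(1, n_packets + 1):
        # Packet i is scheduled at i * PACKET_MS but delayed by jitter.
        t = max(i * PACKET_MS + random.uniform(0.0, jitter_ms), prev_t)
        level -= t - prev_t  # playback drained this much audio meanwhile
        if level < 0:
            underruns += 1   # buffer ran dry: an audible dropout
            level = 0.0
        level = min(level + PACKET_MS, BUFFER_MS)  # packet adds audio
        prev_t = t
    return underruns

print(simulate(jitter_ms=5))    # mild Wi-Fi-like jitter: no underruns
print(simulate(jitter_ms=600))  # delays exceeding the buffer: dropouts
```

The point of the sketch: as long as the worst-case delay stays below the buffer depth, jitter is absorbed completely and the samples reaching the DAC are identical; any audible difference would then have to come from somewhere other than the data itself.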
OK, for all you people who have DACs or streamers that cannot make Wi-Fi sound the same as Ethernet.
You argue that the performance of your DAC is compromised in some way by handling Wi-Fi packets.
Wi-Fi transmission is not continuous; it is packetised. The capacity of your network vastly exceeds the requirements of playing a CD.
So it follows that your sound should only be degraded while a packet is being received.
Can you tell me how frequent the packets are? One a minute? One a second? How long is the performance of your system degraded while a packet is being processed? A microsecond? Half a second? A minute?
But surely all data is transmitted in packets, wired or wireless, so I'm not sure how that is relevant to the argument?
likesmusic posted:
Can you tell me how frequent the packets are? One a minute? One a second? How long is the performance of your system degraded while a packet is being processed? A microsecond? Half a second? A minute?
I haven't got a Wi-Fi trace to hand, but looking at a wired trace I created earlier, I can see on the wire my Netgear NAS sending 1514-byte frames to the Naim-AsiaPacific Ethernet interface at varying spacings of less than 0.0005 seconds, unless the flow is suspended because the Naim streamer's TCP window buffer is full. As the Naim streamer's receive window is acknowledged, this data is spooled off and passed up to the application in the streamer's processor.
With Wi-Fi and internet streaming the frame rate is slower and more variable in timing: you see timing variation of more than 0.0005 seconds, and generally you don't see the flow suspended because the receive buffer is full; you see more of a dynamic interchange of data and acknowledgements. With Wi-Fi there is also a significant amount of protocol overhead, so the amount of data required for a given quantity of samples is a lot higher than for Ethernet.
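Anyone wanting to reproduce this kind of inter-arrival analysis can do so from a packet capture. A minimal sketch using the scapy library (assuming scapy is installed; the capture filename "trace.pcap" and the NAS address 192.168.1.10 are placeholders for your own setup):

```python
# Minimal sketch: inter-arrival times of frames from a NAS in a capture.
# Filename and IP address below are placeholders, not values from the thread.
from scapy.all import rdpcap, IP, TCP

packets = rdpcap("trace.pcap")
times = [float(p.time) for p in packets
         if IP in p and TCP in p and p[IP].src == "192.168.1.10"]

gaps = [b - a for a, b in zip(times, times[1:])]
if gaps:
    print(f"frames from NAS: {len(times)}")
    print(f"min/mean/max gap: {min(gaps):.6f} / "
          f"{sum(gaps) / len(gaps):.6f} / {max(gaps):.6f} s")
```

Comparing the gap distribution of a wired capture against a Wi-Fi one would show the timing variability Simon describes, though it says nothing by itself about audibility.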
Theoretically, streaming 44.1 kHz 16-bit will result in 4.234 × 10^7 bits per minute, which is approx 60 packets per min. In practice, transmitting 44.1 kHz 16-bit via Wi-Fi will result in 4 or 5 times that number.
However, it is not the bandwidth that degrades the music performance; it is the packet drops, retransmissions, recoveries and timeouts that cause it.
banzai posted: Theoretically, streaming 44.1 kHz 16-bit will result in 4.234 × 10^7 bits per minute, which is approx 60 packets per min. In practice, transmitting 44.1 kHz 16-bit via Wi-Fi will result in 4 or 5 times that number.
However, it is not the bandwidth that degrades the music performance; it is the packet drops, retransmissions, recoveries and timeouts that cause it.
Hi banzai - not sure what 'packets' you are referring to... anyway, if we are thinking of TCP packets on an Ethernet network for stereo CD PCM, it goes something like this:
CD PCM stereo sample data: 16 × 2 × 44,100 = 1,411,200 bits/second = 176,400 bytes/second
Now, assuming we are in mid flow of sample data, the payload carried by the TCP segment is determined by the frame size minus the Ethernet, IP and TCP header overheads. On our LANs the typical Ethernet frame size is 1514 bytes; remove the headers (14 bytes Ethernet, 20 bytes IP, 20 bytes TCP) and that leaves us 1460 bytes of payload in the TCP packet.
So 176,400 divided by 1460 = approx 121 Ethernet frames per second, or one Ethernet frame every 0.0083 seconds assuming a linear, consistent flow, which of course it isn't: the frames burst in as fast as the sender can provide them, and the Naim streamer, when full, effectively tells the sender to wait. So the data travels in bursts of frames on the Ethernet wire, and all you can say is that frames arrive less than 0.0083 seconds apart, unless the flow is paused.
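This arithmetic is easy to check in a few lines; a sketch using the standard Ethernet/IPv4/TCP header sizes:

```python
# Back-of-envelope check of the CD-over-TCP figures above.
SAMPLE_RATE = 44_100   # Hz, Red Book CD
BITS = 16              # bits per sample
CHANNELS = 2           # stereo

bits_per_sec = SAMPLE_RATE * BITS * CHANNELS  # 1,411,200 bit/s
bytes_per_sec = bits_per_sec // 8             # 176,400 B/s

FRAME = 1514               # typical Ethernet frame on the wire
HEADERS = 14 + 20 + 20     # Ethernet + IPv4 + TCP headers
payload = FRAME - HEADERS  # 1460 B of audio per full frame

frames_per_sec = bytes_per_sec / payload      # ~120.8
print(bits_per_sec, bytes_per_sec, payload)
print(f"{frames_per_sec:.1f} frames/s, "
      f"one every {1000 / frames_per_sec:.2f} ms")
```

which reproduces the ~121 frames per second and ~8.3 ms spacing quoted above.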
As you say, the overhead of Wi-Fi is significantly higher than Ethernet's, and there is greater flow control due to slower throughput, collision-avoidance algorithms and an increased likelihood of TCP segment resends.
Actually, there is a typo: I really meant ~60 packets per second (a packet being an Ethernet frame, which is ~1500 bytes, or more precisely 1460 bytes of user payload).
My calculation is 44.1 kHz × 16 bits = 4.234 × 10^7 bits per minute = ~88,208 bytes per second = ~60 frames per second.
And I see that you use 44.1 kHz × 16 bits × 2 for PCM, which doubles my result. But please excuse my ignorance: why 44.1 kHz × 16 bits × 2?
banzai posted: Theoretically, streaming 44.1 kHz 16-bit will result in 4.234 × 10^7 bits per minute, which is approx 60 packets per min. In practice, transmitting 44.1 kHz 16-bit via Wi-Fi will result in 4 or 5 times that number.
However, it is not the bandwidth that degrades the music performance; it is the packet drops, retransmissions, recoveries and timeouts that cause it.
How do you know? Have you established that with any kind of controlled, blind listening test? Who says a packet drop necessarily causes a loss of SQ? It doesn't have to. It surely isn't supposed to. Why is it allowed to in some pieces of (expensive) hi-fi?
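As an aside on what a controlled test would establish: if listeners were merely guessing, a 20-out-of-20 result like the OP reports would be vanishingly unlikely, which is exactly why blind trials are informative. The standard binomial arithmetic:

```python
# Probability of 20 correct out of 20 forced-choice trials by pure guessing.
from math import comb

n, k, p = 20, 20, 0.5
print(f"P = {comb(n, k) * p**k * (1 - p)**(n - k):.2e}")  # about 9.5e-07
```

So a genuine 20/20 under blind conditions would settle the audibility question; the open issue in this thread is whether the OP's test was blind.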
banzai posted: Actually, there is a typo: I really meant ~60 packets per second (a packet being an Ethernet frame, which is ~1500 bytes, or more precisely 1460 bytes of user payload).
My calculation is 44.1 kHz × 16 bits = 4.234 × 10^7 bits per minute = ~88,208 bytes per second = ~60 frames per second.
And I see that you use 44.1 kHz × 16 bits × 2 for PCM, which doubles my result. But please excuse my ignorance: why 44.1 kHz × 16 bits × 2?
Stereo sampling, i.e. 2 channels sampled at 16 bits at 44.1 kHz, gives a data rate of 1411 kbps for the Red Book standard.
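In other words, the factor of two is simply the second channel; a quick check:

```python
# Mono vs stereo Red Book data rates, in kbit/s.
mono = 44_100 * 16 / 1000  # 705.6 kbps for one channel
print(mono, mono * 2)      # stereo doubles it: 1411.2 kbps
```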
I see, my bad, must be Sunday morning
likesmusic posted: Why is it allowed to in some pieces of (expensive) hi-fi?
Please talk to Naim/Linn, sir. We are still some way from perfection!!!
likesmusic posted: How do you know? Have you established that with any kind of controlled, blind listening test? Who says a packet drop necessarily causes a loss of SQ? It doesn't have to. It surely isn't supposed to. Why is it allowed to in some pieces of (expensive) hi-fi?
Likesmusic, I have, as I have posted previously on this forum, undertaken controlled and measured evaluations, and I do notice subtle SQ changes based on TCP segment flow patterns and timing... not dissimilar to the differences I hear between WAV and FLAC. In short, I say this accounts for why many people hear differences between Tidal FLAC and locally streamed FLAC, and why Tidal SQ can vary. Tidal, by its nature of being carried over the internet, has a more dynamic set of TCP conditions to deal with than local streaming; if you like, the TCP state machine is working harder, or at least differently. The same happens with Wi-Fi.
Naim are entirely aware of this, which is why it takes a long time validating firmware and its impact on SQ when the TCP/IP stack is changed.
But yes, theoretically there should be no difference; in the real world, however, system coupling means there are often interactions within closed systems.
Simon
likesmusic posted: banzai posted: Theoretically, streaming 44.1 kHz 16-bit will result in 4.234 × 10^7 bits per minute, which is approx 60 packets per min. In practice, transmitting 44.1 kHz 16-bit via Wi-Fi will result in 4 or 5 times that number.
However, it is not the bandwidth that degrades the music performance; it is the packet drops, retransmissions, recoveries and timeouts that cause it.
How do you know? Have you established that with any kind of controlled, blind listening test? Who says a packet drop necessarily causes a loss of SQ? It doesn't have to. It surely isn't supposed to. Why is it allowed to in some pieces of (expensive) hi-fi?
Hi Likesmusic,
I don't think we can prove that the difference in SQ is (only) caused by packet drops, retransmissions, recoveries and/or timeouts. We are all seeking an explanation for what causes the SQ differences, with the little knowledge that we have. But please let us not preach that if we (you) cannot prove the cause of an SQ difference, then there is no SQ difference. Our current knowledge of the hearing system and of processing in the brain is too limited for that.
We cannot deny that all major audio companies do their tests and selections with listening sessions. This is not without reason: we cannot measure it, and/or we don't know what to measure.
Peter
Simon-in-Suffolk posted: likesmusic posted: How do you know? Have you established that with any kind of controlled, blind listening test? Who says a packet drop necessarily causes a loss of SQ? It doesn't have to. It surely isn't supposed to. Why is it allowed to in some pieces of (expensive) hi-fi?
Likesmusic, I have, as I have posted previously on this forum, undertaken controlled and measured evaluations, and I do notice subtle SQ changes based on TCP segment flow patterns and timing... not dissimilar to the differences I hear between WAV and FLAC. In short, I say this accounts for why many people hear differences between Tidal FLAC and locally streamed FLAC, and why Tidal SQ can vary. Tidal, by its nature of being carried over the internet, has a more dynamic set of TCP conditions to deal with than local streaming; if you like, the TCP state machine is working harder, or at least differently. The same happens with Wi-Fi.
Naim are entirely aware of this, which is why it takes a long time validating firmware and its impact on SQ when the TCP/IP stack is changed.
But yes, theoretically there should be no difference; in the real world, however, system coupling means there are often interactions within closed systems.
Simon
Can a big increase in RAM help? I suspect it does.
The OP states that all his test subjects (20 out of 20) clearly preferred Ethernet, yet Simon says the differences he has measured are only SUBTLE.
Can we assume something is wrong in the OP's setup?
I do not think Simon said "subtle". Any quote from him?
Yes, used "subtle" in my post quoted above, and in the grand scheme of things network data dynamics do lead to subtle changes(like more naturalness, sound have better space, music seems breath/flow better) , perhaps like comparing playing wav and FLAC... So worthwhile changes, but overall character of performance not overly changed, but clearly subtle change/difference is a relative term
Yes, all is relative of course but if these differences are as small as flac vs wav I seriously doubt that ALL test persons could easily discern Wifi from Ethernet ?
And that is possibly correct. To me wav vs flac is quite apparent... but I am sensitised to it
Thanks to everybody,
I started this thread to get new ideas that could explain why Wi-Fi degrades the audio signal,
and there were a lot of interesting ones, mainly:
1) The resources (electrical, processing) needed to stream an audio signal over Ethernet or Wi-Fi are different
2) The "translation" of the digital signal is different when it is carried over a copper wire instead of an electromagnetic wave
3) The ordering and timing consistency of the TCP frames differ too
For those who say it SHOULD not make a difference, I totally agree,
but even though I have immense respect for maths, I will always trust my ears over an equation :-)