Why does Wi-Fi degrade the audio signal? A physical explanation

Hi everybody,

I'm appealing to this forum to find an explanation for this simple fact:

If I run a simple test with A (Wi-Fi) and B (wired Ethernet), both playing without interruptions, 20 people out of 20 will think that B (Ethernet) sounds better... and they are right. But why?

If I send a copy of a Word document over Wi-Fi or Ethernet, I expect the copies to be identical, and they usually are ;-)

I have a small but much-loved system composed of a UnitiQute and a pair of KEF LS50s, driven by computer-based Foobar2000.

I also have a degree in physics and work as a computer programmer.

If any techies stumble upon this thread, please speak your mind.

I will start with a possible explanation:

Since power amps are sensitive to power-supply modulation, possibly the Wi-Fi module draws more electrical power than the Ethernet card.

If you have any ideas, please respond; if you know the answer, please do.

As a newbie on this forum I can't expect much... but thanks anyway.

 


Hi,

I have regular Chromecasts plus Chromecast Audios in several places in my house. One of the Chromecasts is connected to wired Ethernet. The major benefit of this is not sound quality but that it is much snappier (audio starts about five times quicker), and during rush hour (kids watching movies / playing computer games) it doesn't have any dropouts.

Since I use the Chromecast Audio purely as a transport, over an optical cable, I don't notice any sound quality difference: the amp has a DAC and does the work. The only exception is busy traffic on the network...

But this can be completely different if you compare a UQ wired versus non-wired. It depends on the implementation of all the network protocols, and mainly on buffering (I can't comment on the effects of the antenna on the audio device; I'm a simple programmer, not an expert in radios).

The protocol is robust, so it shouldn't matter what mechanism is used to send the data – as you say, you can send a document etc.

Anyway, assuming we have a perfect Wi-Fi connection – no dropouts, no congestion or interference – it should be the same as wired Ethernet, but...

1 – The Wi-Fi card does use power, so you're giving the Uniti's power supply an easier time with it disabled.

 2 - It’s better practice from EMI considerations not to have wireless transmitter operating near sensitive audio stages.

3 – The differences between the two transport media come down to what happens at the digital-to-analogue conversion point. Noise can easily couple where you don't want it (supply lines to clock stages, reference voltages on the DAC chip(s), etc.). In a reasonably complex mixed-signal device like a network player (high-speed digital, low-noise analogue) you're going to get a small amount of coupling (conducted and radiated) between the various stages, and depending on what is going on upstream of the DAC stage, this can have an effect downstream too.

 It’s a common finding (quite a few threads on here). I’m sure Simon will pop up as he’s done quite a bit of research into some of this with some interesting results.

Out of interest, which do you find better?

James

 

Thanks for your reply, James.

As I said, the sound is so much better with Ethernet that it's no contest.

I tested it with a bunch of different people at home, some audiophiles and some without any knowledge, and they ALL chose Ethernet as the better sound.

I can't ignore the question of whether Naim has implemented something wrongly. I could – apart from loading times of music tracks – accept a quality difference of about 10%, but if there is a more significant deviation between wired and non-wired, there must either be a design flaw, or Naim should have left the wireless option out.

Could it be that Naim limits the data transfer over wireless, so you get a lower bit rate / lower-quality music?

 

@DJEF, I'm missing the information about your source.

Naim haven't implemented it wrongly. Every design is a compromise in some way to meet performance at certain price points. Naim have chosen to ensure their network players work at their best with a wired connection. Wireless gives flexibility to those that want it but with the possibility of reduced SQ. 

 

There is and has been much debate on the internet about 0s and 1s, bit-perfectness and audibility. I think the question raised in this thread is in the same league. There are no 0s and 1s in the real world; they are just numbers used in modelling and information theory. Every (energy) signal is an electromagnetic wave that interacts with its surroundings (the transmission line; also a model, by the way). We know from the best measuring system that exists in nature (our brain and hearing) that (almost) everything in the signal path (energy path) is audible, and the signal should be treated with great care (read: design and manufacture of equipment based on our limited knowledge of our hearing system and signal physics).

For example, in the case of Wi-Fi vs. Ethernet, exactly the same signal (according to the information model and "the copied Word document") will be provided to the DAC. However, the electromagnetic (energy) waves will differ a little from each other, so the signal output of the DAC will differ a little, and our hearing system has no problem detecting this difference.

It's again our brain that tells us we prefer the sound of wired over Wi-Fi, and analysing the data from measurement equipment (spectrum analysers, distortion measurements, scopes, etc.) will give no clear clue as to why this is the case.

This makes audio so very interesting because we all have our own hearing system and opinion on what sound we like best.  

 

 

True, but getting back to the OP, I think there is more going on here than just a Wi-Fi / Ethernet difference. There are audible differences between Wi-Fi and Ethernet, but if you have 20 out of 20 people clearly hearing the difference (this would never happen in my family / group of friends), I would distrust the test and suspect there are other problems as well. I remember a test being done a few weeks ago where it turned out, in the end, that a lossy file had been used.

So I would go back to the source first. What data is used for this test? And next, is the wireless link able to handle it? (I could imagine, for example, that a lossless WAV file consumes too much of the wireless capacity and therefore causes problems in the wireless transfer.)

In all the tests / setups I've done, wireless always came close to wired – nevertheless, I strongly prefer wired, for all the obvious reasons.

Having just tried a few network cables between a small switch and my microRendu, and then tried different USB cables, I have reluctantly acknowledged that there are real and repeatable differences. One person I read posited that audio signals, as opposed to a document file, are very sensitive, and that the quality of the cables / transport medium causes different modules / systems to kick in within, for instance, a DAC. The influence of the extra processing is audible to the user.

I suspect that our ears are simply far more sensitive than the electronic equipment we use to try and identify issues; that is assuming we know what we are trying to measure.

M

Hi M,

I fully agree. We know very little about how our hearing system processes acoustic signals into a 3D image, so how should we know what to measure?

I think the only problem is that we have trouble trusting our own hearing and memory, and we try to explain everything by what we have learned and know (rational thinking: "you cannot hear that").

The detection systems of living creatures are amazing. Picture this: you're sitting in the garden enjoying your breakfast, and out of the sky comes a fly heading straight for your slices of ham. The odour threshold of the fly must be unbelievably low – or is it something else that is guiding it?

One thing is for sure: our current drones are far from capable of doing this, but these living creatures' detection systems are highly sophisticated and dedicated so that they can survive.

Hi Bowers,

Natural selection has equipped us with some wonderful abilities.

Above, I avoided mentioning timing. Both the Naim DAC and my current Bel Canto re-clock signals internally... so why does the Mutec I own have such a positive influence on what I hear? 'All' it is doing is 'aggressively' re-clocking the signal.

There are both analogue and digital artifacts and influences at work, so I feel all I can do is use my ears and try not to be caught up by some expensive foo. I objected to the cost of the AQ Cinnamon cable – there must be a very healthy profit margin. That said, I do not want to spend hours making a variety of network cables to replicate their work.

M

 

Hi DJEF, >>>> Possibly the Wi-Fi module draws more electrical power than the Ethernet card <<<< – that is certainly one probable contributor, but Wi-Fi has its advantages as well. Another reason is that the frame flow with Wi-Fi is more haphazard than on a typical wired connection, so the timing between frames will potentially vary more, producing a richer range of intermodulation frequencies in the connected system and thereby potentially reducing the SQ... This is similar to web streaming over a wired connection on a distant or congested internet link... Thirdly, there is more processing of the data in the Wi-Fi protocol stack, so there is likely to be more electromagnetic interference from the Wi-Fi board and logic. Although usually screened, there is some leakage.
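To illustrate the 'more varying timing' point, here is a toy Python sketch. The distributions and figures are pure assumptions, chosen to show the shape of the effect rather than to measure any real link:

import random
import statistics

# Toy model: synthesise inter-frame arrival gaps for a "wired-like" and a
# "wifi-like" link. All figures below are illustrative assumptions.
random.seed(1)

NOMINAL_GAP = 0.0083  # seconds between frames for steady CD-rate delivery

# Wired: small, fairly constant jitter around the nominal gap.
wired_gaps = [NOMINAL_GAP + random.gauss(0, 0.0002) for _ in range(1000)]

# Wi-Fi: mostly similar, but with occasional long gaps modelling contention,
# collision avoidance and retransmission.
wifi_gaps = []
for _ in range(1000):
    if random.random() < 0.05:  # assume ~5% of frames are delayed
        wifi_gaps.append(NOMINAL_GAP + random.uniform(0.002, 0.02))
    else:
        wifi_gaps.append(NOMINAL_GAP + random.gauss(0, 0.001))

for name, gaps in (("wired", wired_gaps), ("wifi", wifi_gaps)):
    print(f"{name}: mean gap {statistics.mean(gaps) * 1000:.3f} ms, "
          f"std dev {statistics.stdev(gaps) * 1000:.3f} ms")

The much wider spread of gap times on the 'wifi-like' link is the richer timing variation described above; whether and how it couples into the analogue stages is the open question.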

So there you have my views.

Simon

Hi M,

Imagine an experiment with two Mutecs in series. Because the second Mutec has to recondition 'less', the output should be even 'cleaner'?! I don't think this will be measurable, but I am convinced that a difference will be audible.

Five years ago my audio dealer asked if I would like to try an AQ Ethernet cable for my streamer (a Logitech Touch at the time). I thought he was joking, but I tried it; it convinced me and I kept it.

We cannot try every piece of equipment; thanks to this and other forums we can share our experiences, although we might disagree on what sounds best.

Peter

OK, so far we have as culprits (thanks, James N):

1) Wi-Fi uses more power than Ethernet, which leaves less power headroom for the amplification stage

    (since the tests are based on my UnitiQute, which has one power source for everything: streaming, decoding, amplifying)

2) Wi-Fi uses electromagnetic fields to send data, and audio devices are known to be sensitive to electromagnetic induction, which could degrade the signal

3) The D/A converter used for Wi-Fi is different from the one used for Ethernet

For those asking about my source:

I have a Windows 10 computer which runs Foobar2000 with foo_out_upnp and sends the raw signal to my UnitiQute.

The file sent over Ethernet will typically be a 24-bit, 48 kHz FLAC.

For those having doubts about my way of conducting a differentiation field test: you are right.

It was not scientifically conducted, but I stand by my point, which is that the audio quality of a musical piece is better when using Ethernet as the transport than Wi-Fi.

 

Djef posted:


3) The D/A converter used for Wi-Fi is different from the one used for Ethernet

 

The D/A converter is not different. My point (possibly not well explained in my original post) was that the Digital to Analogue conversion is the most sensitive part of the playback chain in the player. This can be affected by various factors (supply lines to clock stages, DAC reference voltages etc) that can be influenced by other things going on in other stages within the network player. With everything in one box it's difficult to prevent coupling between different sections.

See Simon's post for more detail on this.

After a talk with a friend, I understood that I should not have used 'D/A' to describe the coupling between the bits and the physical world; sorry about that. I will keep that term for the stage where the bits are turned into an audio signal. Our world is an analog one, meaning that in order to transfer digital information we need to 'translate' it into an analog signal: with a copper wire as the medium it is an electrical signal; with Wi-Fi it is an electromagnetic wave. We can call those devices couplers. Since the signal can take two different paths, we can expect it to be of different quality. That was my point.

If you had a printer that printed different colours when it was sent the same file via Ethernet or Wi-Fi, what conclusions would you come to? That Ethernet and Wi-Fi "looked different"? That you had a rubbish printer? That you had a rubbish way of sending over either Ethernet or Wi-Fi?

Djef posted:

After a talk with a friend, I understood that I should not have used 'D/A' to describe the coupling between the bits and the physical world; sorry about that. I will keep that term for the stage where the bits are turned into an audio signal. Our world is an analog one, meaning that in order to transfer digital information we need to 'translate' it into an analog signal: with a copper wire as the medium it is an electrical signal; with Wi-Fi it is an electromagnetic wave. We can call those devices couplers. Since the signal can take two different paths, we can expect it to be of different quality. That was my point.

Yes, but your starting point is to assume that the transport mechanism (wired or wireless) is robust and that the data arriving at the buffer by either method is error-free and effectively the same data.

To use the usual 'I work in IT' response: if it were a document file being transferred, it would be the same via either connection.

likesmusic posted:

If you had a printer that printed different colours when it was sent the same file via Ethernet or Wi-Fi, what conclusions would you come to? That Ethernet and Wi-Fi "looked different"? That you had a rubbish printer? That you had a rubbish way of sending over either Ethernet or Wi-Fi?

Hi likesmusic,

Sorry to say I don't get your message; please explain...

Peter

 

I think what likesmusic meant is that exactly the same info is delivered to the DAC regardless of whether it is wireless or wired. However, he misses a crucial point: the timing between data frames over a wireless transmission is not as consistent as over a wired one, and that could cause some degradation in music reproduction.

banzai posted:

I think what likesmusic meant is that exactly the same info is delivered to the DAC regardless of whether it is wireless or wired. However, he misses a crucial point: the timing between data frames over a wireless transmission is not as consistent as over a wired one, and that could cause some degradation in music reproduction.

Sure, timing between frames "could" cause some degradation in the colours of an image sent to a printer, or in the sound of music sent to a DAC. But should it? Is it necessarily so? How exactly do these timing changes do that? Wi-Fi and Ethernet both send packets at inconsistent intervals, and both have vastly more capacity than is needed for music. Surely they should produce indistinguishable results. And if they don't, someone has got something vastly wrong. How do they send pictures all those thousands of miles from the space station? And then round the world? Do you think that a scientist in Oxford gets a different-coloured picture from a scientist in Houston? Is Mars maybe not really red because someone used Wi-Fi to print the picture?

And if your argument is that Wi-Fi frames are inconsistent and this causes an audible difference, then it follows that every time you listen to a piece of music via Wi-Fi it will sound different. Does it? If it does, I would just regard my hi-fi as broken.

 

Likesmusic makes a good argument. Let me expand on my post above a little bit.

Some of us accept that WAV sounds better than FLAC, because the Naim box has less work to do with WAV than with FLAC. Therefore some of us who store in FLAC get our NAS to convert to WAV before sending it to the Naim box.

I have a wireless link that goes to a box which converts the Wi-Fi to Ethernet and then sends it to the Naim box. I believe that the management of the Wi-Fi connection is what causes any sound difference. Ethernet sounds better than Wi-Fi because the Naim box has less work to do.

The 'less work' argument equally applies, in my opinion and experience, to interframe timing and to the rate and style of frame transfer to the streamer. We also see that the higher-level TCP data flow control, as managed by the streamer, varies depending on the rate and speed of the data transfer from the source, such as a NAS media server or a web streaming server. This affects the 'work' the streamer's network stack undertakes. For me these aspects can have as pronounced an effect on SQ as using different Ethernet patch leads.
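To make the flow-control part concrete, here is a toy Python sketch (assumptions: localhost, an arbitrary free port, deliberately tiny buffers; this is not Naim's code). The receiver drains its socket more slowly than the sender supplies data, so the advertised receive window fills and sendall() blocks: the receiver's TCP state machine, not the sender, ends up pacing the transfer, and the data moves in bursts separated by pauses:

import socket
import threading
import time

HOST, PORT = "127.0.0.1", 50007   # assumed free port for the demo
CHUNK = 1460                      # typical TCP payload of a 1514-byte frame

def slow_receiver():
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 8192)  # small window
    srv.bind((HOST, PORT))
    srv.listen(1)
    conn, _ = srv.accept()
    while conn.recv(CHUNK):       # drain roughly one frame's payload...
        time.sleep(0.01)          # ...every 10 ms, slower than the sender
    conn.close()

threading.Thread(target=slow_receiver, daemon=True).start()
time.sleep(0.2)                   # give the receiver time to start listening

snd = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
snd.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, 8192)  # small send buffer
snd.connect((HOST, PORT))
start = time.time()
for i in range(200):
    snd.sendall(b"\x00" * CHUNK)  # blocks once both buffers are full
    if i % 50 == 0:
        print(f"frame {i:3d} sent at t = {time.time() - start:.2f} s")
snd.close()

(The OS may round or ignore the requested buffer sizes; the qualitative behaviour is what matters.)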

The point about different ADCs/DACs between wireless and copper connections is true and an interesting one, but these are specialised network-interfacing/Wi-Fi ADCs encapsulated in specific network-related chipsets. I am pretty doubtful that these feature in any differences in SQ in terms of reconstructed audio and perturbations in the connected systems.

Simon-in-Suffolk posted:

The 'less work' argument equally applies, in my opinion and experience, to interframe timing and to the rate and style of frame transfer to the streamer. We also see that the higher-level TCP data flow control, as managed by the streamer, varies depending on the rate and speed of the data transfer from the source, such as a NAS media server or a web streaming server. This affects the 'work' the streamer's network stack undertakes. For me these aspects can have as pronounced an effect on SQ as using different Ethernet patch leads.

Interesting Simon. I wonder if that's one of the reasons the Sonore MicroRendu seems to work so well. A low power device, optimised to do a specific task as simply as possible. Looks like there is still plenty of scope for optimisation of the path from data storage device to DAC. 

likesmusic posted:
banzai posted:

I think what likesmusic meant is that exactly the same info is delivered to the DAC regardless of whether it is wireless or wired. However, he misses a crucial point: the timing between data frames over a wireless transmission is not as consistent as over a wired one, and that could cause some degradation in music reproduction.

Sure, timing between frames "could" cause some degradation in the colours of an image sent to a printer, or in the sound of music sent to a DAC. But should it? Is it necessarily so? How exactly do these timing changes do that? Wi-Fi and Ethernet both send packets at inconsistent intervals, and both have vastly more capacity than is needed for music. Surely they should produce indistinguishable results. And if they don't, someone has got something vastly wrong. How do they send pictures all those thousands of miles from the space station? And then round the world? Do you think that a scientist in Oxford gets a different-coloured picture from a scientist in Houston? Is Mars maybe not really red because someone used Wi-Fi to print the picture?

And if your argument is that Wi-Fi frames are inconsistent and this causes an audible difference, then it follows that every time you listen to a piece of music via Wi-Fi it will sound different. Does it? If it does, I would just regard my hi-fi as broken.

 

Hi Likesmusic,

Your statement is that we can all see the same printed picture. I fully agree with that. But this picture is flat, with no natural colours and no 3D, and it gives us only limited information about the real thing, although we might recognise it.

But IMO our goal is to see the real thing in 3D.

As soon as two detection units (ears/eyes) are involved, it becomes very complicated and delicate. Our brain processes the signals from these two detectors. You cannot hear a 3D image with one ear, see a 3D image with one eye, or reproduce a 3D image with one loudspeaker.

Our brain has tremendous abilities in processing very, very small signal differences. You just have to listen to and trust your own brain – the thing that made us humans survive in nature. Do not underestimate it just because we think we can explain it all with signal and information theory. Your brain cannot be fooled; better to stop trying.

likesmusic posted:
banzai posted:

I think what likesmusic meant is that exactly the same info is delivered to the DAC regardless of whether it is wireless or wired. However, he misses a crucial point: the timing between data frames over a wireless transmission is not as consistent as over a wired one, and that could cause some degradation in music reproduction.

Sure, timing between frames "could" cause some degradation in the colours of an image sent to a printer, or in the sound of music sent to a DAC. But should it? Is it necessarily so? How exactly do these timing changes do that? Wi-Fi and Ethernet both send packets at inconsistent intervals, and both have vastly more capacity than is needed for music. Surely they should produce indistinguishable results. And if they don't, someone has got something vastly wrong. How do they send pictures all those thousands of miles from the space station? And then round the world? Do you think that a scientist in Oxford gets a different-coloured picture from a scientist in Houston? Is Mars maybe not really red because someone used Wi-Fi to print the picture?

And if your argument is that Wi-Fi frames are inconsistent and this causes an audible difference, then it follows that every time you listen to a piece of music via Wi-Fi it will sound different. Does it? If it does, I would just regard my hi-fi as broken.

 

The analogy between a printer rendering an image and a DAC reproducing music is very interesting, bearing in mind that there are three fundamental technical differences:

1) Buffering: a printer only prints once it has received the full image, or at least the start and end of a printed page, whereas a DAC plays music on the fly (see the sketch after this list).
2) The haphazard nature of a Wi-Fi transmission can cause more packet drops, retransmissions and timeouts, thus causing some degradation in the music reproduction.
3) It can sound different each time you listen to the same piece of music via Wi-Fi, especially in the evening, when there is a lot of network activity.
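A toy Python sketch of point 1 (all figures are assumptions; this models the idea, not any real streamer): packets arrive with jitter, playback drains a small buffer at a fixed rate, and whenever the buffer is empty at playout time we count an underrun, i.e. a potential audible dropout:

import random
from collections import deque

random.seed(7)

PACKET_PERIOD = 0.0083   # ~8.3 ms of CD audio per frame (derived later in the thread)
PREBUFFER = 4            # packets buffered before playback starts (assumed)

def simulate(jitter):
    """Return the underrun count for a worst-case arrival jitter in seconds."""
    arrivals = []
    t = 0.0
    for _ in range(2000):
        t += PACKET_PERIOD
        arrivals.append(t + random.uniform(0, jitter))  # delayed delivery

    buffer = deque()
    underruns = 0
    i = 0
    play_t = arrivals[PREBUFFER - 1]     # start once the prebuffer is filled
    for _ in range(2000 - PREBUFFER):
        # TCP delivers in order, so packet i+1 is unusable until i arrives.
        while i < len(arrivals) and arrivals[i] <= play_t:
            buffer.append(i)
            i += 1
        if buffer:
            buffer.popleft()             # play one packet's worth of audio
        else:
            underruns += 1               # nothing to play: a dropout
        play_t += PACKET_PERIOD
    return underruns

for jitter in (0.001, 0.01, 0.05):
    print(f"jitter {jitter * 1000:4.0f} ms -> {simulate(jitter):4d} underruns")

With a few packets of prebuffer, millisecond-level jitter causes no underruns at all; dropouts only appear once the jitter approaches the depth of the buffer. Anything subtler than an outright dropout has to come from somewhere else (noise, the 'work' arguments above).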

OK, for all you people who have DACs or streamers that cannot make Wi-Fi sound the same as Ethernet:

You argue that the performance of your DAC is compromised in some way by handling Wi-Fi packets.

Wi-Fi transmission is not continuous; it is packetised. The capacity of your network vastly exceeds the requirements of playing a CD.

So it follows that your sound should only be degraded while a packet is being received.

Can you tell me how frequent the packets are? One a minute? One a second? And for how long is the performance of your system degraded while a packet is being processed? A microsecond? Half a second? A minute?

likesmusic posted:

 

Can you tell me how frequent the packets are? One a minute? One a second? And for how long is the performance of your system degraded while a packet is being processed? A microsecond? Half a second? A minute?

I haven't got a Wi-Fi trace to hand, but looking at a wired trace I created earlier, I can see on the wire my Netgear NAS sending 1514-byte frames to the Naim-AsiaPacific Ethernet interface at varying spacings of less than 0.0005 seconds, unless the flow is suspended because the Naim streamer's TCP window buffer is full. As the Naim streamer's receive window is acknowledged, this data is spooled off and passed up to the application in the streamer's processor.

With Wi-Fi and internet streaming the frame rate is slower and more variable in timing – you see more timing variation than 0.0005 seconds. Generally you don't see the flow suspended because the receive buffer is full; you see more of a dynamic interchange of acknowledging data. With Wi-Fi there is also a significant amount of overhead, so the amount of data required for a given number of samples is a lot higher than for Ethernet.
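For anyone who wants to check this on their own network, here is a sketch of how the inter-frame gaps could be measured. It assumes the scapy package is installed and that stream.pcap is a capture of the NAS-to-streamer flow; the file name and the full-size-frame filter are my assumptions, nothing Naim-specific:

import statistics

from scapy.all import TCP, rdpcap

packets = rdpcap("stream.pcap")   # assumed capture of the streaming flow

# Keep only the timestamps of full-size TCP data frames.
times = [float(p.time) for p in packets if TCP in p and len(p) >= 1460]

gaps = [b - a for a, b in zip(times, times[1:])]
print(f"{len(gaps)} gaps: median {statistics.median(gaps) * 1000:.3f} ms, "
      f"min {min(gaps) * 1000:.3f} ms, max {max(gaps) * 1000:.3f} ms")

Comparing the spread of gaps between a wired capture and a Wi-Fi capture of the same stream should show the extra timing variation directly.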

 

 

Theoretically, streaming 44.1 kHz 16-bit will result in 4.234 × 10^7 bits per min, which is approx 60 packets per min. In practice, transmitting 44.1 kHz 16-bit via Wi-Fi will result in 4 or 5 times that number.

However, it is not the bandwidth that degrades the music performance; it is the packet drops, retransmissions, recoveries and timeouts that cause it.

banzai posted:

Theoretically, streaming 44.1 kHz 16-bit will result in 4.234 × 10^7 bits per min, which is approx 60 packets per min. In practice, transmitting 44.1 kHz 16-bit via Wi-Fi will result in 4 or 5 times that number.

However, it is not the bandwidth that degrades the music performance; it is the packet drops, retransmissions, recoveries and timeouts that cause it.

Hi banzai – not sure what 'packets' you are referring to... anyway, if we are thinking of TCP packets on an Ethernet network for stereo CD PCM, it goes something like this:

CD PCM stereo sample data: 16 × 2 × 44,100 = 1,411,200 bits/second = 176,400 bytes/second

Now, assuming we are in mid-flow of sample data, the payload carried by the TCP segment is determined by the frame size minus the TCP, IP and Ethernet header overheads. On our LANs the typical Ethernet frame size is 1514 bytes; remove the headers and that leaves 1460 bytes of payload in the TCP packet.

So 176,400 divided by 1460 = approx 121 Ethernet frames per second, or one Ethernet frame every 0.0083 seconds, assuming a linear, consistent flow – which of course it isn't: the frames burst in as fast as the sender can provide them, and the Naim streamer, when full, effectively tells the sender to wait. So the data travels in bursts of frames on the Ethernet wire, and all you can say is that frames arrive less than 0.0083 seconds apart, unless the flow is paused.
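For anyone who wants to check it, a few lines of Python reproduce the arithmetic above:

# Redbook CD: 16-bit samples, 2 channels, 44,100 samples per second.
BITS, CHANNELS, RATE = 16, 2, 44_100
PAYLOAD = 1460                    # TCP payload of a 1514-byte Ethernet frame

bits_per_second = BITS * CHANNELS * RATE        # 1,411,200 bit/s
bytes_per_second = bits_per_second // 8         # 176,400 B/s
frames_per_second = bytes_per_second / PAYLOAD  # ~120.8 frames/s

print(f"{bits_per_second} bit/s = {bytes_per_second} B/s = "
      f"{frames_per_second:.1f} frames/s, i.e. one frame every "
      f"{1000 / frames_per_second:.2f} ms on average")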

As you say, the overhead of Wi-Fi is significantly higher than Ethernet's, and there is greater flow control due to slower throughput, collision-avoidance algorithms and an increased likelihood of TCP segment resends.

Actually, there is a typo: I really meant ~60 packets per second (a packet here is an Ethernet frame, which is ~1500 bytes, or more precisely 1460 bytes of user payload).

My calculation is 44.1 kHz × 16 bits = 4.234 × 10^7 bits per minute = ~88,208 bytes per second = ~60 frames per second.

And I see that you use 44.1 × 16 bits × 2 for PCM, which doubles my result. But please excuse my ignorance – why 44.1 kHz × 16 bits × 2?

banzai posted:

Theoretically, streaming 44.1 kHz 16-bit will result in 4.234 × 10^7 bits per min, which is approx 60 packets per min. In practice, transmitting 44.1 kHz 16-bit via Wi-Fi will result in 4 or 5 times that number.

However, it is not the bandwidth that degrades the music performance; it is the packet drops, retransmissions, recoveries and timeouts that cause it.

How do you know? Have you established that with any kind of controlled, blind listening test? Who says a packet drop necessarily causes a loss of SQ? It doesn't have to. It surely isn't supposed to. Why is it allowed to in some pieces of (expensive) hi-fi?

banzai posted:

Actually, there is a typo: I really meant ~60 packets per second (a packet here is an Ethernet frame, which is ~1500 bytes, or more precisely 1460 bytes of user payload).

My calculation is 44.1 kHz × 16 bits = 4.234 × 10^7 bits per minute = ~88,208 bytes per second = ~60 frames per second.

And I see that you use 44.1 × 16 bits × 2 for PCM, which doubles my result. But please excuse my ignorance – why 44.1 kHz × 16 bits × 2?

Stereo sampling, i.e. 2 channels sampled at 16 bits at 44.1 kHz, gives a data rate of 1411 kbps for the Redbook standard.
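A short Python check shows where the factor of two goes:

RATE, BITS, PAYLOAD = 44_100, 16, 1460

mono = RATE * BITS        #   705,600 bit/s: one channel, banzai's figure
stereo = RATE * BITS * 2  # 1,411,200 bit/s: Redbook stereo, Simon's figure

for name, bps in (("mono", mono), ("stereo", stereo)):
    print(f"{name}: {bps} bit/s = {bps // 8} B/s = "
          f"{bps / 8 / PAYLOAD:.1f} frames/s")

The mono figure reproduces the ~88,200 bytes and ~60 frames per second above (banzai's 88,208 comes from rounding 4.234 × 10^7 first); doubling for two channels gives the ~121 frames per second in Simon's calculation.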

likesmusic posted:

How do you know? Have you established that with any kind of controlled, blind listening test? Who says a packet drop necessarily causes a loss of SQ? It doesn't have to. It surely isn't supposed to. Why is it allowed to in some pieces of (expensive) hi-fi?

Likesmusic, I have, as I have posted previously on this forum, undertaken controlled and measured evaluations, and I do indeed notice subtle SQ changes based on TCP segment flow patterns and timing... not dissimilar to the differences I hear between WAV and FLAC. In short, I say this accounts for why many people hear differences between Tidal FLAC and locally streamed FLAC, and why Tidal SQ can vary. Tidal, by its nature of being carried over the internet, has a more dynamic set of TCP behaviours to deal with than local streaming... if you like, the TCP state machine is working harder, or at least differently. The same happens with Wi-Fi.

Naim are entirely aware of this, which is why it takes a long time validating firmware and its impact on SQ when the TCP/IP stack is changed. 

Yes, theoretically there should be no difference, but system coupling means there are often interactions within closed systems in the real world.

Simon

 

likesmusic posted:
banzai posted:

Theoretically, streaming 44.1 kHz 16-bit will result in 4.234 × 10^7 bits per min, which is approx 60 packets per min. In practice, transmitting 44.1 kHz 16-bit via Wi-Fi will result in 4 or 5 times that number.

However, it is not the bandwidth that degrades the music performance; it is the packet drops, retransmissions, recoveries and timeouts that cause it.

How do you know? Have you established that with any kind of controlled, blind listening test? Who says a packet drop necessarily causes a loss of SQ? It doesn't have to. It surely isn't supposed to. Why is it allowed to in some pieces of (expensive) hi-fi?

Hi Likesmusic,

I don't think we can prove that the difference in SQ is (only) caused by packet drops, retransmissions, recoveries and/or timeouts. We are all seeking an explanation for what causes the SQ differences, with the little knowledge that we have. But please let us not preach that if we (you) cannot prove the cause of an SQ difference, there is no SQ difference. Our current know-how of the hearing system and the brain's processing is too limited for that.

We cannot deny that all major audio companies do their tests and selections with listening sessions. This is not without reason: we cannot measure it, and/or we don't know what to measure.

Peter

 
