Help please with some simple or perhaps not so simple questions

Posted by: JSH on 03 October 2017

Can someone help me with the nomenclature of streaming and the impact of different components on the sound, and what may well be some idiot questions or perhaps several cans of worms?

The nomenclature first, which I hope are yes/nos:-

I store my music in FLAC files on a NAS; these are the Sources, right?

I retrieve the files through Bubble or the Naim app: these are the clients, yes?

I use Asset and Minim to read the files; these are Servers are they not?

They feed the files to my Qute2 which is the Renderer; right?

 

Now the questions.

1 People here generally seem to think that WAV files sound better than FLACs. Is that so?

2 If so, should I convert my source files to WAV using Asset or some such? or

3 should I let Asset or Minim do it on the fly?

4 If I do, is there likely to be any worthwhile audible difference between Asset doing this and Minim doing it through ffmpeg?

5 And if I do, to get the best sound, should I set Asset/Minim to stream even 16/44.1s at say 24/96 (presumably thereby adding a load of 0s around the place)?

6 On a related tack, should I similarly set any old MP3s to convert to WAV too, and at 16/44.1 or 24 bit?

 

Sorry if everyone but me understands this. Thanks for any help you can offer.

Posted on: 03 October 2017 by nbpf
JSH posted:

Can someone help me with the nomenclature of streaming and the impact of different components on the sound, and what may well be some idiot questions or perhaps several cans of worms?

The nomenclature first, which I hope are yes/nos:-

I store my music in FLAC files on a NAS; these are the Sources, right?

I retrieve the files through Bubble or the Naim app: these are the clients, yes?

I use Asset and Minim to read the files; these are Servers are they not?

They feed the files to my Qute2 which is the Renderer; right?

 ....

The notion of "source" is, to the best of my understanding, poorly defined. But (again, to the best of my understanding):

- Bubble and the Naim app (Linn Kazoo, Lumin, etc.) are not UPnP clients; they are UPnP control points.

- Asset and MinimServer are UPnP servers.

- The Qute2 is an integrated device consisting of a streaming module (UPnP client, renderer) and a DAC (digital-to-analog converter).

- Other examples of UPnP/DLNA/OpenHome renderers are upmpdcli and MediaPlayer.
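
If a small sketch helps to see who talks to whom, here is a toy Python illustration of that split; none of it is a real UPnP library and the URL is made up, it just shows that the control point only issues commands while the renderer is the one that actually fetches the data from the server.

# Hypothetical illustration only: the control point never touches the audio data;
# the renderer is the actual network client that pulls the stream from the server.

class Renderer:
    """Stands in for the streaming half of the Qute2."""

    def play_from(self, track_url):
        pcm = self.fetch(track_url)   # pull the data stream from Asset/MinimServer
        self.send_to_dac(pcm)         # the DAC half then turns it into analogue audio

    def fetch(self, track_url):
        return f"<audio data downloaded from {track_url}>"

    def send_to_dac(self, pcm):
        print("rendering:", pcm)


def control_point_play(renderer, track_url):
    """Stands in for Bubble or the Naim app: it only says *where* the track lives."""
    renderer.play_from(track_url)


# Made-up address for a track served by the UPnP server on the NAS:
control_point_play(Renderer(), "http://nas.local:9790/minimserver/track-42.flac")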

My understanding might be wrong, of course. Best, nbpf

Posted on: 03 October 2017 by nbpf
JSH posted:

...

Now the questions.

1 People here generally seem to think that WAV files sound better than FLACs. Is that so?

2 If so, should I convert my source files to WAV using Asset or some such? or

3 should I let Asset or Minim do it on the fly?

4 If I do, is there likely to be any worthwhile audible difference between Asset doing this and Minim doing it through ffmpeg?

5 And if I do, to get the best sound, should I set Asset/Minim to stream even 16/44.1s at say 24/96 (presumably thereby adding a load of 0s around the place)?

6 On a related tack, should I similarly set any old MP3s to convert to WAV too, and at 16/44.1 or 24 bit?

1. No

2. No

3. If you find that in your system .wav files sound better than .flac files, set MinimServer to transcode .flac to .wav; otherwise do nothing. (A rough sketch of the offline alternative, converting the files themselves, follows after this list.)

4. I do not know

5. No

6. No
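
For what it's worth, the "convert the source files" route in question 2 can also be done outside Asset/Minim entirely. Below is a rough Python sketch using ffmpeg (the same tool mentioned in question 4 for Minim's on-the-fly transcoding); the NAS path is made up, and you would want to try it on a copy of a few albums first.

# A rough sketch of the offline route (question 2), not a recommendation: walk a music
# folder and have ffmpeg rewrite each FLAC as WAV alongside the original.
# The NAS path is hypothetical; ffmpeg must be installed and on the PATH.
import subprocess
from pathlib import Path

MUSIC_ROOT = Path("/nas/music")  # hypothetical mount point of the NAS share

for flac_file in MUSIC_ROOT.rglob("*.flac"):
    wav_file = flac_file.with_suffix(".wav")
    if wav_file.exists():
        continue  # already converted on an earlier run
    # "ffmpeg -i in.flac out.wav" decodes the FLAC and writes plain uncompressed PCM;
    # for 16/44.1 material the samples are unchanged, only the container differs.
    # (For 24-bit sources you would add "-c:a", "pcm_s24le" to keep the full bit depth.)
    subprocess.run(["ffmpeg", "-i", str(flac_file), str(wav_file)], check=True)

That is the offline alternative only; letting MinimServer (or Asset) transcode on the fly, as in answer 3, leaves the FLAC files themselves untouched.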

Posted on: 03 October 2017 by hungryhalibut

1. Yes

2. No

3. Yes

4. No

5. No

6. No, because you can’t polish a turd  

 

Posted on: 03 October 2017 by Gianluigi Mazzorana

I got a brain hernia.

Posted on: 03 October 2017 by Huge
JSH posted:

Can someone help me with the nomenclature of streaming and the impact of different components on the sound, and what may well be some idiot questions or perhaps several cans of worms?


The nomenclature first, which I hope are yes/nos:-


I store my music in FLAC files on a NAS; these are the Sources, right?
Strictly speaking they are files containing the data to be streamed on request.


I retrieve the files through Bubble or the Naim app: these are the clients, yes?
These are DLNA (UPnP) "Control Points"; they control a DLNA device (in this case the UnitiQute 2).


I use Asset and Minim to read the files; these are Servers are they not?
Correct, they read the files as requested by the UQ2 and send the data stream to it across the network.  These servers are the source for the data stream.


They feed the files to my Qute2 which is the Renderer; right?
Technically it's a Media Player, as it also has its own control point.  When the UQ2 plays an album or a playlist, the IDs of all the files to be played are entered into the play queue in the UQ2.  It then works through its queue, asking the server to send it a stream of the data contained in each file in turn.


This is why you can turn off the device running the Naim app and the play queue (in the UQ2) keeps on playing: the UQ2 doesn't need the app to tell it which file is next as each track comes up.
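
If it helps to see that play-queue behaviour written out, here is a purely illustrative Python sketch; the class and function names are invented for the example and aren't any real UPnP API, it's just the shape of the hand-off.

# Purely illustrative: the names below are made up, not a real UPnP API. The point is
# that the queue lives in the renderer, so the control point can disappear after queuing.

class PlayQueue:
    """Held inside the UQ2 (the renderer), not inside the app (the control point)."""

    def __init__(self):
        self.track_ids = []

    def add(self, track_ids):
        self.track_ids.extend(track_ids)

    def next_track(self):
        return self.track_ids.pop(0) if self.track_ids else None


def control_point_queues_album(queue, album_track_ids):
    # The app only hands over the list of IDs; the audio itself never passes through it.
    queue.add(album_track_ids)


def renderer_plays(queue, fetch_stream):
    # The UQ2 works through its own queue, asking the server for each stream in turn;
    # it never needs the control point to tell it which track comes next.
    while True:
        track_id = queue.next_track()
        if track_id is None:
            break
        print("playing", track_id, "from", fetch_stream(track_id))


queue = PlayQueue()
control_point_queues_album(queue, ["track-1", "track-2", "track-3"])
# ...the phone or tablet running the app can be switched off here...
renderer_plays(queue, fetch_stream=lambda tid: "the server's stream for " + tid)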


<snip>

For the answers to the questions, HH is quite correct.

Posted on: 03 October 2017 by JSH

Many thanks to all. Much clearer now.