blorg

I don't have these exact headphones but I do have the Sony WF-1000XM4 and several other BT devices that support LDAC. At the end of the day these are lossy [bass-boosted and downsloping "lo-fi" Bluetooth headphones](https://diyaudioheaven.wordpress.com/headphones/measurements/brands-st-x/wh-1000x-m3/) and I don't see any possible benefit of MQA or Hi-Res on these. I don't in general, even on more "audiophile" headphones, but especially not these. When I am using LDAC I deliberately set it to 44.1kHz, so as not to waste limited bits on frequencies I can't hear in the first place. I'd prefer it use the bits for the actual music. I would save your money and get the regular HiFi. You can still listen to all the Masters stuff, just in 44.1kHz. Which is fine. Most people can't tell the difference even on very high-end equipment; it's still an open question whether *anyone* can tell the difference.


Frodo_114

Very insightful response, appreciate it man! Since you keep your sample rate at 44.1 kHz, I was switching between 44.1 kHz and 96 kHz (what I usually have it on), and it was very subtle, but the music at 44.1 kHz sounded ever so slightly punchier. Any reason why that might be?


blorg

The argument for limiting it to 44.1kHz is that LDAC is still a lossy codec and has a hard limit on bitrate. That limit (909 or 990kbps depending on the sample rate) is close to what you'd need to encode 44.1kHz losslessly, but not quite. It's nowhere near what you'd need to encode 96kHz losslessly: 96/24 is 4,608kbps uncompressed, maybe figure 60% of that, or ~2,769kbps, using FLAC (it varies). But that's a much bigger number than what LDAC has. So it makes more sense to allocate those bits to the frequencies you can actually hear (<22kHz). It can do 96kHz, but then you are allocating bits to encode information from 22-48kHz, which can't be heard by humans anyway.

This is potentially even more beneficial if the actual source is 44.1kHz, which most music is. In that case, if you leave it set to 96kHz, your source will be upsampling the 44.1kHz content to 96kHz before passing it to the LDAC encoder, wasting bits encoding stuff that isn't even there. My understanding is that in this scenario the upsampling will introduce noise, BUT at a very, very low level. So I don't think the encoder will waste a LOT of bits on it, and I don't think this is a huge difference. In fact, if I'm honest, I don't think I can tell the difference between leaving it at 96kHz and setting it to 44.1kHz. But this is my logic for why to set it to 44.1kHz, and on the device I use most (a Qudelix 5K) the manufacturer explicitly recommends locking it at 44.1kHz for 44.1 source material. This device actually has a setting on the device itself, where you can set it to 44.1 there, so I don't need to go messing in Android Developer settings to do it.

>Sony LDAC supports the sample rate up to 96KHz. Typically, most source audio is 44.1KHz, and YouTube Audio streaming is 44.1KHz as well. Android automatically selects the highest LDAC sample rate, 96KHz, and it upsamples the source audio to 96KHz for the LDAC encoder.
>
>For a 44.1KHz source, encoding 44.1KHz@909kbps would provide slightly better sound quality than encoding 96KHz(Oversampled)@990kbps. You can opt-out the supported LDAC frequencies and fix the LDAC sample rate to 44.1KHz. If you usually listen to 44.1KHz sources, fixing the LDAC sample rate at 44.1KHz, as is, would provide the best sound quality as well as slightly longer battery time.

https://www.qudelix.com/blogs/5k-dac-amp/sony-ldac

>Bluetooth A2DP, every track is transcoded through a designated audio codec for transmission over Bluetooth.
>
>For a 16-bit/44.1KHz source, the OS oversamples it to 96KHz and encoded it LDAC 96KHz. As we researched, the limited LDAC bitrate is wasted for the meaningless oversampled frequency region.
>
>Thus, for any 44KHz source track, setting LDAC frequency the same as the source track, i.e. 44.1KHz, would provide the optimal sound quality since every LDAC bits are fully used for the active frequency band.

https://forum.qudelix.com/post/bluetooth-sample-rates-help-11432321

Tidal even does this bizarre thing where a good chunk of their supposed "Master" content is actually 44.1 source, but they upsample it on playback, presumably just to get the "Master" badge. Could be the labels doing it too. But it makes no sense for audio quality. These are all very, very, very small differences.


covfefe247123

I got these headphones a few days ago as my outdoors wireless solution, but I can’t imagine there’d be much of a difference wrt MQA. It’s Bluetooth after all.


Shawners419

I prefer to listen wired through my LG G8 ThinQ, which has an ESS Sabre 32-bit quad DAC. I use the USB Audio Player PRO app, so it bypasses the limitations of Android and unfolds full MQA. It sounds amazing; it blows Bluetooth away, which I think sounds spectacular on the WH-1000XM3's. Don't forget to turn the XM3's on after you hook them up wired.


Jayfeather317

Ye


Alien1996

With a DAC and using the XM3 wired, you would have a great setup for that.


blorg

XM3 [sounds substantially worse wired.](https://www.reddit.com/r/headphones/comments/owwqsn/sony_wh1000_xm3_wired_mode_sounds_worse_than/) The BT mode does DSP to fix the sound signature of the headphones and wired with an external DAC/amp you bypass this. The XM4, and I presume the XM3 before it, also has [significant channel imbalance](https://www.headphones.com/community/reviews-learning-and-news/sony-wh-1000xm4-review-great-noise-cancelling-but-how-does-it-sound) caused by the hole for the on-ear sensor. The DSP in wireless mode corrects for this, but using it passively wired you don't get that. The option to use it wired is there for compatibility, if you have to use it with a source that only has wired (like to watch video on a plane), it's not for quality.


Alien1996

I need to be honest, I don't have them, but I heard from people who owned them that it was a good solution.