[–] aleph@lemm.ee 6 points 1 year ago (2 children)

But 24-bit audio is useless for playback. The difference is literally inaudible. In fact, the application of dynamic range compression during the mixing/mastering process has a far greater impact on perceptible audio quality than bit depth, sample rate, or bitrate does (the placebo effect notwithstanding).

If you care about audio quality, seek out album masters and music that is well recorded and not dynamically crushed to oblivion. The bitrate isn't really all that important, in the grand scheme of things.
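
If you want to put a rough number on how crushed a master is, the crest factor (peak-to-RMS ratio) is a decent proxy. A quick sketch using numpy and the soundfile library (the filename is just a placeholder):

```python
import numpy as np
import soundfile as sf  # pip install soundfile

data, rate = sf.read("track.flac")  # placeholder path
mono = data.mean(axis=1) if data.ndim > 1 else data

peak = np.max(np.abs(mono))
rms = np.sqrt(np.mean(mono ** 2))

# Heavily limited "loudness war" masters often land below ~10 dB;
# dynamic masters sit comfortably above it.
print(f"crest factor: {20 * np.log10(peak / rms):.1f} dB")
```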

[–] resetbypeer@lemmy.world 7 points 1 year ago (1 children)

I partially agree with you. Yes, mixing and mastering are far more important than bitrate. However, if I have my gf listen to the same song in both the standard 16-bit/44.1 kHz version and the 24-bit version, she can hear a difference. Is it night and day? Not always, but a subtle improvement can still matter when enjoying music.

[–] aleph@lemm.ee 6 points 1 year ago* (last edited 1 year ago)

Literally the only difference between 16-bit and 24-bit is that the latter has a lower noise floor, which is only really useful during sound production. It doesn't translate into any meaningful increase in detail or dynamic range during playback.

16-bit was chosen as the de facto standard for CDs and digital music precisely because it provides more than enough dynamic range for human hearing.
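
For reference, the usual back-of-the-envelope math (the standard SNR formula for ideal PCM quantization, roughly 6.02 × bits + 1.76 dB):

```python
# Theoretical dynamic range of ideal PCM at a given bit depth
for bits in (16, 24):
    print(f"{bits}-bit: ~{6.02 * bits + 1.76:.0f} dB")

# 16-bit: ~98 dB  -- already spans a quiet room to the threshold of pain
# 24-bit: ~146 dB -- far beyond anything playback gear or ears can use
```

And with noise-shaped dither, 16-bit's effective noise floor drops even lower in the frequency range where hearing is most sensitive.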

Any difference your gf hears is due to the placebo effect rather than any inherent difference in the actual audio.
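
The easy way to settle it: convert the 24-bit file itself down to 16 bits and blind-test the two, so the only variable is bit depth rather than a different master. A minimal sketch with the soundfile library (filenames are placeholders; note this converts without adding dither, which is fine for a listening test):

```python
import soundfile as sf  # pip install soundfile

# Same master, only the bit depth differs
data, rate = sf.read("master_24bit.flac")  # placeholder paths
sf.write("master_16bit.flac", data, rate, subtype="PCM_16")
```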

[–] datendefekt@lemmy.ml 2 points 1 year ago (1 children)

That write-up from Xiph is excellent. The comparison to adding ultraviolet and infrared to video makes so much sense. But you're dealing with audiophiles who seriously consider buying high-end power and Ethernet cables. I read somewhere about a listening test where the speakers were connected with coat-hanger wire, and the audiophiles couldn't tell.

In the end, it's all physics. I could never hear a quality improvement beyond standard 16-bit, 320 kbps, no matter how demanding the music.

[–] aleph@lemm.ee 2 points 1 year ago

As a recovering audiophile, I can safely say the hobby is heavily based around FOMO (the nagging doubt that something, somewhere in your audio chain is degrading the sound), and digital audio is no exception. Not only is 320 kbps more than enough, even through thousands of dollars' worth of equipment, but with codecs more efficient than MP3 (Opus especially), even 128 kbps can sound indistinguishable from lossless.

If you have plenty of local storage, 16-bit FLAC is ideal, but if you're just streaming, you really don't need a lossless service except to keep the FOMO at bay.
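
If you want to test the codec claim yourself, transcode a lossless file to 128 kbps Opus and blind-test it against the original. A sketch that shells out to ffmpeg (paths are placeholders):

```python
import subprocess

# Encode FLAC to 128 kbps Opus with ffmpeg, then ABX against the source
subprocess.run(
    ["ffmpeg", "-i", "input.flac", "-c:a", "libopus", "-b:a", "128k", "output.opus"],
    check=True,
)
```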