

Got a tech question for Sound & Vision? Email us at

Q My current setup includes an "older" 7.1-channel receiver that isn't equipped to decode new sound formats such as Dolby Atmos. I understand that when you play an Atmos disc, older receivers are presented with a standard 5.1- or 7.1-channel version of the soundtrack for decoding. What difference, if any, is there between this default version and something like a Dolby TrueHD mix? My assumption is that it would be better to select a dedicated lossless mix over a backward-compatible, and presumably compressed, fallback mix.
Jason Acosta

A No, there isn't a difference. The "fallback" 5.1 or 7.1 mix that's sent from the Blu-ray player to a receiver over HDMI is typically a lossless Dolby TrueHD soundtrack, not a lesser-quality, compressed version. That's because Atmos is delivered not as a codec, but as an extension to TrueHD that gets folded into the bitstream to maintain backwards compatibility with older gear. When a compatible decoder is detected, the Atmos extension will be processed. When a non-compatible decoder is detected, the extension data gets ignored and you hear just the regular 5.1 or 7.1 Dolby TrueHD soundtrack.

Dolby Digital Plus, a codec used by video streaming services such as Netflix and Vudu, also supports Atmos. The same process at work with TrueHD applies here: the Atmos extension gets folded into the DD+ bitstream and is either decoded or ignored depending on your receiver's capabilities. A key difference between the DD+ and TrueHD codecs, however, is that DD+ uses lossy compression.

DTS works in a fairly similar manner, with a standard DTS core accompanied by extensions that enable more advanced formats.
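To make the core-plus-extension idea concrete, here is a minimal sketch in Python of how a tagged stream can carry optional extension data that older decoders simply skip. The tag names (CORE_ID, EXTENSION_ID), payloads, and decode() helper are all invented for illustration; this is the general backward-compatibility pattern, not Dolby's actual TrueHD, DD+, or DTS bitstream syntax.

    # Illustrative sketch only: a toy "core + optional extension" stream, not
    # Dolby's real bitstream format. Tags, payloads, and decode() are invented.

    CORE_ID = 0x01       # hypothetical tag for the base 5.1/7.1 core substream
    EXTENSION_ID = 0x02  # hypothetical tag for the Atmos-style extension data

    def decode(frames, supports_extension):
        """Walk tagged substreams; decoders skip any tag they don't handle."""
        decoded = []
        for tag, payload in frames:
            if tag == CORE_ID:
                decoded.append(("core", payload))        # always decoded
            elif tag == EXTENSION_ID and supports_extension:
                decoded.append(("extension", payload))   # only on newer gear
            # anything else is ignored, which is what preserves compatibility
        return decoded

    bitstream = [(CORE_ID, b"7.1 lossless audio"), (EXTENSION_ID, b"object metadata")]

    print(decode(bitstream, supports_extension=False))  # older receiver: core only
    print(decode(bitstream, supports_extension=True))   # Atmos receiver: core + objects

The point of the pattern is that an older receiver never needs to know the extension exists: it decodes the same lossless core either way, which is why the "fallback" mix in the question isn't a downgrade.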
