That's the thing I don't quite get.
If the source is the same (let's call the PCM the master copy),
that means the dB level, LFE top cut-off, etc. should all be fixed during the audio programming by the studio mastering, right?
Dolby & DTS are just using their own codecs to encode that master copy
That's why they have different bitrates (just like different compression ratios on the same file between WinRAR and WinZip)
Based on this, we should be getting the same dB level, LFE top cut-off, etc. after decoding
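To make the WinRAR/WinZip analogy concrete, here's a minimal Python sketch using two of the standard library's lossless compressors (zlib and lzma) standing in for the two codecs. The "master" bytes here are made-up placeholder data, not real PCM:

```python
import zlib
import lzma

# Hypothetical "master copy": repetitive placeholder bytes standing in for PCM.
master = bytes(range(256)) * 4000

# Two different lossless "codecs" encoding the same master.
packed_a = zlib.compress(master)  # stands in for codec A
packed_b = lzma.compress(master)  # stands in for codec B

# Different "bitrates" (compressed sizes)...
print(len(packed_a), len(packed_b))

# ...but decoding either one gives back the master bit-for-bit.
assert zlib.decompress(packed_a) == master
assert lzma.decompress(packed_b) == master
```

Same idea as the archive analogy: the compressed streams differ in size, but what comes out after decoding is identical to what went in.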
Yet in practice there is a difference between Dolby & DTS
Or is it because, during encoding, the codec adds something to the source that produces different results during decoding?
Now I'm getting more confused
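One real mechanism along those lines is playback-time metadata: Dolby streams, for example, carry a dialnorm value that the decoder applies as gain. Here's an illustrative sketch (not a real decoder, and the sample values and -4 dB figure are made up) of how two bit-identical decoded tracks can still reach the amp at different levels:

```python
# Same decoded samples for both codecs (both are lossless).
decoded_samples = [0.50, -0.25, 0.75]

def apply_gain_db(samples, gain_db):
    """Apply a decibel gain to linear sample values."""
    factor = 10 ** (gain_db / 20)
    return [s * factor for s in samples]

# Hypothetical metadata: one stream tells the decoder to attenuate by 4 dB,
# the other carries 0 dB (no change).
out_a = apply_gain_db(decoded_samples, -4.0)
out_b = apply_gain_db(decoded_samples, 0.0)

print(out_a == out_b)  # False: same decoded bits, different final signal
```

So even with bit-identical decoding, what the decoder is told to do with the bits can differ between formats.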

I guess that means TrueHD and DTS-HD MA (and LPCM) have different sounds... (while maintaining the master track's digital resolution)
If certain frequencies are rolled off or boosted, the final signal reaching the amp will be different between all 3 of these types....