QUOTE(kaspersky-fan @ Jan 2 2010, 12:31 AM)
I'm just wondering why ASTRO's new decoder doesn't have a setting to output the default resolution as it is broadcast. For example, if the channel is 576i, then just output that, etc., instead of forcing us to choose 720p or 1080i... I understand this option allows HD Ready TVs to be supported, but if only they could include another option for "broadcast resolution".
The reason has to do with reducing the number of phone calls Astro customer service will get.
Astro could actually do native-resolution output via HDMI, sending SD out as SD and HD as 720p or 1080i. HOWEVER, that means for ONE SECOND the HDTV will go blank every time the resolution changes, while it applies the right settings to the signal (so that SD/720p is upconverted to fill the entire screen). You can see this clearly with Blu-ray Discs, when a disc switches from 1080i to 1080p to 480p depending on your selection.
So imagine somebody flipping through the channels on Astro: a blank screen at every SD-to-HD transition.
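Here's a rough sketch of that trade-off. The channel lineup, the ~1 second re-sync time, and the function names are all my own assumptions for illustration, not Astro's actual firmware:

```python
# Hypothetical sketch: why native-resolution output makes channel surfing painful.
# The channel list and the ~1 s HDMI re-sync delay are assumptions for illustration.

HDMI_RESYNC_SECONDS = 1.0  # rough blanking time while the TV locks to a new mode

channels = ["576i", "576i", "1080i", "576i", "720p"]  # mixed SD/HD lineup

def blank_time(channel_resolutions, fixed_output=None):
    """Total seconds of blank screen while flipping through every channel."""
    total = 0.0
    current_mode = None
    for res in channel_resolutions:
        mode = fixed_output or res  # a fixed output never changes the HDMI mode
        if mode != current_mode:
            total += HDMI_RESYNC_SECONDS  # TV blanks while re-syncing
            current_mode = mode
    return total

print(blank_time(channels))                        # native pass-through: 4.0 s of blanking
print(blank_time(channels, fixed_output="1080i"))  # fixed 1080i: one re-sync, then instant flips
```

With a fixed output the TV re-syncs once, and every channel flip after that is instant.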
QUOTE
Another fear I have in mind is... how does the decoder work? Let's say you set your output resolution to 720p. Does the decoder continue to resize 720p broadcast video to 720p and perform deinterlacing, or will it intelligently know "ah, this video stream is 720p, no need to resize and deinterlace, just output the video stream straight away"?
There are flags in the video stream that identify its resolution and scan type, so the decoder doesn't have to do a lot of work.
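As a toy illustration of what reading those flags buys the decoder (the field names below are modeled loosely on an MPEG-2/H.264 sequence header; the exact flags Astro's decoder checks aren't public, so treat this as a sketch):

```python
# Toy illustration of how stream flags let a decoder skip work.

from dataclasses import dataclass

@dataclass
class StreamInfo:
    height: int       # e.g. 720 or 1080, signalled in the sequence header
    interlaced: bool  # e.g. progressive_sequence = 0 in MPEG-2

def plan_processing(stream: StreamInfo, out_height: int, out_interlaced: bool):
    """Decide which steps the decoder actually needs to run."""
    steps = []
    if stream.interlaced and not out_interlaced:
        steps.append("deinterlace")
    if stream.height != out_height:
        steps.append("rescale")
    return steps or ["pass-through"]  # flags match the output: no extra work

print(plan_processing(StreamInfo(720, False), 720, False))   # ['pass-through']
print(plan_processing(StreamInfo(1080, True), 720, False))   # ['deinterlace', 'rescale']
```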
QUOTE
I'm sure the decoder would do that to a non-720p video stream; if it sees a 1080i stream, it will do the resizing and deinterlacing. I just want to know if this decoder is smart.
The decoder is smart, but it's also software-driven. If Astro wanted to screw it up majorly, they could upload firmware that does exactly that. But they won't.
QUOTE
Same goes for the 1080i setting. If you set 1080i resolution, what will it do? Will it perform resizing and frame interpolation on video streams broadcast in 1080i? Or is it smart enough to know it's the same resolution and just output it to the TV?
If you set your output to 720p, a 720p signal goes out as-is, while 1080i will be downconverted to 720p.
If you set your output to 1080i, a 720p signal goes out upconverted to 1080i, while 1080i is released as-is.
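Those two rules, written out as a lookup table (again just a sketch of the behaviour described above, not Astro's code):

```python
# The conversion rules above as a simple lookup.

CONVERSION = {
    ("720p",  "720p"):  "pass-through",
    ("1080i", "720p"):  "downconvert (deinterlace + scale to 720p)",
    ("720p",  "1080i"): "upconvert (scale + interlace to 1080i)",
    ("1080i", "1080i"): "pass-through",
}

for (source, setting), action in CONVERSION.items():
    print(f"{source} broadcast, output set to {setting}: {action}")
```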
QUOTE
And I'm also afraid: does the decoder add noise reduction to the standard-definition channels? I'm very sure that if they need to resize SD channels to 1080i, noise reduction plus a smoothing effect would be needed to keep the picture quality satisfying, hence their claim that the SD channels are "enhanced".
Apart from upconverting SD video to 720p/1080i, we don't know if NR is applied or, if so, how much.
QUOTE
I have a feeling that one day recording via HDMI will be possible by bypassing HDCP, since it's known to have some serious flaws. And if that day comes, we would still not be recording the pure digital video stream out of HDMI, since the decoder has already adjusted the stream.
Which flaw does HDCP have? It sounds like you want to tap the HD signal before it gets to the HDCP chip, but that chip may already be integrated into the decoder chip.
The only thing to note is that when B.yond enables recording, it will record the pure digital stream prior to decoding.
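For what it's worth, that's how DVB PVRs normally record: the raw transport-stream packets go to disk before the decoder ever touches them, so HDCP (which only protects the HDMI link) never comes into play. A minimal sketch, where the PIDs and file name are made up; the 188-byte packet layout is standard DVB:

```python
# Sketch of recording upstream of the decoder, the way DVB PVRs typically do it:
# raw transport-stream packets are written to disk untouched, with no decode
# and no re-encode. PID values and the file name are made up for illustration.

TS_PACKET_SIZE = 188  # standard DVB transport-stream packet length

def record(ts_source, pids, out_path):
    """Copy TS packets belonging to the selected program straight to disk."""
    with open(out_path, "wb") as out:
        while True:
            packet = ts_source.read(TS_PACKET_SIZE)
            if len(packet) < TS_PACKET_SIZE:
                break  # end of input
            pid = ((packet[1] & 0x1F) << 8) | packet[2]  # 13-bit PID field
            if pid in pids:
                out.write(packet)  # bit-exact copy of the broadcast stream

# e.g. record(tuner, pids={0x100, 0x101}, out_path="recording.ts")
```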
fuad