What are the resolution and bitrate of the video clip you are using? The video clip is the
constant in this case; the variables are the different screens shown in the images you quoted.
Let's say the video clip's resolution is 1280x720. On the S2, it is downsampled from a higher-resolution source (1280x720) to a lower-resolution screen (800x480). When your source content's resolution is higher than the screen displaying it, the result is usually pleasing: you expect less detail, since the display can only show it at a lower resolution. On your Note, it is either upconverted (1280x720 to 1280x800) or kept at its native resolution, which the Note's 1280x800 screen can show in full, therefore retaining the video detail. But that retained detail only comes through as the right pixel information
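A quick way to see the difference is to compare the fit-to-screen scale factor each phone applies to the same 1280x720 clip. A rough sketch (the screen resolutions are the ones quoted above; the function name is just for illustration):

```python
def scale_mode(src_w, src_h, scr_w, scr_h):
    """Classify how a src_w x src_h video is shown on a scr_w x scr_h screen."""
    # aspect-preserving fit-to-screen scale factor
    s = min(scr_w / src_w, scr_h / src_h)
    if s < 1.0:
        return "downscaled", s
    if s == 1.0:
        return "1:1", s
    return "upscaled", s

print(scale_mode(1280, 720, 800, 480))    # S2: downscaled to 62.5% per axis
print(scale_mode(1280, 720, 1280, 800))   # Note: shown 1:1 (letterboxed)
```

On the S2 every macroblock is shrunk to about 62.5% of its encoded size, which smooths it over; on the Note it maps 1:1 to screen pixels, so nothing hides the flaws.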
IF and ONLY IF the clip's resolution is matched with a sufficiently HIGH bitrate to maintain image detail and clarity. In this case, since the Note's Android setup only supports files up to 4GB (a FAT32 storage limit), you have to set a bitrate low enough to fit the movie into a 4GB file. That low bitrate means restricted pixel information and image detail, which shows up as macro blocking.
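As a rough illustration of how tight that cap is (assuming a 2-hour movie and a 128 kbps audio track, both made-up figures for the sake of the arithmetic):

```python
def max_video_kbps(file_gib, hours, audio_kbps=128):
    """Highest average video bitrate (kbps) that still fits the file cap."""
    total_kbits = file_gib * 1024**3 * 8 / 1000   # file size in kilobits
    seconds = hours * 3600
    return total_kbits / seconds - audio_kbps

# A 2-hour movie squeezed into 4GB leaves only ~4.6 Mbps for video,
# which is on the low side for 1280x720 and invites macro blocking.
print(round(max_video_kbps(4, 2)))  # ≈ 4644 kbps
```

A longer movie or a higher-bitrate audio track squeezes the video bitrate even further.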
You don't see it much on the S2 because the smaller screen size and lower screen resolution artificially mask the macro blocking, showing it instead as more "gradual" banding. On the Note's 5.3" screen with its higher resolution, you're still stuck with the same 4GB file-size restriction of the Android OS, and therefore the same low bitrate you had to play with on the S2, but now it's displayed on a bigger screen, making those macro blocks more apparent. It's similar to blowing up a snapped photo to pixel-peep.
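The "blowing up a photo" effect can be put in numbers: the physical on-screen size of a 16-pixel macroblock from the same 720p clip on each phone. A sketch assuming the usual 4.3" S2 and 5.3" Note screen diagonals:

```python
import math

def block_size_inches(src_w, src_h, scr_w, scr_h, diag_in, block_px=16):
    """Physical on-screen size of one macroblock after fit-to-screen scaling."""
    scale = min(scr_w / src_w, scr_h / src_h)   # aspect-preserving fit
    ppi = math.hypot(scr_w, scr_h) / diag_in    # pixels per inch of the panel
    return block_px * scale / ppi

s2   = block_size_inches(1280, 720, 800, 480, 4.3)    # ~0.046 in
note = block_size_inches(1280, 720, 1280, 800, 5.3)   # ~0.056 in
print(note / s2)  # each block is roughly 20% larger on the Note
```

And on top of being physically larger, each block on the Note is drawn pixel-for-pixel rather than smoothed by downscaling, so its edges stay sharp.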
So which would you rather have? A smaller screen with a lower resolution, giving you a more "forgiving" picture because the flaws aren't readily apparent, or a larger screen with a higher resolution, showing you more of those flaws because the bitrate is limited by the 4GB file size?
Isn't that better? I don't want to hunt around for patches of black macro blocks or color banding while watching my movies, so hardware acceleration by the GPU is more desirable.
The bad news for me is that diceplayer is no longer available to buy from the Market. They said they had a problem with their Google Checkout account :-(