AMD Radeon™ Discussion V4, Cut Cut Cut!~ Go Red!
bex9432 | Nov 22 2010, 05:49 PM | Getting Started
QUOTE(Bettyem @ Nov 22 2010, 12:34 AM): Want to ask, which ATI graphics card can play at max settings @1920x1080 without any lag in Call of Duty: Black Ops?

I'm using an HD5770 at max settings and it runs smoothly on my LG 22-inch at 1920x1080 (HD Ready).
angah_as | Nov 22 2010, 07:29 PM | Getting Started
QUOTE(8tvt @ Nov 22 2010, 03:01 PM): Hahaha, it's enough. Btw, before this my 4870 would surely reach 180W under load, so the 6850's 150W is nothing. I've used the same PSU to power a 3870, 4850, 4870 and now a 6850; how many years has it been running already? But make sure it's a PSU with honest rated power, ok, not some no-name "chap ayam berkokok" (crowing-rooster) brand.

I'm also hoping to get an HD 6850 this December, but I don't know which brand to buy, lol.
8tvt | Nov 22 2010, 08:44 PM
Any brand is fine; the stock HSF is quite decent. If you like to overclock, maybe go for ASUS or MSI.
law1777 | Nov 22 2010, 09:14 PM
QUOTE(DarkSilver @ Nov 22 2010, 05:33 PM): The Radeon HD6990 can surely pwn the GeForce GTX580! Bow down to AMD, yet again. By the way, can a 550W True Power PSU run this monster?

If the 6990's TDP is true, then NVIDIA will need to bow to AMD this time! Try to imagine how much a GTX 595 would draw at full load, since the 580 already hits around 350W under stress.
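For anyone doing the same math on their own PSU, the headroom check is just addition. A minimal sketch in Python; every wattage figure below is an illustrative assumption (the 6990's real TDP was only rumoured at this point), not a measured spec:

```python
# Rough PSU headroom check (all figures are illustrative assumptions).
CPU_LOAD_W = 125         # assumed CPU draw under load
GPU_LOAD_W = 375         # assumed full-load draw for a dual-GPU flagship
REST_OF_SYSTEM_W = 60    # board, RAM, drives, fans (rough allowance)
PSU_RATED_W = 550

total_draw = CPU_LOAD_W + GPU_LOAD_W + REST_OF_SYSTEM_W
headroom = PSU_RATED_W - total_draw

print(f"estimated draw: {total_draw} W, headroom: {headroom} W")
# With these numbers the 550W unit comes out about 10 W short; builders
# usually want ~20% spare capacity on the +12V rail, so a bigger PSU
# would be the safer call for a card in this class.
```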
KLlee | Nov 22 2010, 10:37 PM
BTW guys, will an AMD X3 3.0GHz paired with a 6870 be underpowered for the graphics card? If yes, which AMD CPU would you recommend?
Demonic Wrath | Nov 22 2010, 10:42 PM
QUOTE(KLlee @ Nov 22 2010, 10:37 PM): BTW guys, will an AMD X3 3.0GHz paired with a 6870 be underpowered for the graphics card? If yes, which AMD CPU would you recommend?

No, it won't. If you run into a CPU bottleneck, you're most probably already at around 80 FPS or more. It also depends on your monitor resolution: if you're running 2560x1600, you'll rarely run into a CPU bottleneck with an HD6870.
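A practical way to answer "will my CPU bottleneck this card?" is to measure it: lower the resolution and see whether the frame rate rises. A toy sketch of that decision logic follows; the FPS values are placeholders you would replace with your own measurements.

```python
# Toy CPU-vs-GPU bottleneck check (placeholder measurements).
fps_native = 62.0   # measured at your monitor's native resolution
fps_lowres = 64.0   # measured again at a much lower resolution

# Lowering resolution cuts GPU work but leaves CPU work unchanged.
# If FPS barely moves, the GPU was not the limit: you are CPU-bound.
if fps_lowres / fps_native < 1.10:
    print("CPU-bound: a faster GPU would add little at these settings")
else:
    print("GPU-bound: the graphics card is the limiting factor")
```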
xen0 | Nov 22 2010, 11:38 PM
Slide show (images were in spoiler tags); see HERE for more. They will have a new feature: EQAA.
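For context, EQAA (Enhanced Quality Anti-Aliasing) decouples coverage samples from stored color/depth samples, much like NVIDIA's CSAA. A back-of-the-envelope sketch of why that saves memory; the per-sample sizes are simplifying assumptions, not hardware specs:

```python
# Approximate per-pixel framebuffer cost: MSAA vs EQAA (simplified model).
COLOR_BYTES = 4       # RGBA8 color sample
DEPTH_BYTES = 4       # 32-bit depth sample
COVERAGE_BYTES = 0.5  # a coverage sample is only a few bits (assumption)

def msaa_cost(samples):
    return samples * (COLOR_BYTES + DEPTH_BYTES)

def eqaa_cost(color_samples, coverage_samples):
    return (color_samples * (COLOR_BYTES + DEPTH_BYTES)
            + coverage_samples * COVERAGE_BYTES)

print("8x MSAA  :", msaa_cost(8), "bytes/pixel")     # 64
print("4x+8 EQAA:", eqaa_cost(4, 8), "bytes/pixel")  # 36.0
# EQAA keeps 8 coverage samples for edge quality while storing only
# 4 full color/depth samples, approximating 8x edge smoothing at a
# cost much closer to plain 4x MSAA.
```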
CopyX | Nov 23 2010, 12:01 AM
Can a CM Elite 400W supply enough power for a Sapphire HD5670?
chenhui87 | Nov 23 2010, 01:42 AM
QUOTE(CopyX @ Nov 23 2010, 12:01 AM): Can a CM Elite 400W supply enough power for a Sapphire HD5670?

Sure, why not? The HD5670 draws only about 60W at full load and doesn't even need a PCIe power connector.
DarkSilver | Nov 23 2010, 02:00 AM | Idiosyncrasy
How much will the Radeon HD6970 be? It seems pretty powerful. It should be on par with the GeForce GTX580.
tech3910 | Nov 23 2010, 09:25 AM
HD 6990 spec finalized: http://www.hardware-infos.com/news.php?news=3767 (it's in German; use Google Translate).

Added on November 23, 2010, 9:37 am:
QUOTE(DarkSilver @ Nov 23 2010, 02:00 AM): How much will the Radeon HD6970 be? It seems pretty powerful. It should be on par with the GeForce GTX580.

There is a chance of it being faster. The GTX 580 is about 30% faster than the 6870. A 6870 has 1120 shaders, which is 58.33% of the 6970's total shader count; the extra 41.67% of shaders on the 6970 will most probably translate to at least a 30% performance boost, or even more.

This post has been edited by tech3910: Nov 23 2010, 09:37 AM
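To make that arithmetic explicit: the 58.33% figure implies a rumoured 1920 shaders on the 6970 (1120 / 0.5833 ≈ 1920; unconfirmed at this point). A quick check:

```python
# Shader-count scaling estimate (rumoured, unconfirmed specs).
shaders_6870 = 1120
shaders_6970 = 1920   # implied by the 58.33% ratio; rumoured spec

extra_share = (shaders_6970 - shaders_6870) / shaders_6970
relative_gain = shaders_6970 / shaders_6870 - 1

print(f"extra shaders as share of 6970 total: {extra_share:.2%}")    # 41.67%
print(f"6970 vs 6870 shader count:            +{relative_gain:.1%}")  # +71.4%
# Performance rarely scales linearly with shader count (clocks, memory
# bandwidth and the new architecture all matter), so "at least 30%
# faster than a 6870" is a conservative read of these numbers.
```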
zerorating | Nov 23 2010, 09:47 AM
QUOTE(tech3910 @ Nov 23 2010, 09:25 AM): HD 6990 spec finalized: http://www.hardware-infos.com/news.php?news=3767 (it's in German; use Google Translate). There is a chance of it being faster. The GTX 580 is about 30% faster than the 6870. A 6870 has 1120 shaders, which is 58.33% of the 6970's total shader count; the extra 41.67% of shaders will most probably translate to at least a 30% performance boost, or even more.

Don't forget the HD6900 series is a completely new design: dual rasterizers, a 4-D shader architecture, and double the geometry throughput compared to Cypress. The HD6800 is more like a refined Cypress on a smaller die. An R600 v2, maybe.

This post has been edited by zerorating: Nov 23 2010, 09:52 AM
Demonic Wrath | Nov 23 2010, 11:54 AM
Uh-oh... someone's got some explaining to do. Looks like AMD is downgrading the IQ for better performance: Testing NVIDIA vs AMD Image Quality. Source: AMD Radeon HD6870 test.
zerorating | Nov 23 2010, 12:07 PM
QUOTE(Demonic Wrath @ Nov 23 2010, 11:54 AM): Uh-oh... someone's got some explaining to do. Looks like AMD is downgrading the IQ for better performance: Testing NVIDIA vs AMD Image Quality. Source: AMD Radeon HD6870 test.

Optimizations are normal as long as the degraded IQ is not visible. If users do notice it, they can always turn off Catalyst AI (at the cost of lower FPS), whereas such optimizations can't be turned off at all on NVIDIA cards. NVIDIA has also been caught cheating before. Catalyst 10.10e has four Catalyst AI settings to suit users' tastes, compared to two settings (Standard and Quality) on older drivers.

This post has been edited by zerorating: Nov 23 2010, 12:09 PM
Gamer | Nov 23 2010, 12:08 PM
I saw that news. AMD/ATI, well done! If you can't make it better, then don't make it worse.
law1777 | Nov 23 2010, 12:09 PM
QUOTE(Demonic Wrath @ Nov 23 2010, 11:54 AM): Uh-oh... someone's got some explaining to do. Looks like AMD is downgrading the IQ for better performance: Testing NVIDIA vs AMD Image Quality. Source: AMD Radeon HD6870 test.

I have no idea, but I'm not facing any flickering issues. NVIDIA says AMD is cheating, and others say NVIDIA is cheating with PhysX; the war will rage on forever, never ending. But the fact is that most reviews say Cypress/Barts are good cards.
Demonic Wrath | Nov 23 2010, 12:12 PM
QUOTE(zerorating @ Nov 23 2010, 12:07 PM): Optimizations are normal as long as the degraded IQ is not visible. If users do notice it, they can always turn off Catalyst AI (at the cost of lower FPS), whereas such optimizations can't be turned off at all on NVIDIA cards. NVIDIA has also been caught cheating before.

Yes, optimization is normal, and AMD does provide the option to turn off Catalyst AI. But in benchmarks, testers usually don't change the control panel settings, so the default behaviour yields higher benchmark numbers, no? If that's acceptable, then AMD and NVIDIA should both just ship the lowest quality as the default so their performance looks its best.
zerorating | Nov 23 2010, 12:19 PM
QUOTE(Demonic Wrath @ Nov 23 2010, 12:12 PM): Yes, optimization is normal, and AMD does provide the option to turn off Catalyst AI. But in benchmarks, testers usually don't change the control panel settings, so the default behaviour yields higher benchmark numbers, no? If that's acceptable, then AMD and NVIDIA should both just ship the lowest quality as the default so their performance looks its best.

How much FPS difference do you actually get when selecting different IQ settings? Yup, I give NVIDIA a thumbs up there, since their driver defaults to the Quality setting on install. Some have stated that ComputerBase and PCGH used the 'High' Catalyst AI setting instead of the default 'Quality' setting. And don't forget NVIDIA gamed the 3DMark CPU test by running PhysX on the GPU, hence the inflated 3DMark score.

This post has been edited by zerorating: Nov 23 2010, 12:26 PM
DarkSilver | Nov 23 2010, 12:36 PM | Idiosyncrasy
QUOTE(zerorating @ Nov 23 2010, 12:19 PM): And don't forget NVIDIA gamed the 3DMark CPU test by running PhysX on the GPU, hence the inflated 3DMark score.

That's why most 3DMark Vantage competitions forbid users from running the PhysX test on an NVIDIA GPU: the score they get can be 2x or 3x higher than usual.

This post has been edited by DarkSilver: Nov 23 2010, 12:36 PM
zerorating | Nov 23 2010, 12:40 PM
QUOTE(DarkSilver @ Nov 23 2010, 12:36 PM): That's why most 3DMark Vantage competitions forbid users from running the PhysX test on an NVIDIA GPU: the score they get can be 2x or 3x higher than usual.

Yup, for the CPU test, it's supposed to benchmark the CPU, not the GPU.

This post has been edited by zerorating: Nov 23 2010, 12:41 PM