 AMD Radeon™ Discussion V4, Cut Cut Cut!~ Go Red !

bex9432
post Nov 22 2010, 05:49 PM

Getting Started
**
Junior Member
111 posts

Joined: Mar 2010
QUOTE(Bettyem @ Nov 22 2010, 12:34 AM)
Want to ask:
which ATI graphics card can play Call of Duty: Black Ops at max settings at 1920x1080 without any lag?
*
I'm using an HD 5770 at max settings and it runs smoothly on my LG 22-inch at 1920x1080.
angah_as
post Nov 22 2010, 07:29 PM

Getting Started
**
Junior Member
279 posts

Joined: Sep 2009
From: raincity

QUOTE(8tvt @ Nov 22 2010, 03:01 PM)
Hahaha, that's enough.. by the way, before this, using a 4870, the load surely reached 180W too..
150W from the 6850 is nothing..

I've been using it to power a 3870, 4850, 4870 and now a 6850..
How many years has it been running already..
But make sure it's a pure-rated power supply, OK? No cheap no-name brands..
*
I'm also an HD 6850 wannabe this December,
but I don't know which brand to buy.
8tvt
post Nov 22 2010, 08:44 PM

Peace Lover
*******
Senior Member
8,753 posts

Joined: Jan 2003
Any brand is fine.. the stock HSF is quite decent..
If you like to overclock, maybe go with ASUS or MSI..
law1777
post Nov 22 2010, 09:14 PM

DreamMan
*******
Senior Member
2,654 posts

Joined: Dec 2007


QUOTE(DarkSilver @ Nov 22 2010, 05:33 PM)
Radeon HD6990 surely can pawn GeForce GTX580!
Bow down to AMD, yet, again.

By the way, can a 550W True Power PSU runs this monster?
*
If the 6990's TDP is true, then NVIDIA needs to bow to AMD this time!! Try to imagine how much the full load of a "595" would be, since the 580 already draws around 350W at full load.
KLlee
post Nov 22 2010, 10:37 PM

Enthusiast
*****
Senior Member
845 posts

Joined: Nov 2007


BTW guys, will an AMD X3 3.0GHz be underpowered for an HD 6870? If yes, which AMD CPU do you recommend?
Demonic Wrath
post Nov 22 2010, 10:42 PM

My name so cool
******
Senior Member
1,667 posts

Joined: Jan 2003
From: The Cool Name Place

QUOTE(KLlee @ Nov 22 2010, 10:37 PM)
BTW guys, will an AMD X3 3.0GHz be underpowered for an HD 6870? If yes, which AMD CPU do you recommend?
*
No, it won't. If you run into a CPU bottleneck, you're most probably already at around 80 FPS or more. It also depends on your monitor resolution: if you're running 2560x1600, you'll rarely run into a CPU bottleneck with an HD 6870.
xen0
post Nov 22 2010, 11:38 PM

ismi..alif..lam..ya..fa
******
Senior Member
1,486 posts

Joined: Jun 2005
From: Cyberjaya/Kamunting


slide show


HERE for more.. they will have a new feature: EQAA.
CopyX
post Nov 23 2010, 12:01 AM

Casual
***
Junior Member
317 posts

Joined: May 2006
From: City Of Angels



Can a CM Elite 400W supply enough power for a Sapphire HD 5670?
chenhui87
post Nov 23 2010, 01:42 AM

♥Jigeumeun So Nyuh Shi Dae♥
*******
Senior Member
2,150 posts

Joined: Apr 2009
From: Seoul



QUOTE(CopyX @ Nov 23 2010, 12:01 AM)
Can a CM Elite 400W supply enough power for a Sapphire HD 5670?
*
Sure, why not?
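As a rough sanity check on that answer, here is a minimal power-budget sketch in Python. All the component draws are illustrative assumptions for a typical 2010-era build, not measurements; the HD 5670's board power of roughly 61 W matches AMD's published figure.

```python
# Rough power-budget check for a CM Elite 400W feeding an HD 5670.
# Every draw below is an assumed typical figure for a 2010-era build,
# not a measured value.
psu_rating_w = 400

draws_w = {
    "HD 5670 (board power)": 61,   # AMD's published figure is ~61 W
    "mid-range CPU at load": 95,
    "motherboard + RAM": 50,
    "drives + fans": 30,
}

total_w = sum(draws_w.values())
headroom_w = psu_rating_w - total_w

print(f"estimated load: {total_w} W, headroom: {headroom_w} W")
```

Even with generous assumptions, the estimated load sits well under 400 W, which is why "sure, why not" is a reasonable answer, provided the unit actually delivers its rating (the "pure-rated" point made earlier in the thread).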
DarkSilver
post Nov 23 2010, 02:00 AM

Idiosyncrasy
Group Icon
Elite
10,501 posts

Joined: Oct 2009
From: Tamriel


How much will the Radeon HD 6970 be?
It seems pretty powerful. It should be on par with the GeForce GTX 580.
tech3910
post Nov 23 2010, 09:25 AM

Anonymous
*******
Senior Member
5,644 posts

Joined: Feb 2008
From: Heaven to HELL


HD 6990 spec finalized

http://www.hardware-infos.com/news.php?news=3767

It's in German; use Google Translate.


Added on November 23, 2010, 9:37 am
QUOTE(DarkSilver @ Nov 23 2010, 02:00 AM)
How much will the Radeon HD 6970 be?
It seems pretty powerful. It should be on par with the GeForce GTX 580.
*
There is a chance of it being faster.

The GTX 580 is 30% faster than the HD 6870.
An HD 6870 has 1120 shaders, which is 58.33% of the HD 6970's total.
The extra 41.67% of shaders on the HD 6970 will most probably translate to at least a 30% performance boost, or even more.
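The shader arithmetic above can be checked with a few lines of Python. Both counts are the poster's figures: 1120 is the HD 6870's announced spec, while the HD 6970's 1920 was still a rumour at the time.

```python
# Shader counts from the post above: the HD 6870's 1120 is the announced
# spec; the HD 6970's 1920 was only a rumour when this was written.
hd6870 = 1120
hd6970 = 1920

share = hd6870 / hd6970          # 6870 as a share of the 6970's shaders
extra = 1.0 - share              # "extra" shaders as a share of the 6970 total
uplift = hd6970 / hd6870 - 1.0   # relative increase over the 6870

print(f"share: {share:.2%}, extra: {extra:.2%}, relative uplift: {uplift:.1%}")
# share: 58.33%, extra: 41.67%, relative uplift: 71.4%
```

Note that the 41.67% "extra" is measured against the 6970's total; relative to the 6870 it is roughly a 71% increase in shader count, so an at-least-30% uplift is plausible even if performance scales at only half the rate of shader count.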

This post has been edited by tech3910: Nov 23 2010, 09:37 AM
zerorating
post Nov 23 2010, 09:47 AM

Miskin Adab
*****
Senior Member
975 posts

Joined: Aug 2007
From: Lokap Polis


QUOTE(tech3910 @ Nov 23 2010, 09:25 AM)
HD 6990 spec finalized

http://www.hardware-infos.com/news.php?news=3767

It's in German; use Google Translate.


Added on November 23, 2010, 9:37 am
There is a chance of it being faster.

The GTX 580 is 30% faster than the HD 6870.
An HD 6870 has 1120 shaders, which is 58.33% of the HD 6970's total.
The extra 41.67% of shaders on the HD 6970 will most probably translate to at least a 30% performance boost, or even more.
*
Don't forget the HD 6900 series is a completely new design: dual rasterizers, 4D shader architecture, and double the geometry throughput compared to Cypress..
The HD 6800 series is like a refined Cypress, but on a smaller die.
R600 v2, maybe?

This post has been edited by zerorating: Nov 23 2010, 09:52 AM
Demonic Wrath
post Nov 23 2010, 11:54 AM

My name so cool
******
Senior Member
1,667 posts

Joined: Jan 2003
From: The Cool Name Place

Uh-oh.. someone's got some explaining to do... Looks like AMD is downgrading image quality for better performance..
Testing NVIDIA vs AMD Image Quality
Source: AMD Radeon HD6870 test
zerorating
post Nov 23 2010, 12:07 PM

Miskin Adab
*****
Senior Member
975 posts

Joined: Aug 2007
From: Lokap Polis


QUOTE(Demonic Wrath @ Nov 23 2010, 11:54 AM)
Uh-oh.. someone's got some explaining to do... Looks like AMD is downgrading image quality for better performance..
Testing NVIDIA vs AMD Image Quality
Source: AMD Radeon HD6870 test
*
Optimizations are normal as long as the degraded IQ is not visible. If users do notice, they can always turn off Catalyst AI (at the cost of FPS), which is something that cannot be turned off on NVIDIA cards.
NVIDIA was also caught cheating before.
Catalyst 10.10e has four Catalyst AI settings to suit users' tastes, compared to two (Standard and Quality) in older drivers.

This post has been edited by zerorating: Nov 23 2010, 12:09 PM
Gamer
post Nov 23 2010, 12:08 PM

Be Original
*******
Senior Member
3,366 posts

Joined: Jan 2003
From: Sarawak, Sibu


I saw that news. AMD/ATI, well done! If you can't make it better, then don't make it worse.
law1777
post Nov 23 2010, 12:09 PM

DreamMan
*******
Senior Member
2,654 posts

Joined: Dec 2007


QUOTE(Demonic Wrath @ Nov 23 2010, 11:54 AM)
Uh-oh.. someone's got some explaining to do... Looks like AMD is downgrading image quality for better performance..
Testing NVIDIA vs AMD Image Quality
Source: AMD Radeon HD6870 test
*
I have no idea, but I'm not facing any flickering issues.

NVIDIA says AMD is cheating, and others say NVIDIA is cheating with PhysX.. the war will go on forever, never ending. But the fact is that most reviews say Cypress/Barts are good cards.
Demonic Wrath
post Nov 23 2010, 12:12 PM

My name so cool
******
Senior Member
1,667 posts

Joined: Jan 2003
From: The Cool Name Place

QUOTE(zerorating @ Nov 23 2010, 12:07 PM)
Optimizations are normal as long as the degraded IQ is not visible. If users do notice, they can always turn off Catalyst AI (at the cost of FPS), which is something that cannot be turned off on NVIDIA cards.
NVIDIA was also caught cheating before.
*
Yes, optimization is normal. AMD does provide the option to turn off Catalyst AI, but in benchmarks the testers usually don't change the control panel settings. That results in higher benchmark numbers, no? If that's acceptable, then AMD and NVIDIA should both ship the lowest quality as the default setting so their performance looks its best.
zerorating
post Nov 23 2010, 12:19 PM

Miskin Adab
*****
Senior Member
975 posts

Joined: Aug 2007
From: Lokap Polis


QUOTE(Demonic Wrath @ Nov 23 2010, 12:12 PM)
Yes, optimization is normal. AMD does provide the option to turn off Catalyst AI, but in benchmarks the testers usually don't change the control panel settings. That results in higher benchmark numbers, no? If that's acceptable, then AMD and NVIDIA should both ship the lowest quality as the default setting so their performance looks its best.
*
How much FPS difference do you get when selecting different IQ settings? Yup, I'm giving a thumbs up to NVIDIA here, since they default to the Quality setting when the driver is installed.
Some stated that ComputerBase and PCGH used the 'High' Catalyst AI setting instead of the default 'Quality' setting.
Don't forget NVIDIA cheated in the 3DMark CPU test by running PhysX on the GPU, hence a higher 3DMark score.

This post has been edited by zerorating: Nov 23 2010, 12:26 PM
DarkSilver
post Nov 23 2010, 12:36 PM

Idiosyncrasy
Group Icon
Elite
10,501 posts

Joined: Oct 2009
From: Tamriel


QUOTE(zerorating @ Nov 23 2010, 12:19 PM)
How much FPS difference do you get when selecting different IQ settings? Yup, I'm giving a thumbs up to NVIDIA here, since they default to the Quality setting when the driver is installed.
Some stated that ComputerBase and PCGH used the 'High' Catalyst AI setting instead of the default 'Quality' setting.
Don't forget NVIDIA cheated in the 3DMark CPU test by running PhysX on the GPU, hence a higher 3DMark score.
*
That's why most 3DMark Vantage competitions have forbidden the use of an NVIDIA GPU for the PhysX test.
The score would be like 2x or 3x more than usual.

This post has been edited by DarkSilver: Nov 23 2010, 12:36 PM
zerorating
post Nov 23 2010, 12:40 PM

Miskin Adab
*****
Senior Member
975 posts

Joined: Aug 2007
From: Lokap Polis


QUOTE(DarkSilver @ Nov 23 2010, 12:36 PM)
That's why most 3DMark Vantage competitions have forbidden the use of an NVIDIA GPU for the PhysX test.
The score would be like 2x or 3x more than usual.
*
For the CPU test, it's supposed to bench the CPU, not the GPU.

This post has been edited by zerorating: Nov 23 2010, 12:41 PM
