
 AMD Radeon™ Discussion V4, Cut Cut Cut!~ Go Red !

Demonic Wrath
post Nov 15 2010, 01:25 AM

No, Crysis 2's graphics don't suck compared to Crysis 1. It's just much more scalable across various rigs. At max settings, it'd look nicer.
Demonic Wrath
post Nov 18 2010, 12:48 PM

Doesn't Catalyst have a specific profile for every application, e.g. turning off MLAA for WLM, and so on?

MLAA is basically blurring the whole picture with a post-process filter, something like smudging in Photoshop.
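Roughly what that means, as a minimal sketch (not AMD's actual MLAA implementation, just the general post-process idea of finding edges in the finished frame and blending along them):

```python
import numpy as np

def fake_mlaa(frame, threshold=0.1):
    """Toy 'MLAA-like' pass on a grayscale frame (H x W floats in 0..1).

    Real MLAA classifies edge shapes and blends along them; this sketch just
    detects luminance edges and averages each edge pixel with its neighbours,
    which is why aggressive post-process AA can read as a mild blur.
    """
    # luminance differences with the left and top neighbours
    dx = np.abs(np.diff(frame, axis=1, prepend=frame[:, :1]))
    dy = np.abs(np.diff(frame, axis=0, prepend=frame[:1, :]))
    edges = (dx > threshold) | (dy > threshold)

    # 3x3 box blur of the whole frame
    h, w = frame.shape
    padded = np.pad(frame, 1, mode="edge")
    blurred = sum(padded[i:i + h, j:j + w]
                  for i in range(3) for j in range(3)) / 9.0

    # only touch pixels that sit on an edge; flat areas stay sharp
    out = frame.copy()
    out[edges] = blurred[edges]
    return out

# toy usage: a hard vertical edge gets softened, flat areas are untouched
img = np.zeros((4, 6)); img[:, 3:] = 1.0
print(fake_mlaa(img))
```

Since every edge pixel gets averaged with its neighbours, the net effect is the slight "smudging" described above.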
Demonic Wrath
post Nov 18 2010, 01:31 PM

It's already updated to 10.11 on the AMD website.

Win x86
Win x64

Performance Improvements:-
The following performance improvements were observed with this release of AMD Catalyst 10.11:
  • Battleforge™ : Performance increases up to 3% on ATI Radeon™ HD 5800 Series single and CrossFire configurations with anti-aliasing disabled.
  • STALKER – Call of Pripyat™ benchmark: Performance increases up to 5% on ATI Radeon™ HD 5800 Series single and CrossFire configurations


This post has been edited by Demonic Wrath: Nov 18 2010, 01:33 PM
Demonic Wrath
post Nov 18 2010, 05:25 PM

Use the drivers provided by AMD. Version 10.10e.
Demonic Wrath
post Nov 19 2010, 01:46 PM

Okay, given that Metro 2033 is optimized for CPU PhysX, I wonder how a dual-core or quad-core would perform using CPU PhysX. That article shows a hexa-core, and I doubt many people here are using one. Even if the developer multithreads the PhysX code heavily and makes use of multicore processors, a dual-core or quad-core still can't really compete with a GPU at PhysX.

If optimized, most likely a 9600GT will perform on par with an X6 1060T.
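As a back-of-envelope sketch of why (every figure below is an illustrative assumption, not a measurement): even perfectly multithreaded PhysX scales only with the number of CPU cores, while even an old GPU like the 9600 GT throws 64 shader lanes at the same embarrassingly parallel work.

```python
# Idealised throughput model: perfectly parallel particle updates.
# The per-lane rates are made-up numbers chosen only to show the shape
# of the argument (few fast CPU cores vs. many slower GPU lanes).
def particles_per_second(lanes, updates_per_lane_per_sec):
    return lanes * updates_per_lane_per_sec

dual_core  = particles_per_second(lanes=2,  updates_per_lane_per_sec=5_000_000)
quad_core  = particles_per_second(lanes=4,  updates_per_lane_per_sec=5_000_000)
hexa_core  = particles_per_second(lanes=6,  updates_per_lane_per_sec=5_000_000)
gpu_9600gt = particles_per_second(lanes=64, updates_per_lane_per_sec=1_000_000)

for name, rate in [("dual core", dual_core), ("quad core", quad_core),
                   ("hexa core", hexa_core), ("9600 GT-class GPU", gpu_9600gt)]:
    print(f"{name:18s} ~{rate / 1e6:4.0f} M particle updates/s")
```

Under these assumed numbers the hexa-core only just approaches the old GPU, and a dual- or quad-core falls well short, which is the point being made above.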
Demonic Wrath
post Nov 21 2010, 07:45 PM

QUOTE(leyley @ Nov 21 2010, 05:27 PM)
Which NVIDIA card goes against the ATI Mobility 5830?
*
The GTX 260M or 9800M GTX. If it's the 300 series, then the GTS 360M (higher performance than the Mobility HD 5830).
Demonic Wrath
post Nov 21 2010, 10:56 PM

QUOTE(41LY45 @ Nov 21 2010, 09:50 PM)
What about the GT 435M?
*
The Mobility HD 5830 is a lot faster than the GT 435M, if I'm not mistaken, since the GT 435M is slower than the 9800M GTX.
Demonic Wrath
post Nov 22 2010, 10:42 PM

QUOTE(KLlee @ Nov 22 2010, 10:37 PM)
BTW guys, will an AMD X3 3.0GHz paired with a 6870 be too underpowered for the graphics card? If yes, which AMD CPU do you recommend?
*
No, it won't. If you run into a CPU bottleneck, most probably you're already at around 80 FPS or more. It also depends on your monitor resolution: if you're running 2560x1600, you'll rarely run into a CPU bottleneck with an HD 6870.
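A quick way to sanity-check this yourself (a minimal sketch; the FPS numbers are made-up placeholders, not benchmark results): run the same scene at a low and a high resolution, and see how much the frame rate moves.

```python
def likely_bottleneck(fps_low_res, fps_high_res, tolerance=0.10):
    """Crude heuristic: if FPS barely changes with resolution, the CPU is
    probably the limit; if it drops sharply, the GPU is doing the limiting."""
    drop = (fps_low_res - fps_high_res) / fps_low_res
    return "CPU-bound" if drop < tolerance else "GPU-bound"

# hypothetical readings from the same scene
print(likely_bottleneck(fps_low_res=82, fps_high_res=79))  # -> CPU-bound
print(likely_bottleneck(fps_low_res=82, fps_high_res=45))  # -> GPU-bound
```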
Demonic Wrath
post Nov 23 2010, 11:54 AM

Uh-oh... someone's got some explaining to do. Looks like AMD is downgrading IQ for better performance.
Testing NVIDIA vs AMD Image Quality
Source: AMD Radeon HD6870 test
Demonic Wrath
post Nov 23 2010, 12:12 PM

QUOTE(zerorating @ Nov 23 2010, 12:07 PM)
Optimizations are normal as long as the worse IQ isn't visible. If users notice it, they can always turn off Catalyst AI (lower FPS), which is impossible to turn off on an NVIDIA card.
NVIDIA was also caught cheating before :D
*
Yes, optimization is normal. AMD does provide the option to turn off Catalyst AI, but testers usually don't change the control panel settings in benchmarks. That results in higher performance in benchmarks, no? If that's okay, then AMD and NVIDIA should both make the lowest quality the default setting so their performance is at its best.
Demonic Wrath
post Nov 23 2010, 12:53 PM

QUOTE(zerorating @ Nov 23 2010, 12:19 PM)
How many FPS of difference do you get when selecting different IQ settings? Yup, I'm giving NVIDIA a thumbs up, since they use the Quality setting by default when the driver is installed.
Some stated that ComputerBase and PCGH used the 'High' setting of Catalyst AI compared to the default 'Quality' setting.
Don't forget NVIDIA lied on the 3DMark CPU test by running PhysX on the GPU, hence a higher 3DMark score.
*
I've tested in benchmarks and got around a 5% improvement going from Quality to High Performance [NVIDIA CP].
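For perspective on why that matters in reviews (hypothetical scores, just to show the arithmetic): a gain of that size from a lower-quality driver default is enough to flip a close comparison when testers leave the control panel at stock settings.

```python
# Hypothetical scores: two cards within a few percent of each other.
card_a = 60.0   # FPS at a high-quality driver default
card_b = 61.5   # FPS for the competitor

card_a_optimized = card_a * 1.05   # the ~5% gain from a lower-IQ default
print(f"Default IQ : A {card_a:.1f} vs B {card_b:.1f}")            # B wins
print(f"Lowered IQ : A {card_a_optimized:.1f} vs B {card_b:.1f}")  # A wins
```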
Demonic Wrath
post Dec 29 2010, 12:32 AM

QUOTE(DarkSilver @ Dec 27 2010, 09:03 PM)
Currently, Radeon HD 6800-series CrossFireX scaling is the BEST.
The Radeon HD 6900-series is not as good as the Radeon HD 6800-series.

Based on reviews,
2x Radeon HD 6870 CFX is on par with 2x GeForce GTX 580 SLI.
This is because the GeForce GTX 580 has very poor SLI scaling.
*
HD 6870 CFX is on par with GTX 580 SLI? Which review is that?


Added on December 29, 2010, 12:33 am
QUOTE(saturn85 @ Dec 28 2010, 11:40 PM)
If the HD 6950 is successfully unlocked to 1536 SPs and clocked to HD 6970 speeds,
it will perform like an HD 6970.
*
Slap an HD 6970 sticker on it and voila, an HD 6970.

This post has been edited by Demonic Wrath: Dec 29 2010, 12:33 AM
