AMD Radeon™ Discussion V4, Cut Cut Cut!~ Go Red !
Nov 15 2010, 01:25 AM
#21
Senior Member
1,667 posts Joined: Jan 2003 From: The Cool Name Place
No, Crysis 2's graphics don't suck compared to Crysis 1's. The engine is just much more scalable across various rigs; at max settings it would look nicer.
Nov 18 2010, 12:48 PM
#22
Doesn't Catalyst have a specific profile for every application, e.g. turning off MLAA for WLM, and so on?
MLAA basically blurs the whole picture with a post-process filter, something like smudging in Photoshop.
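The "post-process filter" idea above can be sketched in a few lines. This is only a hedged toy illustration, not AMD's actual MLAA (real MLAA classifies edge shapes and computes blend weights per pattern); the `smooth_edges` function and its threshold are made up for demonstration.

```python
# Toy post-process smoothing on a single grayscale scanline: blend each pixel
# with its horizontal neighbours wherever a hard intensity jump (an aliased
# edge) is detected. Everything downstream of rendering, as the post says.

def smooth_edges(row, threshold=64):
    out = list(row)
    for i in range(1, len(row) - 1):
        # A large jump in intensity marks a candidate edge pixel.
        left_jump = abs(row[i] - row[i - 1])
        right_jump = abs(row[i + 1] - row[i])
        if left_jump > threshold or right_jump > threshold:
            # 1-2-1 weighted blend smears the hard step, like a smudge tool.
            out[i] = (row[i - 1] + 2 * row[i] + row[i + 1]) // 4
    return out

scanline = [0, 0, 255, 255, 255]
print(smooth_edges(scanline))  # the hard 0 -> 255 step becomes a gradient
```

Pixels far from any edge are left untouched, which is why this class of filter is cheap compared to supersampling.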
Nov 18 2010, 01:31 PM
#23
It's already been updated to 10.11 on the AMD website.
Win x86 | Win x64
The following performance improvements were observed with this release of AMD Catalyst 10.11:
This post has been edited by Demonic Wrath: Nov 18 2010, 01:33 PM
Nov 18 2010, 05:25 PM
#24
Use the drivers provided by AMD, version 10.10e.
Nov 19 2010, 01:46 PM
#25
Okay, given that Metro 2033 is optimized for CPU PhysX, I wonder how a dual-core or quad-core would perform running PhysX on the CPU. That article shows a hexacore, and I doubt many people here are using one. Even if the developer multithreads the PhysX code heavily to make use of multicore processors, a dual-core or quad-core still can't really compete with a GPU at PhysX.
If optimized, a 9600GT would most likely perform on par with an X6 1060T.
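A rough way to see why adding a few CPU cores doesn't close the gap is Amdahl's law. This is a hedged back-of-envelope sketch: the parallel fraction `p` and the core counts are assumptions chosen for illustration, not measurements of PhysX or any real chip.

```python
# Amdahl's law: if a fraction p of the work parallelizes perfectly across
# n cores, overall speedup is capped by the serial remainder (1 - p).

def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

p = 0.95  # assume 95% of the physics work is parallelizable (made up)
for n in (2, 4, 6, 192):  # dual, quad, hexacore vs. a GPU-like core count
    print(f"{n:>3} cores: {speedup(p, n):.2f}x")
```

Even with a generously high parallel fraction, going from 2 to 6 cores only roughly doubles throughput, while a many-core device pulls far ahead; that is the shape of the argument in the post, not a claim about specific hardware.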
Nov 21 2010, 07:45 PM
#26
Nov 21 2010, 10:56 PM
#27
Nov 22 2010, 10:42 PM
#28
QUOTE(KLlee @ Nov 22 2010, 10:37 PM)
No, it won't. If you run into a CPU bottleneck, it's most probably already at around 80 FPS or more. It also depends on your monitor resolution: if you're running 2560x1600, you'll rarely run into a CPU bottleneck with a HD6870.
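The resolution point can be made concrete with a toy frame-time model: each frame is limited by whichever of the CPU or GPU takes longer, and only the GPU cost grows with pixel count. All numbers here are invented for illustration, not benchmarks of a HD6870 or any real card.

```python
# Toy model: CPU work per frame is roughly resolution-independent, while GPU
# work scales with the number of pixels. FPS is set by the slower of the two.

def fps(cpu_ms, gpu_ms_per_mpix, width, height):
    mpix = width * height / 1e6
    frame_ms = max(cpu_ms, gpu_ms_per_mpix * mpix)
    return 1000.0 / frame_ms

# Hypothetical rig: 10 ms of CPU work, 4 ms of GPU work per megapixel.
low = fps(10.0, 4.0, 1280, 720)    # CPU finishes last -> CPU-bound
high = fps(10.0, 4.0, 2560, 1600)  # GPU cost dominates -> GPU-bound
print(low, high)
```

At the low resolution the GPU is idle part of the frame, so a faster GPU changes nothing; at 2560x1600 the GPU term dominates and the CPU bottleneck disappears, which is the post's point.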
Nov 23 2010, 11:54 AM
#29
Uh-oh... someone's got some explaining to do. Looks like AMD is downgrading the IQ for better performance.
Testing NVIDIA vs AMD Image Quality
Source: AMD Radeon HD6870 test
Nov 23 2010, 12:12 PM
#30
QUOTE(zerorating @ Nov 23 2010, 12:07 PM)
Optimizations are normal as long as the worse IQ is not visible. If users see it, they can always turn off Catalyst AI (lower FPS), which is impossible to turn off on an NVIDIA card. NVIDIA was also caught cheating before.
Yes, optimization is normal. AMD does provide the option to turn off Catalyst AI, but in benchmarks the testers usually do not change the control panel settings. This results in higher benchmark performance, no? If it's okay to be like that, then AMD and NVIDIA should both make the lowest quality their default settings so their performance is at its best.
Nov 23 2010, 12:53 PM
#31
QUOTE(zerorating @ Nov 23 2010, 12:19 PM)
How many FPS difference do you get when selecting different IQ settings? Yup, I'm giving a thumbs up to NVIDIA, since they use the Quality setting first when the driver is installed. Some stated that ComputerBase and PCGH used the 'high' setting of Catalyst AI compared to the default 'quality' setting. Don't forget NVIDIA lied on the 3DMark CPU test by running PhysX on the GPU, hence the higher 3DMark score.
I've tested in benchmarks and got around a 5% improvement going from Quality to High Performance [NVIDIA CP].
Dec 29 2010, 12:32 AM
#32
QUOTE(DarkSilver @ Dec 27 2010, 09:03 PM)
Currently, Radeon HD6800-Series CrossFireX scaling is the BEST. Radeon HD6900-Series is not as good as Radeon HD6800-Series. Based on reviews, 2x Radeon HD6870 CFX is on par with 2x GeForce GTX580 SLI. This is because the GeForce GTX580 has very poor SLI scaling.
HD6870 CFX is on par with GTX 580 SLI? Which review is that?

Added on December 29, 2010, 12:33 am
QUOTE(saturn85 @ Dec 28 2010, 11:40 PM)
Slap a HD6970 sticker on it and voila, HD6970.
This post has been edited by Demonic Wrath: Dec 29 2010, 12:33 AM
Topic Closed