 AMD Radeon™ Discussion V4, Cut Cut Cut!~ Go Red !

Demonic Wrath
post Nov 23 2010, 12:53 PM

My name so cool
******
Senior Member
1,667 posts

Joined: Jan 2003
From: The Cool Name Place

QUOTE(zerorating @ Nov 23 2010, 12:19 PM)
How much fps difference do you get when selecting the different IQ settings? Yup, I'm giving nvidia a thumbs up, since they use the Quality setting by default when the driver is installed.
Some have stated that ComputerBase and PCGH used the 'High' Catalyst AI setting, compared against nvidia's default 'Quality' setting.
Don't forget nvidia cheated on the 3DMark CPU test by running PhysX on the GPU, hence the higher 3DMark score.
*
I've tested in benchmarks and got around 5% improvement going from Quality to High Performance [NVIDIA CP].
zerorating
post Nov 23 2010, 02:24 PM

Miskin Adab
*****
Senior Member
975 posts

Joined: Aug 2007
From: Lokap Polis


QUOTE(Demonic Wrath @ Nov 23 2010, 12:53 PM)
I've tested in benchmarks and got around 5% improvement going from Quality to High Performance [NVIDIA CP].
*
Just recently I changed the texture quality setting (on an 8600M GT) in Resident Evil (fixed benchmark):
Quality - 46.1 fps
Performance - 46.3 fps
High Performance - 46.6 fps
That's not really 5%. Are you using the quality/performance slider from "Adjust image settings with preview", and not Texture filtering - Quality under "Manage 3D settings" in the nvidia CP?
Sorry, off topic - mods, please remove this if irrelevant.

This post has been edited by zerorating: Nov 23 2010, 02:30 PM
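
For what it's worth, the gap in those numbers is about 1%, not 5%. A minimal Python sketch, just plugging in the figures from the post above:

[code]
# FPS figures quoted above (8600M GT, Resident Evil fixed benchmark).
baseline = 46.1  # Quality preset
results = {"Performance": 46.3, "High Performance": 46.6}

for name, fps in results.items():
    gain = (fps - baseline) / baseline * 100
    print(f"{name}: {fps} fps ({gain:+.1f}% vs Quality)")

# Output: Performance comes out at +0.4%, High Performance at +1.1%.
[/code]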
DarkSilver
post Nov 23 2010, 02:39 PM

Idiosyncrasy
Group Icon
Elite
10,501 posts

Joined: Oct 2009
From: Tamriel


But I can't notice any graphics difference (by eye) between High Performance and Quality/High Quality on an Nvidia GPU.
So setting it to High Performance is better.
Similarly for AMD/ATI, there's no significant graphics difference between High Performance and High Quality either.

AMD and Nvidia each have different optimizations for every game.
Some games favour Nvidia and some AMD, and the same goes for graphics quality.
zerorating
post Nov 23 2010, 02:42 PM

Miskin Adab
*****
Senior Member
975 posts

Joined: Aug 2007
From: Lokap Polis


QUOTE(DarkSilver @ Nov 23 2010, 02:39 PM)
But I can't notice any graphics difference (by eye) between High Performance and Quality/High Quality on an Nvidia GPU.
So setting it to High Performance is better.
Similarly for AMD/ATI, there's no significant graphics difference between High Performance and High Quality either.

AMD and Nvidia each have different optimizations for every game.
Some games favour Nvidia and some AMD, and the same goes for graphics quality.
*
Same as me - when playing a game, we're looking at moving images, not a static image.

This post has been edited by zerorating: Nov 23 2010, 02:42 PM
TiF
post Nov 23 2010, 03:56 PM

Regular
******
Senior Member
1,192 posts

Joined: Dec 2009


I always like to set my games' graphics settings to the highest possible while maintaining a playable frame rate.
Crysis and Warhead maxed out with 8xAA give me smooth gameplay, so I don't sacrifice anything.
But Metro 2033 with everything maxed (DX11, Very High, tessellation on, Advanced DoF on, MSAA) is really crazy, which is why I turn off Advanced DoF - that boosts the frame rate drastically...
powerfox
post Nov 23 2010, 07:34 PM

Casual
***
Junior Member
427 posts

Joined: Jul 2008

Phew, finally revived my artifacting HD3850 by applying new thermal paste =.=
Anyway, can the artifact problem happen again after it's been revived from the dead?
I don't want it to fail on me again soon -.-

This post has been edited by powerfox: Nov 24 2010, 12:16 AM
Kr0ll3R
post Nov 24 2010, 02:00 AM

I've Been Permanently Banned
******
Senior Member
1,049 posts

Joined: Apr 2007
From: 192.168.1.1



QUOTE(powerfox @ Nov 23 2010, 07:34 PM)
Phew, finally revived my artifacting HD3850 by applying new thermal paste =.=
Anyway, can the artifact problem happen again after it's been revived from the dead?
I don't want it to fail on me again soon -.-
*
Probably due to overheating,
since it works again now that you've applied new thermal paste.
powerfox
post Nov 24 2010, 02:25 AM

Casual
***
Junior Member
427 posts

Joined: Jul 2008

QUOTE(Kr0ll3R @ Nov 24 2010, 02:00 AM)
Probably due to overheating,
since it works again now that you've applied new thermal paste.
*
Maybe. Now the temperature is 49°C idle and 56°C under full load after I applied the thermal paste...
Before my GPU died it was 62°C idle and 95°C under full load.

wildwestgoh
post Nov 24 2010, 12:20 PM

Look at all my stars!!
*******
Senior Member
2,215 posts

Joined: Jul 2005


QUOTE(powerfox @ Nov 23 2010, 07:34 PM)
Phew, finally revived my artifacting HD3850 by applying new thermal paste =.=
Anyway, can the artifact problem happen again after it's been revived from the dead?
I don't want it to fail on me again soon -.-
*
Depends on the condition. Artifacts might be caused by overheating (your previous issue) or by bad video RAM (unrepairable). You can try FurMark to see if it fails on you again.
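
A minimal sketch of what "see if it fails you again" can look like in practice: log the GPU temperature while FurMark runs and bail out if it climbs into the danger zone. The "aticonfig --odgt" readout is an assumption (Linux Catalyst drivers); substitute whatever sensor tool your setup actually has (GPU-Z logging, etc.):

[code]
# Temperature logger to run alongside a FurMark burn-in (Python sketch).
# Assumes `aticonfig --odgt` prints a line like "Temperature - 78.50 C";
# swap in your own sensor readout if that tool isn't available.
import re, subprocess, time

def gpu_temp_c():
    out = subprocess.run(["aticonfig", "--odgt"],
                         capture_output=True, text=True).stdout
    m = re.search(r"Temperature\s*-\s*([\d.]+)", out)
    return float(m.group(1)) if m else None

for _ in range(360):  # one sample every 5 s for ~30 minutes
    t = gpu_temp_c()
    print(f"GPU temp: {t} C")
    if t is not None and t > 95:
        print("Too hot - stop the test and check the cooler!")
        break
    time.sleep(5)
[/code]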
xen0
post Nov 24 2010, 03:32 PM

ismi..alif..lam..ya..fa
******
Senior Member
1,486 posts

Joined: Jun 2005
From: Cyberjaya/Kamunting


If the artifacts reappear, bake it. An old card is worth baking - there are many success stories out there...
zerorating
post Nov 24 2010, 04:48 PM

Miskin Adab
*****
Senior Member
975 posts

Joined: Aug 2007
From: Lokap Polis


From the Cayman architecture:
[Cayman architecture block diagram]
The Cayman architecture has a dual graphics engine - does that mean graphics cards have entered a real multicore era (considering the shader processors as threads rather than cores)?

p.s. - some sites have declared the Antilles and Cayman slides with specs to be fake, oh well

This post has been edited by zerorating: Nov 24 2010, 08:59 PM
saturn85
post Nov 24 2010, 08:01 PM

Folding@home
*******
Senior Member
8,686 posts

Joined: Mar 2009



QUOTE(powerfox @ Nov 24 2010, 02:25 AM)
Maybe. Now the temperature is 49°C idle and 56°C under full load after I applied the thermal paste...
Before my GPU died it was 62°C idle and 95°C under full load.
*
Wow, huge difference between before and after.
I think I have to do something about my card too.
jameslee84
post Nov 24 2010, 08:07 PM

Getting Started
**
Junior Member
105 posts

Joined: May 2009
Guys, I just installed an Arctic Xtreme Plus on my 5870..
Well, first time OCing a GPU..
Default:
850 core / 1200 memory..

So my question..
which would you guys increase - core or memory?

I just did mine at 950 / 1300, tested with FurMark (8x) for 30 minutes; temps are working fine at 53°C max, 35°C idle. But is that really okay for this OC?
Would any pro like to share their experience with me?
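
The usual workflow is to raise one clock at a time in small steps and stress-test each step before going further. A rough Python sketch of that loop; set_clocks() and stress_test_passes() are hypothetical stand-ins for what you'd actually do by hand in Catalyst Overdrive or Afterburner plus a FurMark run:

[code]
# Step-and-test overclocking loop (sketch only; helpers are stubs).
STOCK_CORE, STOCK_MEM = 850, 1200  # HD 5870 defaults (MHz)

def set_clocks(core_mhz, mem_mhz):
    # Hypothetical: apply clocks with your OC tool of choice.
    raise NotImplementedError

def stress_test_passes():
    # Hypothetical: run FurMark ~30 min, fail on artifacts or a crash.
    raise NotImplementedError

best = STOCK_CORE
for core in range(STOCK_CORE, 951, 25):  # 850 -> 950 in 25 MHz steps
    set_clocks(core, STOCK_MEM)          # leave memory at stock for now
    if not stress_test_passes():
        break                            # last value of `best` was stable
    best = core
print(f"Highest stable core: {best} MHz - now repeat the loop for memory.")
[/code]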
Kaellis
post Nov 24 2010, 09:19 PM

Certified Perodua Alza hater
******
Senior Member
1,231 posts

Joined: Aug 2005
From: Shah Alam
Core is more important than memory; nice temps.

If you touch vcore, the temps will go up a lot; at stock voltage, not that much.
jameslee84
post Nov 24 2010, 09:33 PM

Getting Started
**
Junior Member
105 posts

Joined: May 2009
QUOTE(Kaellis @ Nov 24 2010, 09:19 PM)
Core is more important than memory; nice temps.

If you touch vcore, the temps will go up a lot; at stock voltage, not that much.
*
Yeah, I just noticed it... I lowered the memory to 1250 and tested for over an hour; the temp stayed at only 50°C.

So which is actually more important, the core or the memory?
TDUEnthusiast
post Nov 24 2010, 09:38 PM

Critical thinking
Group Icon
Elite
10,015 posts

Joined: Mar 2009
From: the future
QUOTE(jameslee84 @ Nov 24 2010, 09:33 PM)
Yeah, I just noticed it... I lowered the memory to 1250 and tested for over an hour; the temp stayed at only 50°C.

So which is actually more important, the core or the memory?
*
The higher the core and shader speed, the better; the memory clock doesn't matter too much - just a small improvement.
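
As a rough upper bound, fps scales at best linearly with whichever clock the game is actually bound by, so the two overclocks discussed above cap out at quite different gains (real-world gains are lower). A minimal sketch using jameslee84's numbers:

[code]
# Best-case scaling estimate for the 5870 OC above (linear-scaling assumption).
stock_core, oc_core = 850, 950    # MHz
stock_mem, oc_mem = 1200, 1300    # MHz

core_gain = (oc_core / stock_core - 1) * 100  # ~11.8%
mem_gain = (oc_mem / stock_mem - 1) * 100     # ~8.3%

print(f"Core 850 -> 950 MHz: up to ~{core_gain:.1f}% if core-bound")
print(f"Mem 1200 -> 1300 MHz: up to ~{mem_gain:.1f}% if bandwidth-bound")
# Which clock helps more depends on where the game is bottlenecked.
[/code]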
DarkSilver
post Nov 24 2010, 09:43 PM

Idiosyncrasy
Group Icon
Elite
10,501 posts

Joined: Oct 2009
From: Tamriel


QUOTE(TDUEnthusiast @ Nov 24 2010, 09:38 PM)
The higher the core and shader speed, the better; the memory clock doesn't matter too much - just a small improvement.
*
Not really - memory and core clocks are the main keys to OCing.
Shader clocks are not, because they're usually tied to the core clock.
jameslee84
post Nov 24 2010, 09:47 PM

Getting Started
**
Junior Member
105 posts

Joined: May 2009
QUOTE(TDUEnthusiast @ Nov 24 2010, 09:38 PM)
The higher the core and shader speed, the better; the memory clock doesn't matter too much - just a small improvement.
*
Then I think going back to the default 1200 is enough...


Added on November 24, 2010, 9:49 pm
QUOTE(DarkSilver @ Nov 24 2010, 09:43 PM)
Not really - memory and core clocks are the main keys to OCing.
Shader clocks are not, because they're usually tied to the core clock.
*
Erm... do I need to balance them up?

This post has been edited by jameslee84: Nov 24 2010, 09:49 PM
TSAMDAthlon
post Nov 24 2010, 09:52 PM

The future is Fusion
*******
Senior Member
5,221 posts

Joined: Aug 2007
From: Deneb star


More info on Cayman unveiled

http://www.brightsideofnews.com/news/2010/...e-unveiled.aspx
tech3910
post Nov 24 2010, 11:29 PM

Anonymous
*******
Senior Member
5,644 posts

Joined: Feb 2008
From: Heaven to HELL


QUOTE(DarkSilver @ Nov 24 2010, 09:43 PM)
Not really, Memory and Core clocks are the MAIN KEY of OCing.
Shader Clocks are not, because usually, it's stick to Core Clock.
*
That applies to nvidia cards.
On ati cards, the shader runs at the same speed as the core anyway......
