 AMD Radeon™ Discussion V9, Latest - 13.11 Beta 9.5 | WHQL - 13.10

TSAcid_RuleZz
post Oct 20 2013, 10:45 PM

ミウ ❤
*******
Senior Member
6,612 posts

Joined: Jan 2003
From: Tomorrow


QUOTE(azsace @ Oct 20 2013, 09:46 PM)
The problem with VSync: if your GPU renders fewer than 57fps, it pulls the rate down to 30fps, so GPU resources are wasted because the additional 27 frames are discarded. With proper frame capping, your GPU limits its own frame rate so frame processing can be optimised; this reduces frame-time variance as well as screen tearing.
Previously I was using a 7950. In Metro I couldn't maintain 60fps, so I just capped the frame rate to 48fps and got smooth gameplay.
*
Dynamic VSync disables VSync when your FPS is under 59, so there is no hard frame drop to 30fps, and coupled with triple buffering the input lag is practically non-existent.
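The idea behind frame capping is simple: after rendering each frame, sleep out whatever is left of a fixed per-frame time budget so the GPU never races ahead of the cap. A rough sketch, not any game's or driver's actual limiter; `run_frames` and the 5 ms toy workload are made up for illustration:

```python
import time

def run_frames(render, cap_fps=48, duration_s=0.5):
    """Call render() repeatedly, sleeping so we never exceed cap_fps."""
    target = 1.0 / cap_fps          # minimum time budget per frame
    frame_times = []
    start = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        t0 = time.perf_counter()
        render()                     # draw one frame
        elapsed = time.perf_counter() - t0
        if elapsed < target:         # finished early: wait out the budget
            time.sleep(target - elapsed)
        frame_times.append(time.perf_counter() - t0)
    return frame_times

# toy "render" that takes ~5 ms, far faster than the 48fps budget allows
times = run_frames(lambda: time.sleep(0.005), cap_fps=48)
fps = len(times) / sum(times)
print(f"capped: {fps:.0f} fps")   # stays near the cap, never far above it
```

Because the sleep pads every fast frame up to the same budget, frame-to-frame variance drops, which is the smoothness azsace describes at a 48fps cap.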

QUOTE(Lau Pan @ Oct 20 2013, 09:48 PM)
G-Sync reminds me of PhysX: you need to buy extra hardware to support it.
$.$....

I still prefer using an LED TV as my monitor...
If I had extra money, I would buy a 60" LED TV for gaming, movies and anime ><""
*
As always, early adopters need to pay a premium.

Asus announced the first G-SYNC-ready monitor at $399, with the G-SYNC kit at $175. 1080p, TN panel and 24" for $399, and that is the US price without taxes. It will surely be around $500 by the time it reaches local shores.

Sauce: ASUS Announces Adoption of NVIDIA G-SYNC Technology

TSAcid_RuleZz
post Oct 20 2013, 11:18 PM



QUOTE(sai86 @ Oct 20 2013, 11:10 PM)
OMG, the heck with that price. Isn't a Dell U2713HM a better buy? Though I don't quite understand the advantage of G-Sync; as a user, it's not really wallet-friendly to me.
*
Totally, and I'm done with TN panels.

The advantage is that there will be no jerkiness/stutter, input lag or tearing even when the FPS is below/above your typical 60fps. It doesn't work below 30fps, though.
TSAcid_RuleZz
post Oct 21 2013, 12:00 AM



QUOTE(azsace @ Oct 20 2013, 11:25 PM)
Yep, totally agree with you. With dynamic/adaptive VSync, when your FPS is under 59 it won't pull the frame rate down to 30fps, as VSync is disabled. However, screen tearing can still happen if the frame-to-frame variance is high.
*
Yeah, but so far I only get that in Far Cry 3 and rFactor 2. No tearing in Crysis 3 even though the FPS is under 60.
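The frame-variance point above can be made concrete: two captures can have the same average FPS yet feel completely different, because one has even pacing and the other swings wildly between fast and slow frames. A minimal sketch with invented frame times (milliseconds):

```python
from statistics import mean, pstdev

# hypothetical frame-time captures, both averaging ~17 ms (~59 fps)
steady   = [16.9, 17.1, 16.8, 17.2, 17.0, 16.9]   # even pacing
variable = [8.0, 26.0, 9.0, 25.0, 8.0, 26.0]       # same average, big swings

for name, ft in (("steady", steady), ("variable", variable)):
    avg_fps = 1000.0 / mean(ft)
    # standard deviation of frame times is a crude "variance" measure
    print(f"{name}: {avg_fps:.0f} fps average, frame-time stdev {pstdev(ft):.1f} ms")
```

An FPS counter reports both runs as roughly 59fps, but the `variable` run is the one where tearing and judder become visible, which is why frame capping helps even when the average already looks fine.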

This post has been edited by Acid_RuleZz: Oct 21 2013, 12:00 AM
TSAcid_RuleZz
post Oct 21 2013, 12:19 PM



QUOTE(alexei @ Oct 21 2013, 12:11 PM)
G-Sync: once you have it, you know it.
*
+1. You need to experience it live; a YouTube video isn't good enough to show the difference.
TSAcid_RuleZz
post Oct 21 2013, 04:42 PM



^ Well, AMD said the 290X was meant to compete with the GTX 780.
TSAcid_RuleZz
post Oct 22 2013, 09:36 AM



GTX 780 Ti specs leaked. I'd say it will beat or be on par with the 290X out of the box. The question now is the price.


Sauce: http://www.chiphell.com/thread-880923-1-1.html
TSAcid_RuleZz
post Oct 22 2013, 12:55 PM



QUOTE(Currylaksa @ Oct 22 2013, 11:24 AM)
Titan will still be faster if you do compute work, or use CUDA-supported programs like Blender
*
Titan's GPGPU highlight is only double-precision floating point; otherwise it's just meh. Even the 7970GE is faster than a Titan in many GPGPU tests.
TSAcid_RuleZz
post Oct 22 2013, 02:16 PM



QUOTE(Currylaksa @ Oct 22 2013, 01:51 PM)
Unfortunately, compute applications use CUDA thanks to its mature API and documentation.
*
Mind showing me which CUDA software has a significant advantage on a Titan but not a 780? By significant I mean a 30%-50% difference.

This post has been edited by Acid_RuleZz: Oct 22 2013, 02:17 PM
TSAcid_RuleZz
post Oct 22 2013, 02:45 PM



QUOTE(sasaug @ Oct 22 2013, 02:28 PM)
Asus R9 280X DirectCU II TOP @ 1180/1700MHz, up from 1070/1600. Stock voltage, which is 1.186V for mine.

Previously I didn't post benchmarks because of my CPU bottleneck; I lost about 10FPS from these results. Just got my FX-8350 today, stock voltage, clocked it up to 4.3GHz and ran these tests.
*
Did you increase the board power limit?

QUOTE(sai86 @ Oct 22 2013, 02:33 PM)
Recommend me a 1kW PSU. Thanks for the site.
*
LMAO
TSAcid_RuleZz
post Oct 22 2013, 05:00 PM



QUOTE(sasaug @ Oct 22 2013, 03:13 PM)
Yeah, by 20%; just no voltage change.
*
Not sure whether it's because you're on Windows 8 or the CPU is affecting the score, but my 7950 gets similar results at a similar clock.
TSAcid_RuleZz
post Oct 23 2013, 12:12 AM



QUOTE(sai86 @ Oct 22 2013, 11:07 PM)
Suicide run for me on Heaven.
1250/1500 @ VDDC 1.281V / board limit +20. Temp <70°C; minimal artifacts shown, but able to complete the test (latest beta driver for BF4).
Compared to sasaug, I'd say the R9 280X is definitely worth buying: lower power consumption, and better. My VDDC is +0.1V compared to his R9 280X.
*
If you lower the clock, will you get a lower score?
TSAcid_RuleZz
post Oct 23 2013, 12:15 PM



QUOTE(Puteih @ Oct 23 2013, 12:13 PM)
Just to be clear, is the Toxic 280X without OC good enough?
*
Define what "good enough" means for you: what resolution, which games, and what FPS do you expect?
TSAcid_RuleZz
post Oct 23 2013, 02:46 PM



AMD Catalyst 13.11 Beta 3 out.

QUOTE(Asder00)
Enjoy!

***For the people who are having problems extracting/CCC error please read THIS post for the fix!***

> Download <

No release notes available...

Build Info:
DriverVer=10/07/2013, 13.250.18.0000
13.25.18-131007a-163510E-ATI
Catalyst: 13.11
CCC: 2013.1007.2203.37785
3D: 9.14.10.01001
OGL: 6.14.10.12609
OCL: 10.0.1348.4

Supported OS:
Windows 7
Windows 8
Windows 8.1



Sauce: AMD Catalyst 13.11 BETA3 (13.250.18.0 October 7)


Credit to Asder00
TSAcid_RuleZz
post Oct 24 2013, 10:19 AM



The leaked overclocking bench for the 290X doesn't look impressive.

Sauce: XFX Radeon R9 290X overclocking and temperatures exposed
TSAcid_RuleZz
post Oct 24 2013, 11:26 AM



QUOTE(Puteih @ Oct 24 2013, 10:53 AM)
Overclocking means shortening the lifespan of the graphics card, right? Do correct me if I'm wrong. So you guys prefer to trade lifespan for more power?
*
If shortening the lifespan means the GPU lasts 4-5 years instead of 5-6 years, I'll take the extra free performance anytime.

I undervolt my GPU when I'm not playing games, or when playing a less GPU-intensive game like Dota 2. Does that mean extra lifespan?
TSAcid_RuleZz
post Oct 24 2013, 12:23 PM



In before "it's not all about raw performance/FPS".
TSAcid_RuleZz
post Oct 24 2013, 12:29 PM



QUOTE(PCPER)
XDMA CrossFire Frame Pacing - It Works!!!

As you might be able to tell: we didn't get this second AMD Radeon R9 290X card from AMD. They preferred us to wait a bit for our CrossFire testing and, in particular, our 4K CrossFire testing. But sometimes hardware finds its way to our office; when it does, we test!

And to be honest, I am not sure why AMD wouldn't have wanted this story out.  When I published an article in September that looked at the severe problems that plagued the Radeon HD 7000 series in Eyefinity configurations (and thus tiled 4K ones, by association) that also used CrossFire, we wanted to push the company forward to release a fix sooner rather than later.  Today's release of the Radeon R9 290X, based on a new architecture with a completely new CrossFire implementation, proves that the GCN design can be improved!  Our testing with the 4K ASUS PQ321Q monitor with a pair of R9 290X cards clearly shows that to be the case.

The results are not perfect though. Notice I said the issues have been "improved" and not "fixed."  There are clearly some games and situations that need more work to get to a point where game stutter is no longer noticeable.  Metro: Last Light, for example, no longer drops frames when in an Eyefinity/4K configuration but it clearly has a need for tighter frame variances to improve the gaming experience.

And of course, this fix, that comes with a combination of a new driver and a completely new hardware level implementation of CrossFire, doesn't help users of the R9 280X, 270X or even the Radeon HD 7000 series cards that are already out in the wild.  We are still waiting on the promised answer from AMD for those consumers that bought into the idea of CrossFire as well as multi-display gaming.

That is all we have for now - I had only a single day with the second R9 290X and thus testing was short and sweet.  You can expect some more detailed analysis, video comparisons and benchmarks of Eyefinity in the days to come.

Sauce: Frame Rating: AMD Radeon R9 290X CrossFire and 4K Preview Testing


Oh yeah!
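PCPER's frame-rating approach is essentially about frame-time consistency rather than average FPS: a run can average a healthy frame rate while a small fraction of slow frames makes it stutter. A toy sketch of that kind of percentile summary; the frame times and the 90/10 split below are invented purely for illustration:

```python
def percentile(sorted_vals, p):
    """Nearest-rank percentile of an already-sorted list."""
    k = max(0, min(len(sorted_vals) - 1, round(p / 100 * (len(sorted_vals) - 1))))
    return sorted_vals[k]

# 90% smooth frames at 16.7 ms, 10% stutter frames taking twice as long
frame_ms = sorted([16.7] * 90 + [33.0] * 10)

avg_fps = 1000 * len(frame_ms) / sum(frame_ms)      # the headline number
p99 = percentile(frame_ms, 99)                      # what the worst frames feel like
print(f"average: {avg_fps:.0f} fps, 99th percentile frame time: {p99:.1f} ms")
```

The average still looks respectable, but the 99th-percentile frame time exposes the doubled-up frames, which is the kind of variance PCPER says Metro: Last Light still needs tightened.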
TSAcid_RuleZz
post Oct 24 2013, 12:49 PM



QUOTE(sai86 @ Oct 24 2013, 12:25 PM)
The price just got us all surprised. Now they should get back to focusing on improving their drivers.
The CF scaling in the benchmarks is very good, and the 290's performance is great as well. This should get more exciting since the 290X price is already so low.
*
I need to start saving for a 290 and snatch one when the price reaches RM1.5k.
TSAcid_RuleZz
post Oct 24 2013, 05:28 PM



QUOTE(Jimsee @ Oct 24 2013, 05:26 PM)
Since I'm gaming on a 1080p, 120Hz monitor, should I go for the GTX 780 or the R9 290X?
*
What monitor model is that? If it can support G-Sync, you'd better go for the GTX 780. Otherwise go for the 290X and watercool it.
TSAcid_RuleZz
post Oct 24 2013, 06:23 PM



QUOTE(sai86 @ Oct 24 2013, 05:35 PM)
Bro, old monitors are unable to support G-Sync; only new NVIDIA-approved monitors have G-Sync (the chip needs to be added in). And has the Asus G-Sync monitor already reached our shores? ATM, not much announcement on this.
*
I thought some older 120Hz/144Hz monitors can support G-Sync, but you need to buy the G-Sync kit.

QUOTE(law1777 @ Oct 24 2013, 05:42 PM)
Don't think he owns a G-Sync monitor, eh?

With G-Sync, do you even need 120Hz?
*
G-Sync needs a 120Hz/144Hz monitor; a 60Hz monitor is useless for G-Sync.
