
 AMD Radeon™ Discussion V9, Latest - 13.11 Beta 9.5 | WHQL - 13.10

shepard
post Dec 28 2013, 01:34 AM

We shall overcome
*****
Senior Member
849 posts

Joined: May 2010
From: Penang Island



QUOTE(Acid_RuleZz @ Dec 28 2013, 01:25 AM)
jk bro, i've yet to destroy any gpu from overclocking too.
*
haha yes, but they'd be scared seeing that tongue.gif... I destroyed my GPU once, but that was due to water cooling.. damn, that's why it's retired now.
king99
post Dec 28 2013, 03:00 AM

Regular
******
Senior Member
1,188 posts

Joined: Dec 2004



QUOTE(shepard @ Dec 28 2013, 01:34 AM)
haha yes, but they'd be scared seeing that tongue.gif... I destroyed my GPU once, but that was due to water cooling.. damn, that's why it's retired now.
*
I destroyed one by plugging a 6-pin power connector into an 8-pin power GPU
Unseen83
post Dec 28 2013, 03:04 AM

TooFAT4U 2Handle!
*******
Senior Member
2,337 posts

Joined: Dec 2008
From: KING CANNEL JB


QUOTE(king99 @ Dec 28 2013, 03:00 AM)
I destroyed one by plugging a 6-pin power connector into an 8-pin power GPU
*
wah, I did not know that could happen... ohmy.gif
king99
post Dec 28 2013, 03:10 AM

Regular
******
Senior Member
1,188 posts

Joined: Dec 2004



QUOTE(Unseen83 @ Dec 28 2013, 03:04 AM)
wah, I did not know that could happen... ohmy.gif
*
It was my old 4850 haha. When plugging it into my 2nd PC I accidentally plugged in the wrong power cable. Turned it on, heard a small spark noise... Both the GPU and PSU died lol... it was a good PSU too.. =(
TSAcid_RuleZz
post Dec 28 2013, 03:15 AM

ミウ ❤
*******
Senior Member
6,612 posts

Joined: Jan 2003
From: Tomorrow


Incoming wall of text

QUOTE
Nvidia’s GameWorks program usurps power from developers, end-users, and AMD

Over the past few months, Nvidia has made a number of high-profile announcements regarding game development and new gaming technologies. One of the most significant is a new developer support program, called GameWorks. The GameWorks program offers access to Nvidia’s CUDA development tools, GPU profiling software, and other developer resources. One of the features of GameWorks is a set of optimized libraries that developers can use to implement certain effects in game. Unfortunately, these same libraries also tilt the performance landscape in Nvidia’s favor in a way that neither developers nor AMD can prevent.

Understanding libraries

Simply put, a library is a collection of implemented behaviors. They are not application specific — libraries are designed to be called by multiple programs in order to simplify development. Instead of implementing a GPU feature five times in five different games, you can just point the same five titles at one library. Game engines like Unreal Engine 3 are typically capable of integrating with third party libraries to ensure maximum compatibility and flexibility. Nvidia’s GameWorks contains libraries that tell the GPU how to render shadows, implement ambient occlusion, or illuminate objects.
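
To make the idea concrete, here's a minimal C sketch (hypothetical names throughout, not any real GameWorks interface) of two "games" calling one shared effect library instead of each implementing the effect themselves:

CODE
/* Hypothetical names -- illustrative only, not the real GameWorks API. */
#include <stdio.h>

/* The library's public interface: the only thing the games see
   (what a header such as fxlib.h would declare). */
void fx_render_ambient_occlusion(float radius, int quality_level);

/* A stand-in implementation; in practice this lives inside the
   library binary, not in either game's source tree. */
void fx_render_ambient_occlusion(float radius, int quality_level)
{
    printf("AO pass: radius=%.1f quality=%d\n", radius, quality_level);
}

/* Two different "games" reusing the same call with their own settings. */
void game_a_frame(void) { fx_render_ambient_occlusion(0.5f, 2); }
void game_b_frame(void) { fx_render_ambient_occlusion(0.8f, 3); }

int main(void)
{
    game_a_frame();
    game_b_frame();
    return 0;
}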

In Nvidia’s GameWorks program, though, all the libraries are closed. You can see the files in games like Arkham City or Assassin’s Creed IV — the file names start with the GFSDK prefix. However, developers can’t see into those libraries to analyze or optimize the shader code. Since developers can’t see into the libraries, AMD can’t see into them either — and that makes it nearly impossible to optimize driver code.
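
A rough sketch of what "closed" means from the game's side (POSIX-style dynamic loading, hypothetical library and symbol names): the game loads the prebuilt binary and calls its exported entry point, but nothing else is visible, which is why neither the developer nor AMD can inspect the shader code inside:

CODE
/* Hypothetical library/symbol names -- illustrative only.
   Build with -ldl on Linux. */
#include <dlfcn.h>
#include <stdio.h>

typedef void (*ao_fn)(float radius, int quality_level);

int main(void)
{
    /* Load the closed, prebuilt effects library (think of a GFSDK_*.dll
       shipped with the game; a stand-in .so here for the POSIX example). */
    void *lib = dlopen("./libfx_ao.so", RTLD_NOW);
    if (!lib) {
        fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return 1;
    }

    /* Only the exported symbol is visible; the shaders and logic compiled
       inside the binary are opaque to the caller. */
    ao_fn render_ao = (ao_fn)dlsym(lib, "fx_render_ambient_occlusion");
    if (render_ao)
        render_ao(0.5f, 2);

    dlclose(lib);
    return 0;
}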


Previous Arkham titles favored Nvidia, but never to this degree. In Arkham City, the R9 290X has a 24% advantage over the GTX 770 in DX11, and a 14% advantage in DX9. In Arkham Origins, they tie. Can this be traced directly back to GameWorks? Technically, no it can’t — all of our feature-specific tests showed the GTX 770 and the R9 290X taking near-identical performance hits with GameWorks features set to various detail levels. If DX11 Enhanced Ambient Occlusion costs the GTX 770 10% of its performance, it costs the R9 290X 10% of its performance.

The problem with that “no,” though, is twofold. First, because AMD can’t examine or optimize the shader code, there’s no way of knowing what performance could look like. In a situation where neither the developer nor AMD ever has access to the shader code to start with, this is a valid point. Arkham Origins offers an equal performance hit to the GTX 770 and the R9 290X, but control of AMD’s performance in these features no longer rests with AMD’s driver team — it’s sitting with Nvidia.


The first three scenes of the benchmark in Arkham Origins hammer tessellation. AMD’s driver allows us to manually define the tessellation level — changing that setting to x4 improves performance in the first three scenes of the test by 11%, from 134fps to 150fps. Total test performance improves by 7%, from 148fps to 158fps. AMD attempted to provide Warner Bros. Montreal with code to improve Arkham Origins performance in tessellation, as well as to fix certain multi-GPU problems with the game. The studio turned down both. Is this explicitly the fault of GameWorks? No, but it’s a splendid illustration of how developer bias, combined with unfair treatment, creates a sub-optimal consumer experience.

Nvidia’s GameWorks program is conceptually similar to what Intel pulled on AMD 8-10 years back. In that situation, Intel’s compilers refused to optimize code for AMD processors, even though AMD had paid Intel for the right to implement SSE, SSE2, and SSE3. The compiler would search for a CPU string rather than just the ability to execute the vectorized code, and if it detected AuthenticAMD instead of GenuineIntel, it refused to use the most advantageous optimizations.
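
For reference, the mechanism being described is CPU dispatch. Here's a minimal C sketch (using GCC/Clang's <cpuid.h>, not Intel's actual compiler code) of the difference between dispatching on the CPUID vendor string and dispatching on the actual feature bit:

CODE
#include <cpuid.h>   /* GCC/Clang x86 intrinsic header */
#include <stdio.h>
#include <string.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    /* CPUID leaf 0: the vendor string, returned in EBX, EDX, ECX order. */
    __get_cpuid(0, &eax, &ebx, &ecx, &edx);
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);

    /* CPUID leaf 1: feature flags; EDX bit 26 indicates SSE2 support. */
    __get_cpuid(1, &eax, &ebx, &ecx, &edx);
    int has_sse2 = (edx >> 26) & 1;

    /* Vendor-string dispatch: an AMD CPU that supports SSE2 still falls
       through to the generic path because it is not "GenuineIntel". */
    if (strcmp(vendor, "GenuineIntel") == 0 && has_sse2)
        puts("vendor-string check: fast SSE2 path");
    else
        puts("vendor-string check: generic path");

    /* Feature-flag dispatch: any CPU that reports SSE2 takes the fast path. */
    if (has_sse2)
        puts("feature-flag check: fast SSE2 path");
    else
        puts("feature-flag check: generic path");

    return 0;
}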

Nvidia has done a great deal for gaming over the past decade. Features like hardware PhysX support and 3D gaming may never have gone truly mainstream, but they were appreciated premium features for the gamers that wanted them. G-Sync, by all accounts, offers real advantages as well. GameWorks, however, doesn’t just offer Nvidia customers an advantage — it curtails developer freedom and sharply limits AMD’s ability to optimize as well. Even if Nvidia never deliberately sabotages GameWorks code to run poorly on AMD or Intel GPUs, the inability to optimize these functions is itself a genuine competitive disadvantage.

Sauce: Nvidia’s GameWorks program usurps power from developers, end-users, and AMD

I didn't pasta everything btw. Bolded interesting parts.
Unseen83
post Dec 28 2013, 03:15 AM

TooFAT4U 2Handle!
*******
Senior Member
2,337 posts

Joined: Dec 2008
From: KING CANNEL JB


QUOTE(king99 @ Dec 28 2013, 03:10 AM)
It was my old 4850 haha. When plugging it into my 2nd PC I accidentally plugged in the wrong power cable. Turned it on, heard a small spark noise... Both the GPU and PSU died lol... it was a good PSU too.. =(
*
Good PSU? Hmm, I question that statement.. lols
king99
post Dec 28 2013, 03:27 AM

Regular
******
Senior Member
1,188 posts

Joined: Dec 2004



QUOTE(Unseen83 @ Dec 28 2013, 03:15 AM)
Good PSU? Hmm, I question that statement.. lols
*
500W Corsair... I think something got stuck between the pins... causing a short haha
king99
post Dec 28 2013, 04:08 AM

Regular
******
Senior Member
1,188 posts

Joined: Dec 2004



Hmm, I've pushed my clock speed to 1150MHz already at the stock voltage of 1.256V. (Enough?)

Temps peak at 74°C (safe?). I'm stress testing it with a Heaven loop (is that the correct way?)
marfccy
post Dec 28 2013, 04:38 AM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(Acid_RuleZz @ Dec 28 2013, 03:15 AM)
Incoming wall of text
I didn't pasta everything btw. Bolded interesting parts.
*
so.. am I supposed to rejoice that I got an Nvidia card instead of AMD's? hmm.gif

this seems 'limiting' to everyone

but from a business view, it's perfectly normal
Unseen83
post Dec 28 2013, 04:51 AM

TooFAT4U 2Handle!
*******
Senior Member
2,337 posts

Joined: Dec 2008
From: KING CANNEL JB


QUOTE(Acid_RuleZz @ Dec 28 2013, 03:15 AM)
*
sad.gif right.. well, just boycott the game.. biggrin.gif (go Jack Sparrow on it) tongue.gif lols. No wonder that level in Batman Origins (the final level/mission, getting the Joker at the asylum) drops to a crawling 9-10 FPS... (not to mention the game is full of glitches)

Edit: thx for the enlightening read... unsure.gif

This post has been edited by Unseen83: Dec 28 2013, 04:54 AM
Unseen83
post Dec 28 2013, 04:58 AM

TooFAT4U 2Handle!
*******
Senior Member
2,337 posts

Joined: Dec 2008
From: KING CANNEL JB


QUOTE(marfccy @ Dec 28 2013, 04:38 AM)
so.. am I supposed to rejoice that I got an Nvidia card instead of AMD's? hmm.gif

this seems 'limiting' to everyone

but from a business view, it's perfectly normal
*
hmm, I am very sure that if Mantle were successfully implemented on GCN (this is just a story) and Nvidia's stock fell, Nvidia shareholders would not be happy... very sure Nvidia would sue AMD... (or make something up to sue AMD for some unfair reason or infringement claim)

Edit: I mean if it were the other way around.. tongue.gif

This post has been edited by Unseen83: Dec 28 2013, 04:59 AM
marfccy
post Dec 28 2013, 05:07 AM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(Unseen83 @ Dec 28 2013, 04:58 AM)
hmm, I am very sure that if Mantle were successfully implemented on GCN (this is just a story) and Nvidia's stock fell, Nvidia shareholders would not be happy... very sure Nvidia would sue AMD... (or make something up to sue AMD for some unfair reason or infringement claim)

Edit: I mean if it were the other way around.. tongue.gif
*
from a business point of view, it's completely acceptable, as you're obviously striving to be better than your competitor and gain more as well

but from a consumer view, that's just Nvidia being a jacka$$, trying to monopolise the market again

we need competition man, or it'll be AMD vs Intel all over again

this move is purely there to annoy AMD users

as for Mantle, I've heard a lot about it, but it's still not ready? sweat.gif

This post has been edited by marfccy: Dec 28 2013, 05:10 AM
Unseen83
post Dec 28 2013, 05:17 AM

TooFAT4U 2Handle!
*******
Senior Member
2,337 posts

Joined: Dec 2008
From: KING CANNEL JB


QUOTE(marfccy @ Dec 28 2013, 05:07 AM)
from a business point of view, it's completely acceptable, as you're obviously striving to be better than your competitor and gain more as well

but from a consumer view, that's just Nvidia being a jacka$$, trying to monopolise the market again

we need competition man, or it'll be AMD vs Intel all over again

this move is purely there to annoy AMD users

as for Mantle, I've heard a lot about it, but it's still not ready? sweat.gif
*
biggrin.gif "tell that to the Samsung vs Apple court case"

Yes, agree with you that we (end users) need competition so nobody becomes too much of a monopoly. "Nvidia would never have dropped their prices if AMD hadn't brought the R9 200 series to market" (I could have spent RM2.6K on a GTX 780 two months ago if AMD hadn't brought out the R9 290 tongue.gif I'd be riding the Green Pony into lala land blush.gif)

"this move is purely there to annoy AMD users"

as I say, Jack Sparrow the Batman!! arggh lol

edit: on Mantle... yeah, the apple pie is still in the oven sad.gif Was hoping I could play BF4 with the Mantle API.. hmm.gif

This post has been edited by Unseen83: Dec 28 2013, 05:20 AM
AdamNg
post Dec 28 2013, 10:46 AM

Getting Started
**
Junior Member
219 posts

Joined: Sep 2008
Using an Asus R9 280X..
When playing games... the fan speed sits at 45% & the GPU hits 75°C (max)
Need opinions, should I configure the fan speed higher?
law1777
post Dec 28 2013, 12:00 PM

DreamMan
*******
Senior Member
2,654 posts

Joined: Dec 2007


QUOTE(Unseen83 @ Dec 27 2013, 04:47 PM)
hmm.gif

hmm, if it's like that, I'm DEFO gonna sell my PowerColor R9 290X OC at 30% off the original price biggrin.gif
*
make sure your PowerColor 290X OC is faster than a 780 Ti and the same price as a Tri-X 290, or else no one will want it.. hahaha
law1777
post Dec 28 2013, 12:10 PM

DreamMan
*******
Senior Member
2,654 posts

Joined: Dec 2007


QUOTE(marfccy @ Dec 28 2013, 05:07 AM)
from a business point of view, it's completely acceptable, as you're obviously striving to be better than your competitor and gain more as well

but from a consumer view, that's just Nvidia being a jacka$$, trying to monopolise the market again

we need competition man, or it'll be AMD vs Intel all over again

this move is purely there to annoy AMD users

as for Mantle, I've heard a lot about it, but it's still not ready? sweat.gif
*
I don't care about all that.. the only thing I know is that an OCed Tri-X 290 surpasses the 780 Ti that carries an RM1k premium thumbup.gif

addon: AMD is finalizing Mantle now

This post has been edited by law1777: Dec 28 2013, 01:39 PM
MatchesMalone
post Dec 28 2013, 06:25 PM

Getting Started
**
Junior Member
106 posts

Joined: Aug 2008
From: Bolehland,Nation of can-or-nauts and hypocrites


Just looked at TechArmory's FB page. RM1,699 with an RM50 cash voucher is a good price considering the factory OC and custom cooler, and it's still faster than a 780.
TA has also put up the Gigabyte WindForce 290 for RM1,649.

The only thing I'm worried about is stock. These GPUs are selling like hotcakes. Newegg has already run out of the Tri-X model.

This post has been edited by MatchesMalone: Dec 28 2013, 06:27 PM
Unseen83
post Dec 28 2013, 10:32 PM

TooFAT4U 2Handle!
*******
Senior Member
2,337 posts

Joined: Dec 2008
From: KING CANNEL JB


QUOTE(law1777 @ Dec 28 2013, 12:00 PM)
make sure your PowerColor 290X OC is faster than a 780 Ti and the same price as a Tri-X 290, or else no one will want it.. hahaha
*
smile.gif well... I've got RM3K reserved for a non-reference R9 290X at the moment.. so it's only a small annoyance if nobody wants to buy my reference R9.. biggrin.gif

This post has been edited by Unseen83: Dec 28 2013, 10:32 PM
Unseen83
post Dec 28 2013, 10:57 PM

TooFAT4U 2Handle!
*******
Senior Member
2,337 posts

Joined: Dec 2008
From: KING CANNEL JB


WOW! TechArmory is selling non-reference R9 290 cards (Gigabyte/Sapphire) below RM1.7K (exactly RM1,649-1,699), which is like US$517.. hmm, I think that's a very reasonable asking price smile.gif (Storm88) Awesome guy!! lols
SUSbingding
post Dec 28 2013, 11:02 PM

On my way
****
Senior Member
662 posts

Joined: Jul 2013
Someone asked me to post here, so I did.
Guys, I encountered one very weird situation. My mobo is an Asus B85M-E and my GPU is an Asus R9 290. I just bought a Samsung S24C350HL monitor, and after connecting the HDMI cable between the monitor and my GPU's HDMI port, nothing came up on the screen. I have tested the HDMI cable with the monitor and a PS3, and there is a display on the monitor. On the other hand, I connected my PC to an LG TV using the HDMI cable, and there is also a display on the TV... Can anyone tell me what is wrong between my monitor and my newly bought rig???
