QUOTE(Acid_RuleZz @ Dec 28 2013, 01:25 AM)
haha yes, but they'll get scared seeing that: AMD Radeon™ Discussion V9, Latest - 13.11 Beta 9.5 | WHQL - 13.10

Dec 28 2013, 01:34 AM
Senior Member | 849 posts | Joined: May 2010 | From: Penang Island

Dec 28 2013, 03:00 AM
Senior Member | 1,188 posts | Joined: Dec 2004

Dec 28 2013, 03:04 AM
Senior Member | 2,337 posts | Joined: Dec 2008 | From: KING CANNEL JB

Dec 28 2013, 03:10 AM
Senior Member | 1,188 posts | Joined: Dec 2004

Dec 28 2013, 03:15 AM
Senior Member | 6,612 posts | Joined: Jan 2003 | From: Tomorrow
Incoming wall of text
QUOTE
Nvidia’s GameWorks program usurps power from developers, end-users, and AMD

Over the past few months, Nvidia has made a number of high-profile announcements regarding game development and new gaming technologies. One of the most significant is a new developer support program, called GameWorks. The GameWorks program offers access to Nvidia’s CUDA development tools, GPU profiling software, and other developer resources. One of the features of GameWorks is a set of optimized libraries that developers can use to implement certain effects in-game. Unfortunately, these same libraries also tilt the performance landscape in Nvidia’s favor in a way that neither developers nor AMD can prevent.

Understanding libraries

Simply put, a library is a collection of implemented behaviors. They are not application-specific — libraries are designed to be called by multiple programs in order to simplify development. Instead of implementing a GPU feature five times in five different games, you can just point the same five titles at one library. Game engines like Unreal Engine 3 are typically capable of integrating with third-party libraries to ensure maximum compatibility and flexibility. Nvidia’s GameWorks contains libraries that tell the GPU how to render shadows, implement ambient occlusion, or illuminate objects.

In Nvidia’s GameWorks program, though, all the libraries are closed. You can see the files in games like Arkham City or Assassin’s Creed IV — the file names start with the GFSDK prefix. However, developers can’t see into those libraries to analyze or optimize the shader code. Since developers can’t see into the libraries, AMD can’t see into them either — and that makes it nearly impossible to optimize driver code.

Previous Arkham titles favored Nvidia, but never to this degree. In Arkham City, the R9 290X has a 24% advantage over the GTX 770 in DX11, and a 14% improvement in DX9. In Arkham Origins, they tie. Can this be traced directly back to GameWorks? Technically, no it can’t — all of our feature-specific tests showed the GTX 770 and the R9 290X taking near-identical performance hits with GameWorks features set to various detail levels. If DX11 Enhanced Ambient Occlusion costs the GTX 770 10% of its performance, it costs the R9 290X 10% of its performance.

The problem with that “no,” though, is twofold. First, because AMD can’t examine or optimize the shader code, there’s no way of knowing what performance could look like. In a situation where neither the developer nor AMD ever has access to the shader code to start with, this is a valid point. Arkham Origins offers an equal performance hit to the GTX 770 and the R9 290X, but control of AMD’s performance in these features no longer rests with AMD’s driver team — it’s sitting with Nvidia.

The first three scenes of the benchmark in Arkham Origins hammer tessellation. AMD’s driver allows us to manually define the tessellation level — changing that setting to x4 improves performance in the first three scenes of the test by 11%, from 134fps to 150fps. Total test performance improves by 7%, from 148fps to 158fps. AMD attempted to provide Warner Bros. Montreal with code to improve Arkham Origins performance in tessellation, as well as to fix certain multi-GPU problems with the game. The studio turned down both. Is this explicitly the fault of GameWorks? No, but it’s a splendid illustration of how developer bias, combined with unfair treatment, creates a sub-optimal consumer experience.
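Quick aside before the rest of the article: the "closed library" point is easy to picture. The game just loads the vendor's prebuilt module and calls its exported entry points, and everything behind that interface stays opaque to both the developer and AMD. A minimal sketch in C using POSIX dlopen; the module name and entry point below are entirely hypothetical (only the GFSDK file-name prefix comes from the article):

```c
/* Hypothetical sketch: how a game calls into a closed, prebuilt library.
 * "GFSDK_Example.so" and "RenderShadows" are made-up names; only the
 * GFSDK_ file-name prefix comes from the article above. */
#include <dlfcn.h>
#include <stdio.h>

int main(void) {
    /* Load the vendor-supplied binary blob. No source code ships with it,
     * so neither the developer nor AMD can inspect the shaders inside. */
    void *lib = dlopen("./GFSDK_Example.so", RTLD_LAZY);
    if (!lib) {
        fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return 1;
    }

    /* The game only ever sees exported entry points like this one. */
    typedef int (*render_shadows_fn)(int quality_level);
    render_shadows_fn render_shadows =
        (render_shadows_fn)dlsym(lib, "RenderShadows");
    if (render_shadows)
        render_shadows(2);  /* e.g. a "high detail" preset */

    dlclose(lib);
    return 0;
}
```

That opacity, rather than anything the code visibly does, is the article's complaint. The article continues: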
Nvidia’s GameWorks program is conceptually similar to what Intel pulled on AMD 8-10 years back. In that situation, Intel’s compilers refused to optimize code for AMD processors, even though AMD had paid Intel for the right to implement SSE, SSE2, and SSE3. The compiler would search for a CPU string rather than just the ability to execute the vectorized code, and if it detected AuthenticAMD instead of GenuineIntel, it refused to use the most advantageous optimizations.

Nvidia has done a great deal for gaming over the past decade. Features like hardware PhysX support and 3D gaming may never have gone truly mainstream, but they were appreciated premium features for the gamers that wanted them. G-Sync, by all accounts, offers real advantages as well. GameWorks, however, doesn’t just offer Nvidia customers an advantage — it curtails developer freedom and sharply limits AMD’s ability to optimize as well. Even if Nvidia never deliberately sabotages GameWorks code to run poorly on AMD or Intel GPUs, the inability to optimize these functions is itself a genuine competitive disadvantage.

Sauce: Nvidia’s GameWorks program usurps power from developers, end-users, and AMD

I didn't pasta everything btw. Bolded interesting parts.
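To make the Intel compiler comparison concrete: the dispatcher described above keys off the CPUID vendor string instead of the feature bits that actually advertise SSE2 support. A simplified stand-in in C using GCC/Clang's <cpuid.h>; this illustrates the pattern only and is not Intel's actual compiler code:

```c
/* Sketch of vendor-string dispatch: gate the optimized path on the CPU
 * brand string instead of on the feature flag that actually matters. */
#include <cpuid.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    /* CPUID leaf 0: the vendor string arrives in EBX, EDX, ECX order. */
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);

    /* CPUID leaf 1: EDX bit 26 is the real SSE2 capability flag. */
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
        return 1;
    int has_sse2 = (edx >> 26) & 1;

    if (strcmp(vendor, "GenuineIntel") == 0 && has_sse2)
        puts("take the optimized SSE2 path");   /* what the dispatcher did */
    else if (has_sse2)
        puts("SSE2 present, but path skipped"); /* AuthenticAMD lands here */
    else
        puts("scalar fallback");
    return 0;
}
```

On an AMD CPU that fully supports SSE2, the middle branch fires, which is exactly the behavior the article objects to.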

Dec 28 2013, 03:15 AM
Senior Member | 2,337 posts | Joined: Dec 2008 | From: KING CANNEL JB

Dec 28 2013, 03:27 AM
Senior Member | 1,188 posts | Joined: Dec 2004

Dec 28 2013, 04:08 AM
Senior Member | 1,188 posts | Joined: Dec 2004
Hmm, I pushed my clock speed to 1150 already with stock voltage 1.256v. (Enough?)
Temp peaks at 74c (safe?). I'm stressing it with a Heaven loop (correct way?)

Dec 28 2013, 04:38 AM
Senior Member | 4,254 posts | Joined: Nov 2011

Dec 28 2013, 04:51 AM
Senior Member | 2,337 posts | Joined: Dec 2008 | From: KING CANNEL JB
QUOTE(Acid_RuleZz @ Dec 28 2013, 03:15 AM)

Edit: thx for the enlightening read...

This post has been edited by Unseen83: Dec 28 2013, 04:54 AM

Dec 28 2013, 04:58 AM
Senior Member | 2,337 posts | Joined: Dec 2008 | From: KING CANNEL JB
QUOTE(marfccy @ Dec 28 2013, 04:38 AM)
so.. am i supposed to rejoice that i got an Nvidia card instead of AMD's? this seems 'limiting' to everyone, but in a business view, it's perfectly normal

hmm, i am very sure if Mantle is successfully (this is "just a story") implemented into GCN and Nvidia's stock falls, Nvidia shareholders won't be happy... very sure Nvidia would sue AMD (or make something up to sue AMD for some unfair reason or infringement)

Edit: i mean if this was the other way around..

This post has been edited by Unseen83: Dec 28 2013, 04:59 AM

Dec 28 2013, 05:07 AM
Senior Member | 4,254 posts | Joined: Nov 2011
QUOTE(Unseen83 @ Dec 28 2013, 04:58 AM)
hmm, i am very sure if Mantle is successfully (this is "just a story") implemented into GCN and Nvidia's stock falls, Nvidia shareholders won't be happy... very sure Nvidia would sue AMD (or make something up to sue AMD for some unfair reason or infringement)
Edit: i mean if this was the other way around..

from a business point of view, it's completely acceptable, as you're obviously striving to be better than your competitor and get more as well

but from a consumer view, that's just Nvidia being a jacka$$, trying to monopolise the market again. we need competition man, or it becomes AMD vs Intel all over again. this move is purely there to annoy AMD users

as for Mantle, i heard a lot about it but it's still not ready?

This post has been edited by marfccy: Dec 28 2013, 05:10 AM

Dec 28 2013, 05:17 AM
Senior Member | 2,337 posts | Joined: Dec 2008 | From: KING CANNEL JB
QUOTE(marfccy @ Dec 28 2013, 05:07 AM)
from a business point of view, it's completely acceptable, as you're obviously striving to be better than your competitor and get more as well. but from a consumer view, that's just Nvidia being a jacka$$, trying to monopolise the market again. we need competition man, or it becomes AMD vs Intel all over again. this move is purely there to annoy AMD users. as for Mantle, i heard a lot about it but it's still not ready?

Yes, agree with you that we (end users) need competition so no one gets too much of a monopoly. "Nvidia never drop their price if amd did not bring in R9 200 Series into market" (i could have spent RM2.6K on a GTX 780 2 months ago if amd did not bring out the R9 290)

"this move is purely there to annoy AMD users" as i say, JackSparrow the Batman!! arggh lol

edit: on mantle... yeah, apple pie still in the oven

This post has been edited by Unseen83: Dec 28 2013, 05:20 AM

Dec 28 2013, 10:46 AM
Junior Member | 219 posts | Joined: Sep 2008
Using an Asus R9 280X..
When playing games, the fan speed is 45% and the GPU hits 75 degrees (max). Need an opinion: should I configure the fan speed higher?
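For what "configure the fan speed higher" typically means in practice: fan-control tools (Sapphire TriXX, MSI Afterburner, and the like) let you define a custom curve mapping GPU temperature to fan duty. A rough sketch of the idea in C; the breakpoints below are made-up placeholders, not tuned recommendations:

```c
/* Sketch of a custom GPU fan curve: a piecewise-linear map from
 * temperature (C) to fan duty (%). All breakpoints are hypothetical;
 * the real curve is set in the vendor's fan-control tool. */
#include <stdio.h>

static int fan_duty(int temp_c) {
    /* (temperature, duty) breakpoints, ascending by temperature. */
    const int pts[][2] = { {40, 30}, {60, 45}, {75, 65}, {85, 100} };
    const int n = sizeof pts / sizeof pts[0];

    if (temp_c <= pts[0][0]) return pts[0][1];
    for (int i = 1; i < n; i++) {
        if (temp_c <= pts[i][0]) {
            /* Linear interpolation between neighboring breakpoints. */
            int t0 = pts[i-1][0], d0 = pts[i-1][1];
            int t1 = pts[i][0],   d1 = pts[i][1];
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0);
        }
    }
    return pts[n-1][1];  /* pin at max above the last breakpoint */
}

int main(void) {
    for (int t = 40; t <= 90; t += 10)
        printf("%d C -> %d%% fan\n", t, fan_duty(t));
    return 0;
}
```

Raising the upper breakpoints trades fan noise for a lower peak temperature; 75 degrees under load is generally considered fine for these cards, so it is a comfort call more than a necessity.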

Dec 28 2013, 12:00 PM
Senior Member | 2,654 posts | Joined: Dec 2007

Dec 28 2013, 12:10 PM
Senior Member | 2,654 posts | Joined: Dec 2007
QUOTE(marfccy @ Dec 28 2013, 05:07 AM)
from a business point of view, it's completely acceptable, as you're obviously striving to be better than your competitor and get more as well. but from a consumer view, that's just Nvidia being a jacka$$, trying to monopolise the market again. we need competition man, or it becomes AMD vs Intel all over again. this move is purely there to annoy AMD users. as for Mantle, i heard a lot about it but it's still not ready?

i dont care about all that.. the only thing i know is that a Tri-X 290 OCed surpasses the RM1k-premium 780Ti

addon: AMD is finalizing Mantle now

This post has been edited by law1777: Dec 28 2013, 01:39 PM

Dec 28 2013, 06:25 PM
Junior Member | 106 posts | Joined: Aug 2008 | From: Bolehland, Nation of can-or-nauts and hypocrites
Just looked at TechArmory's FB page. RM1699 with an RM50 cash voucher is a good price considering the factory OC and custom cooler, and it's still faster than a 780.
TA has also put up the Gigabyte Windforce 290 for RM1649. The only thing I'm worried about is stock. These GPUs are hotcakes. Newegg has already run out of the Tri-X model.

This post has been edited by MatchesMalone: Dec 28 2013, 06:27 PM

Dec 28 2013, 10:32 PM
Senior Member | 2,337 posts | Joined: Dec 2008 | From: KING CANNEL JB
QUOTE(law1777 @ Dec 28 2013, 12:00 PM)

make sure your PC's 290x OC is faster than a 780Ti and the same price as a Tri-X 290, or else no one wants it.. hahaha

This post has been edited by Unseen83: Dec 28 2013, 10:32 PM

Dec 28 2013, 10:57 PM
Senior Member | 2,337 posts | Joined: Dec 2008 | From: KING CANNEL JB
WOW! TechArmory is selling non-reference R9 290 cards (Gigabyte/Sapphire) below RM1.7K (exactly RM1,649-1,699), that is like $517 US dollars.. hmm, i think that is a very reasonable asking price

Dec 28 2013, 11:02 PM
Senior Member | 662 posts | Joined: Jul 2013
someone asked me to post here, so i did.
guys, i encountered one very weird situation. my mobo is an asus b85m-e and my gpu is an asus r9 290. i bought the samsung s24c350hl monitor just now, and after i connected my hdmi cable between my monitor and my gpu's hdmi port, nothing came out on the screen. i have tested the hdmi cable with my monitor and ps3, and there is display on the monitor. on the other hand, i connected my pc and lg tv using the hdmi cable, and there is also display on the tv... can anyone tell me what is wrong between my monitor and my newly bought rig???