AMD Radeon™ Discussion V9, Latest - 13.11 Beta 9.5 | WHQL - 13.10

Dec 27 2013, 12:56 AM | Post #261
Senior Member | 6,612 posts | Joined: Jan 2003 | From: Tomorrow

HardwareCanucks review of the ASUS R9 290X DirectCU II

Full written review here. Pretty impressive chip they've got there.

Dec 27 2013, 06:44 PM | Post #264
QUOTE(king99 @ Dec 27 2013, 05:58 PM)
er... I'm now overclocking to 1100... I only use 1.3v for benchmarking; for gaming I usually stick with 1.2v-1.25v max. I see other ppl with my card push it to 1.3v =X Will my current voltage be enough for a 1100 core overclock? Tested Furmark... my clock fluctuates between 1000-1100 repeatedly =X PowerTune set to +20. Using a 750W Corsair.

I don't think 0.862v is enough for a 1.1GHz clock unless you won a huge silicon lottery. Don't use Furmark, use Valley or Heaven.

Dec 27 2013, 07:08 PM | Post #265
QUOTE(king99 @ Dec 27 2013, 06:45 PM)
I realize both my Lethal Boost and normal 7970 BIOS are the same, which explains why there is no change when I press the blue button =X. Any way to solve this? Need to flash?

Meh... just overclock your card with Afterburner. Sometimes the 2nd BIOS only brings more problems than performance.

Dec 27 2013, 10:13 PM | Post #266
QUOTE(king99 @ Dec 27 2013, 07:17 PM)
I have another issue... the VDDC is stuck at 0.8v when I run the Valley benchmark; good thing is the clock is stable. I did not alter the voltage in any way at all...

It seems GPU-Z can't show the correct voltage for your card. From the Hexus review, it seems the stock voltage of this card is 1.256v, which is plenty.

QUOTE
And the quality of the cooler is indirectly confirmed by the amount of heat it has to dissipate. Run in its Sunday-best form, with the Lethal Boost switched on, the system pulls almost 100W more than a vanilla GTX 680 and, tellingly, 56W more than a Radeon HD 7970 GHz Edition card. Remember the increase in GPU voltage for the TOXIC? It runs at 1.256V in standard mode and 1.281V when in Lethal Boost livery. Double-sized framebuffer, extra frequency and voltage increases don't make for attractive power-draw readings, obviously.

Dec 28 2013, 12:20 AM | Post #267
QUOTE(king99 @ Dec 27 2013, 10:47 PM)

You can go to the overclocking thread here. The guys there will be happy to help. My only advice is to start slowly; 1.256v is already plenty for 1150MHz-1200MHz imo. What is your card's ASIC quality %, btw?

Dec 28 2013, 01:25 AM | Post #268
QUOTE(shepard @ Dec 28 2013, 01:08 AM)
lol you serious?.. I've never destroyed my cards before, even with extreme overclocking... don't worry guys, as long as we know what we're doing nothing will go wrong when overclocking our GPUs. Just need to monitor the temperatures (core, VRMs) by playing around with the fan profile, A/C or watercooling.

jk bro, I've yet to destroy any GPU from overclocking too.

Dec 28 2013, 03:15 AM | Post #269
Incoming wall of text
QUOTE
Nvidia's GameWorks program usurps power from developers, end-users, and AMD

Over the past few months, Nvidia has made a number of high-profile announcements regarding game development and new gaming technologies. One of the most significant is a new developer support program, called GameWorks. The GameWorks program offers access to Nvidia's CUDA development tools, GPU profiling software, and other developer resources. One of the features of GameWorks is a set of optimized libraries that developers can use to implement certain effects in-game. Unfortunately, these same libraries also tilt the performance landscape in Nvidia's favor in a way that neither developers nor AMD can prevent.

Understanding libraries

Simply put, a library is a collection of implemented behaviors. They are not application specific — libraries are designed to be called by multiple programs in order to simplify development. Instead of implementing a GPU feature five times in five different games, you can just point the same five titles at one library. Game engines like Unreal Engine 3 are typically capable of integrating with third-party libraries to ensure maximum compatibility and flexibility. Nvidia's GameWorks contains libraries that tell the GPU how to render shadows, implement ambient occlusion, or illuminate objects. In Nvidia's GameWorks program, though, all the libraries are closed. You can see the files in games like Arkham City or Assassin's Creed IV — the file names start with the GFSDK prefix. However, developers can't see into those libraries to analyze or optimize the shader code. Since developers can't see into the libraries, AMD can't see into them either — and that makes it nearly impossible to optimize driver code.

Previous Arkham titles favored Nvidia, but never to this degree. In Arkham City, the R9 290X has a 24% advantage over the GTX 770 in DX11, and a 14% improvement in DX9. In Arkham Origins, they tie. Can this be traced directly back to GameWorks? Technically, no it can't — all of our feature-specific tests showed the GTX 770 and the R9 290X taking near-identical performance hits with GameWorks features set to various detail levels. If DX11 Enhanced Ambient Occlusion costs the GTX 770 10% of its performance, it costs the R9 290X 10% of its performance. The problem with that "no," though, is twofold. First, because AMD can't examine or optimize the shader code, there's no way of knowing what performance could look like. In a situation where neither the developer nor AMD ever has access to the shader code to start with, this is a valid point. Arkham Origins offers an equal performance hit to the GTX 770 and the R9 290X, but control of AMD's performance in these features no longer rests with AMD's driver team — it's sitting with Nvidia.

The first three scenes of the benchmark in Arkham Origins hammer tessellation. AMD's driver allows us to manually define the tessellation level — changing that setting to x4 improves performance in the first three scenes of the test by 11%, from 134fps to 150fps. Total test performance improves by 7%, from 148fps to 158fps. AMD attempted to provide Warner Bros. Montreal with code to improve Arkham Origins performance in tessellation, as well as to fix certain multi-GPU problems with the game. The studio turned down both. Is this explicitly the fault of GameWorks? No, but it's a splendid illustration of how developer bias, combined with unfair treatment, creates a sub-optimal consumer experience.

Nvidia's GameWorks program is conceptually similar to what Intel pulled on AMD 8-10 years back. In that situation, Intel's compilers refused to optimize code for AMD processors, even though AMD had paid Intel for the right to implement SSE, SSE2, and SSE3. The compiler would search for a CPU string rather than just the ability to execute the vectorized code, and if it detected AuthenticAMD instead of GenuineIntel, it refused to use the most advantageous optimizations.

Nvidia has done a great deal for gaming over the past decade. Features like hardware PhysX support and 3D gaming may never have gone truly mainstream, but they were appreciated premium features for the gamers that wanted them. G-Sync, by all accounts, offers real advantages as well. GameWorks, however, doesn't just offer Nvidia customers an advantage — it curtails developer freedom and sharply limits AMD's ability to optimize as well. Even if Nvidia never deliberately sabotages GameWorks code to run poorly on AMD or Intel GPUs, the inability to optimize these functions is itself a genuine competitive disadvantage.

Sauce: Nvidia's GameWorks program usurps power from developers, end-users, and AMD

I didn't pasta everything btw. Bolded interesting parts.
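Side note from me on that Intel compiler bit above: the whole problem is dispatching on the CPU vendor string instead of on the actual feature bit. A minimal sketch in C of the two checks, using GCC/Clang's <cpuid.h> helpers; the function names below are mine for illustration, nothing from Intel's compiler or GameWorks.

CODE
/* Fair dispatch (feature bit) vs. vendor-gated dispatch (vendor string).
 * Builds with GCC or Clang on x86. */
#include <cpuid.h>
#include <stdio.h>
#include <string.h>

/* Fair check: ask the CPU whether it can actually execute SSE2. */
static int has_sse2(void)
{
    unsigned int eax, ebx, ecx, edx;
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
        return 0;
    return (edx >> 26) & 1;          /* CPUID leaf 1, EDX bit 26 = SSE2 */
}

/* Vendor-gated check: only trust the capability if the vendor string matches. */
static int is_genuine_intel(void)
{
    unsigned int eax, ebx, ecx, edx;
    char vendor[13];
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 0;
    memcpy(vendor + 0, &ebx, 4);     /* vendor string is stored EBX, EDX, ECX */
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);
    vendor[12] = '\0';
    return strcmp(vendor, "GenuineIntel") == 0;
}

int main(void)
{
    /* Feature-based dispatch: behaves the same on Intel and AMD parts. */
    printf("SSE2 supported: %s\n", has_sse2() ? "yes" : "no");

    /* Vendor-based dispatch: an AMD chip with SSE2 still falls back to the
     * slow path, which is the behaviour the article objects to. */
    if (is_genuine_intel() && has_sse2())
        printf("vendor-gated path: fast SSE2 code\n");
    else
        printf("vendor-gated path: generic fallback\n");
    return 0;
}

The feature-bit check runs the fast path on any x86 chip that actually has SSE2; the vendor-string check is what kept AMD chips out of the optimized paths even though they supported the instructions.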

Dec 30 2013, 10:50 AM | Post #270
QUOTE(Powet @ Dec 30 2013, 10:29 AM)
guys, why can't I pick another option like maintain aspect ratio?

Try http://support.amd.com/en-us/kb-articles/P...GPUScaling.aspx

Dec 30 2013, 01:22 PM | Post #271
QUOTE(Unseen83 @ Dec 30 2013, 12:35 PM)
I heard if you put the red sticker on, it will overclock higher.

QUOTE(bingding @ Dec 30 2013, 01:15 PM)
hi, regarding my problem which I posted in post 2260, the shop told me that my R9 290's HDMI wasn't compatible with the Samsung monitor. When he tried connecting the HDMI from the R9 290 to the DVI on an Acer monitor, there was a display, and when he connected to an AOC monitor HDMI-to-HDMI there was also a display. He told me that only the Samsung monitor isn't compatible... is any user here using a Samsung monitor with an R9 290 as well? Don't really know whether he is telling the truth.

What model is that?

Dec 30 2013, 01:52 PM | Post #272
I just noticed this on japamd's upcoming RP: Infinity page:

QUOTE
Stuff going to be added soon

Sauce: Get ready for new RadeonPro: Infinity

Does that mean we will have a ShadowPlay-like feature in a few months?

This post has been edited by Acid_RuleZz: Dec 30 2013, 02:07 PM

Dec 30 2013, 02:12 PM | Post #273
QUOTE(Unseen83 @ Dec 30 2013, 01:58 PM)

It is part of the AMD Media SDK, I think... correct me if I'm wrong.

Dec 30 2013, 06:22 PM | Post #274
QUOTE(chocobo7779 @ Dec 30 2013, 06:19 PM)

I think you can pre-order from Feiton already.

Dec 30 2013, 06:50 PM | Post #275
QUOTE(law1777 @ Dec 30 2013, 06:40 PM)
argh.. the 290'X' is not needed for me as I only game at 1080p now.. for a while I was deciding whether to get the 280X or not, but between the 290 and the 280X there's still a big difference in performance even at 1080p

You can try downsampling if your monitor supports it; it works great with games that have horrible AA like GTA4 or many DX9 games. I play a few games @1440p on my 1080p monitor.
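Quick illustration of what downsampling is actually doing, for anyone curious: the frame is rendered at a resolution higher than the screen and then filtered back down to native res, which is where the extra AA comes from. The driver-level custom resolution does this at the display level; below is just a rough sketch of the same idea in OpenGL, assuming a GL 3.3 context and a loader (e.g. GLEW) are already set up, with draw_scene() as a made-up stand-in for the game's own rendering.

CODE
/* Sketch of downsampling (supersampling): render at 2560x1440 into an
 * offscreen framebuffer, then filter-blit down to a 1920x1080 screen.
 * Assumes a GL 3.3+ context is current and glewInit() has been called;
 * draw_scene() is hypothetical. */
#include <GL/glew.h>

#define HI_W 2560
#define HI_H 1440
#define LO_W 1920
#define LO_H 1080

void draw_scene(void); /* the game's rendering; not defined here */

static GLuint fbo, color_tex, depth_rb;

void create_supersample_target(void)
{
    glGenTextures(1, &color_tex);
    glBindTexture(GL_TEXTURE_2D, color_tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, HI_W, HI_H, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    glGenRenderbuffers(1, &depth_rb);
    glBindRenderbuffer(GL_RENDERBUFFER, depth_rb);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, HI_W, HI_H);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, color_tex, 0);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                              GL_RENDERBUFFER, depth_rb);
}

void render_frame(void)
{
    /* 1. Render the whole frame at the higher resolution. */
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glViewport(0, 0, HI_W, HI_H);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    draw_scene();

    /* 2. Filtered blit down to the real 1080p backbuffer: averaging the
     * extra samples is what smooths out the aliasing. */
    glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
    glBlitFramebuffer(0, 0, HI_W, HI_H, 0, 0, LO_W, LO_H,
                      GL_COLOR_BUFFER_BIT, GL_LINEAR);
}

The glBlitFramebuffer with GL_LINEAR at the end is the downsample step; everything the game draws before it happens at the higher resolution, which is why it costs so much GPU.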

Jan 1 2014, 07:31 PM | Post #277
QUOTE(Unseen83 @ Jan 1 2014, 07:10 PM)
Hello, I usually go there to compare really low-end cards. Guys, what do you think of GPUBoss? (yes I was, Tags, a noob for recommending this site as a reference for VS comparisons...) On the base site the R9 290X loses all benchmarks against the GTX 780... if you read the comments on the site, the mod Jude Fiorillo says it's going to change. He wrote: "To clarify: we're updating our system to show a broader range of high resolution benchmarks, which is where the R9 290X excels against the Titan and the GTX 780. Right now we don't have full coverage of benchmarks for the R9, which is why the overall score is a bit lower than it would be - once we make these adjustments this will change. We're not trying to be biased and are working to round out our coverage!" But that was like 2 months ago lols...

No, that site is no good and their "Reasons to consider..." are mostly BS. You can't really compare those things between different architectures except TDP, and nobody cares about Passmark scores.

Jan 2 2014, 01:16 AM | Post #279
Gigabyte halts R9 290X production because of a design flaw.

QUOTE
Gigabyte's R9 290(X) graphics cards were among the first custom Hawaii-based models available for purchase. First samples were available before Christmas. Unfortunately, the first batch has a design flaw, caused by a design modification of which Gigabyte was not aware (PCBs are not manufactured by Gigabyte).

Gigabyte: There is a problem with the heat sink. The production sample is different from the sample provided by our vendor. The sink for early media samples is not optimized to meet the design requirement. We will improve it immediately and we stopped mass production today.

Apparently Gigabyte rushed the production without proper testing. The first review samples were in fact different from the graphics cards available in the retail channel. The manufacturer is working on a solution, which will be implemented with a new batch released this month. If you were planning to buy a Gigabyte R9 290(X) graphics card, I suggest you look at other models instead.

Sauce: Gigabyte halts Radeon R9 290(X) WindForce production due to design flaw

Jan 2 2014, 01:15 PM | Post #280
Sapphire R9 290X review from PCPer.

QUOTE
Taking those clock rate changes into account, the Tri-X was running at an average clock speed of 1214 MHz when set to 1225 MHz, a nice bump of 14% over the standard clock rate of the card out of the box.

Just like I saw with the ASUS DirectCU II R9 290X card from earlier in the month, the Sapphire Tri-X R9 290X fixes basically all of the problems and complaints I had with the reference design of the Radeon R9 290 and R9 290X products. These GPUs are meeting their specified clock speeds without complaints and are doing so with lower temperatures (and even lower noise levels) as well. The base clock rate of the Hawaii GPU on the Sapphire card is 1040 MHz and, while that is only 40 MHz higher than the rated speed of the reference designs, it makes the Tri-X model a much faster GPU in practice.

Final Thoughts

Sapphire's R9 290X Tri-X 4GB graphics card is among the fastest we have ever tested at PC Perspective. Its overclocked settings out of the box, at 1040 MHz GPU clock, are a bit lower than the ASUS model but some very simple and basic overclocking can easily level the playing field. The GeForce GTX 780 Ti now has another strong competitor in the performance department that also comes with a $100 lower price tag.

Once again, my conclusion is based solely on the fact that these parts SHOULD be available at these prices sometime in the not-too-distant future. If the retail partners and etailers continue to jack up the prices on AMD's R9 series of graphics cards, our outlook could change pretty dramatically. For now, the Sapphire R9 290X Tri-X 4GB looks to be another fantastic retail Hawaii GPU.

Sauce: Sapphire Radeon R9 290X Tri-X 4GB Graphics Card Review

Beats the ref GTX 780 Ti in Bioshock Infinite @ 1440p. Look at that pretty frametime.
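For anyone new to reading those graphs: a frametime plot shows how long each individual frame took, so a flat line means smooth pacing and spikes mean stutter, even when the average FPS looks fine. A throwaway sketch of how you could summarise such a log yourself; the file name and format here are made up for illustration, real FRAPS/FCAT dumps need a bit of massaging first.

CODE
/* Summarise a frametime log: one frame time in milliseconds per line. */
#include <stdio.h>
#include <stdlib.h>

static int cmp_double(const void *a, const void *b)
{
    double x = *(const double *)a, y = *(const double *)b;
    return (x > y) - (x < y);
}

int main(void)
{
    static double t[100000];              /* up to 100k frames */
    size_t n = 0;
    double sum = 0.0;

    FILE *f = fopen("frametimes_ms.txt", "r");   /* hypothetical log file */
    if (!f) { perror("frametimes_ms.txt"); return 1; }
    while (n < 100000 && fscanf(f, "%lf", &t[n]) == 1)
        sum += t[n++];
    fclose(f);
    if (n == 0) return 1;

    qsort(t, n, sizeof t[0], cmp_double);

    /* Average FPS tells you speed; the 99th-percentile frametime tells you
     * how bad the occasional slow frame (stutter) is. */
    double avg = sum / n;
    double p99 = t[(size_t)(0.99 * (n - 1))];
    printf("frames: %zu  avg: %.2f ms (%.1f fps)  99th pct: %.2f ms\n",
           n, avg, 1000.0 / avg, p99);
    return 0;
}

A "pretty" frametime graph is one where the 99th percentile sits close to the average, which is exactly what the Tri-X plot above shows.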
Topic Closed