 AMD Radeon™ Discussion V12, Latest - 14.12 | WHQL - 14.12

JohnLai
post Jun 6 2015, 08:54 PM

Skeptical Cat
Senior Member
3,669 posts

Joined: Apr 2006
QUOTE(Minecrafter @ Jun 6 2015, 08:48 PM)
Although Intel is better, games are using more cores. Even at entry level, if the person plays games like DA:I, I'm kinda thinking of an FX-6300 build.

I saw a comment on one of LinusTechTips' videos; the person said, "Linus, why have there been no videos with AMD CPUs in a long time?" Lol. laugh.gif
Are you sure? The Palit GTX 750 StormX OC or Dual 2GB is RM567.10 at Jayacom. tongue.gif Although the R7 265 is better.
CyntrixTech.. brows.gif Zotac, 1 fan.
*
Well, this ought to do.
I'll sell my twin 260Xs to a friend for RM350 each and buy that. Thanks for the info. rclxms.gif
JohnLai
post Jun 6 2015, 10:19 PM

QUOTE(terradrive @ Jun 6 2015, 10:12 PM)
I see, no wonder. Running an i3 with 260X CF isn't that good an idea.
*
Can't do much. Sometimes I spend two weeks on site A, then three weeks on site B. What a hectic life...
No budget left. I've needed to cut costs ever since GST was enforced. cry.gif

If I don't do some gaming as entertainment, I think I might go crazy soon. laugh.gif


JohnLai
post Jun 7 2015, 12:58 AM

QUOTE(kizwan @ Jun 7 2015, 12:47 AM)
Pretty much you have never tried Mantle before and just take whatever you read on the internet as 100% accurate. The whole point of Mantle is to reduce CPU overhead, and it does work.
*


Are you sure you're not the one who's confused here? Well, I'm not talking about DX9 or DX11 here but Mantle. (I shortened the previous reply above to show that's the only thing I'm commenting on.) You said it's useless, which is not true. Did you ever try it? I don't think so. AMD has updated Mantle at least a few times since release; are you sure it wasn't a new update of Mantle in the driver that fixed the problem with the 285? I have been using Mantle since release in BF4 and never looked back. Nowadays I'm no longer playing games at 1080p; now it's 1440p or 4K resolution for me, so I'm using DX11 again, even in BF4. At the moment I'm only playing two games, BF4 and GTAV, with DX11 at 1440p, and I don't have any complaints about performance whatsoever.

You're talking about driver overhead but keep linking the issue to DX9 and DX11. What about driver overhead with Mantle then?

Have you seen the AMD FX CPU with a quad-290X rig? Note: he doesn't play at 1080p, of course.
*
Can somebody help me explain this to kizwan?

EDIT: Formatting.

This post has been edited by JohnLai: Jun 7 2015, 01:02 AM
JohnLai
post Jun 7 2015, 10:34 AM

QUOTE(kizwan @ Jun 7 2015, 04:31 AM)
Well, why don't you then? lol

We all know the issue with AMD's driver is high DX11 CPU overhead and usage. This is what driver overhead actually refers to. When you said Mantle or DX12 is useless if AMD doesn't fix the driver overhead issue, it didn't make sense to me. To solve the driver overhead issue, AMD introduced Mantle, and thanks to Mantle, Microsoft is now introducing DX12. Using BF4 as an example, since it's a DX11-enabled and Mantle-enabled game, performance when using Mantle is much better than when using DX11 because the API's CPU overhead and CPU usage are reduced, allowing more draw calls per second. However, while Mantle works great with GCN 1.1 cards (Hawaii), it doesn't look so great with older cards. I'm afraid this trend may carry forward when Win 10 and DX12-enabled games are available. We'll see. Right now I think that to really utilize DX12, you need at least a Hawaii card (or cards) if you go with AMD.

If DX12 is anything like Mantle, we will see better performance once Win 10 and DX12-enabled games are available. However, AMD will still struggle when running Nvidia-optimized games, so we will still see the same crap that we're seeing currently. With closed code, e.g. GameWorks, it's hard for AMD to release optimized drivers for those games.
*
http://www.techspot.com/review/793-thief-b...le-performance/
First, take a look at the Battlefield 4 result with the Mantle renderer. Since AMD and DICE work very closely together, both were able to optimize the Mantle API and Catalyst overhead accordingly. Look at the 290X results: the FX-8350, A8-7600, i3-4130 and i7-4770 all produce the same FPS, indicating the 290X has become the bottleneck, i.e. maximum/optimal GPU utilization. thumbup.gif

Second, take a look at the Thief Mantle result. Thief was developed by Eidos Montreal using the Mantle API, but that studio doesn't work as closely with AMD as DICE does. The Mantle result here is very strange, and it's one of the reasons why I suspect driver overhead is still an issue. hmm.gif Look at the bar chart: the Core i3-4130 and Core i7-4770 are actually faster than the FX-8350 in Mantle mode by a tiny margin, indicating there is still some driver overhead.
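To make the way I'm reading these charts concrete, here's a minimal Python sketch using hypothetical FPS numbers (not the actual TechSpot figures): if FPS barely moves across wildly different CPUs, the GPU is the limit; if faster CPUs still pull ahead under Mantle, some CPU/driver overhead remains.

```python
# Minimal sketch with made-up FPS numbers (not the TechSpot data) showing how
# to read a CPU-scaling chart: a tiny spread across CPUs means the GPU is the
# bottleneck, a larger spread means CPU/driver overhead is still in play.

def relative_spread(fps_by_cpu):
    """Relative FPS spread across CPUs: (max - min) / max."""
    lo, hi = min(fps_by_cpu.values()), max(fps_by_cpu.values())
    return (hi - lo) / hi

# Hypothetical 290X results under Mantle
bf4_mantle   = {"FX-8350": 79, "A8-7600": 78, "i3-4130": 79, "i7-4770": 80}
thief_mantle = {"FX-8350": 70, "i3-4130": 74, "i7-4770": 76}

for name, fps in (("BF4 Mantle", bf4_mantle), ("Thief Mantle", thief_mantle)):
    spread = relative_spread(fps)
    verdict = "GPU-bound" if spread < 0.03 else "CPU/driver overhead still visible"
    print(f"{name}: spread {spread:.1%} -> {verdict}")
```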

QUOTE(empire23 @ Jun 7 2015, 06:55 AM)
It's not black and white. Depending on the inherent efficiency of the rendering interface and any additional abstraction steps the driver may or may not have to make, the choice of Mantle or DX can affect things.

Graphics is a lot more complex than a simple SIP pipeline. For example, a driver might have issues with texture calls and transfers being the most cycle-heavy, so a CTM (close-to-metal) API like Mantle, where you can actively manage the hierarchy yourself, can reduce CPU usage as a whole, even on the driver side of things. This is dependency in action.
*
Still, the driver needs to expose the low-level interface first.

QUOTE(area61 @ Jun 7 2015, 09:49 AM)
It's hard to see anyone running an AMD processor and pairing it with enthusiast Radeons. Basically, the issue of CPU overhead is moot when running on an Intel processor.
*
Well.....what can I say....
JohnLai
post Jun 7 2015, 10:50 AM

QUOTE(terradrive @ Jun 7 2015, 10:39 AM)
I don't think that is the case. If there were driver overhead, the i3-4130 should perform significantly worse than the i7-4770. The FX-8350 simply cannot match Intel CPU performance in gaming.
*
Eh? But the graph did show the 4130 performing worse than the 4770 in both Mantle and DX11...
Notice the 8350's extreme performance improvement when changing from DX11 to Mantle.
Compare the 8350 under Mantle with the 4130 under Mantle.

To be honest, in Thief's Mantle mode, I don't expect a 4-module (8-thread) FX-8350 at 4.0GHz to lose to a 2-core, 3.40GHz i3-4130 with Hyper-Threading.
JohnLai
post Jun 7 2015, 04:27 PM

QUOTE(cstkl1 @ Jun 7 2015, 04:17 PM)
empire23
Personally, what I fear is that the future will be one-sided.

Maxwell is akin to the Core 2 Duo era for Intel. Everybody thought AMD would challenge soon.

AMD had better not rest on their laurels. They need to make more efficient GPUs.
They have lost desktop CPUs, then notebook CPUs and iGPUs, and they're nowhere to be found in small-form-factor PCs/tablets/ultrabooks, etc.

The 290X was OK because a lot of people ignored its power requirements.

If they don't challenge Nvidia at all levels this time, I fear what's going to happen with Pascal. Maxwell is a truly incredible jump from Kepler.

The 390X and Fury X, or whatever the naming scheme is, had better rise up. Nobody cares about or even bothers with AMD's CPU refreshes and updates, but everyone knows Intel's roadmap. Let's pray this doesn't happen with discrete GPUs.
*
The bad news, if the rumour is to be believed, is that Nvidia has already taped out its Pascal microarchitecture...

On AMD's roadmap there's HBM v2 with double the memory size and bandwidth... but where is the roadmap for GCN itself after Fiji? We know the upcoming Fiji will be based on GCN 1.2...

Guys, there was a rumour floating around that the upcoming Grenada etc. will be manufactured by GlobalFoundries instead of TSMC.
Any opinions on this?
EDIT: Source : http://hexus.net/tech/news/industry/78757-...foundries-2015/

This post has been edited by JohnLai: Jun 7 2015, 04:40 PM
JohnLai
post Jun 13 2015, 10:40 AM

QUOTE(hazard_puppet @ Jun 13 2015, 10:18 AM)
lol, this guy didn't even know the R9 290X doesn't have a CrossFire port... haha
*
And this poor guy actually thinks the 390X has 4096 GCN cores... doh.gif

I think he didn't know the 390X is a rebranded/refined version of the 290X with 2816 GCN cores...
JohnLai
post Jun 16 2015, 09:17 PM

By the way, is there any stream other than www.twitch.tv/amd for the so-called upcoming furry (instead of fiery Fury) reveal?

Twitch requires a Flash Player installation -.- and I don't want to install it.
JohnLai
post Jun 17 2015, 12:18 AM

109 = 360
149 = 370
199 = 380
*329 = 390
429 = 390X

Prices in USD...

* = not sure if I heard this one correctly...
JohnLai
post Jun 17 2015, 01:05 AM

649 = Fury X, water cooled
549 = Fury X, air cooled

-.-.....
JohnLai
post Jun 17 2015, 01:12 AM

The only problem is the Fury X consuming a crazy amount of power despite the savings from the switch to HBM.

JohnLai
post Jun 17 2015, 01:35 AM

QUOTE(Acid_RuleZz @ Jun 17 2015, 01:15 AM)
What is the power consumption?
*
The Nano version consumes half the power of the 290X, according to one of the presenters.

One of the slides mentioned 4x performance/watt (I believe this is for HBM compared to GDDR5).

The 290X (2816 cores) consumes about 290 watts (in general).

For the Fury X (4096 cores), let's assume it still used GDDR5 for a rough calculation:

2816 cores = 290 watts
4096 cores = ~421 watts (linear scaling)

On the 28nm node, it would be suicide to use GDDR5 with 4096 GCN cores.

Now, let's take an optimistic look at Samsung's GDDR5 power consumption in watts:
http://www.samsung.com/us/business/oem-sol...Green-GDDR5.pdf

-.- The 290X has a 512-bit bus with 16 chips of 256MB GDDR5.

Since Fury will use 4GB of HBM as well (4 HBM stacks around the GPU core), simply take the 4x performance/watt figure and divide the memory power accordingly.

A rough calculation puts Fury X power consumption at around 300 watts, plus or minus 10%.

As I said, this is just a rough estimate. AMD will probably clock the GPU core at an insane level (the reason for water cooling), so my estimate is the minimum possible value.
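For anyone who wants to poke at the numbers, here's a back-of-envelope Python sketch of the same estimate, splitting core and memory power instead of scaling the whole board. Every figure in it is my own assumption (especially the ~75 W I assign to the 290X's 512-bit GDDR5 subsystem), not an AMD spec; it just shows why sticking with GDDR5 is a non-starter and how HBM pulls the figure back toward the ~300 W ballpark.

```python
# Back-of-envelope Fiji power estimate. All inputs are assumptions for
# illustration, not AMD specifications.

CORES_290X, BOARD_290X = 2816, 290.0   # 290X typical gaming board power (W)
CORES_FIJI = 4096
GDDR5_SUBSYSTEM = 75.0                 # assumed: 512-bit GDDR5 DRAM + PHY/controller (W)
HBM_SUBSYSTEM = GDDR5_SUBSYSTEM / 4    # per the "4x performance/watt" HBM claim

core_only_290x = BOARD_290X - GDDR5_SUBSYSTEM              # ~215 W for the GPU core
core_only_fiji = core_only_290x * CORES_FIJI / CORES_290X  # ~313 W at the same clocks and node

fiji_with_gddr5 = core_only_fiji + GDDR5_SUBSYSTEM  # ~388 W: why GDDR5 was never an option
fiji_with_hbm   = core_only_fiji + HBM_SUBSYSTEM    # ~331 W before any GCN 1.2 / binning gains

print(f"Fiji core scaled from 290X : {core_only_fiji:.0f} W")
print(f"Fiji if it kept GDDR5      : {fiji_with_gddr5:.0f} W")
print(f"Fiji with HBM              : {fiji_with_hbm:.0f} W")
# Efficiency gains from GCN 1.2 and lower temperatures under water cooling would
# pull this down further, toward the ~300 W (plus/minus 10%) figure estimated above.
```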
JohnLai
post Jun 17 2015, 10:34 AM

To be honest, I am disappointed with AMD's lineup from the 360 to the 390X.

I saw many AMD defenders on other forums defending the rebranding, saying Nvidia did it too with the 8800GTX, 9800GTX, 9800GTX+ and GTS250...

But they forget the incremental features Nvidia added to each 'rebrand'.
From 8800GTX to 9800GTX, Nvidia added an H.264 and VC-1 PureVideo 2 (unofficial name) decoding block, on top of a die shrink from 90nm to 65nm, giving it higher clock speeds and improved power usage (155 watts down to 140 watts).
From 9800GTX to 9800GTX+, another die shrink from 65nm to 55nm gave it a higher-clocked GPU core with a slight power increase (140 watts to 141 watts), negating the power saving from the shrink, in order to compete with the HD 4850.
From 9800GTX+ to GTS 250, the reference board was optimized to be smaller and use fewer components and less power.


The Fiji family is the only range that has my interest, thanks to all its additional features: HBM, GCN 1.2 (at least?), and the increases in GCN cores, ROPs and texture units.

EDIT: Are there any incremental features like an updated UVD or VCE for the 360 to 390X, other than higher GPU and memory clocks and larger VRAM?

This post has been edited by JohnLai: Jun 17 2015, 10:36 AM
JohnLai
post Jun 17 2015, 10:47 AM

QUOTE(ruffstuff @ Jun 17 2015, 10:40 AM)
Did anyone discuss the pricing? USD649 and USD549; I think this is great pricing. The Fury X is way cheaper than the Titan X, and the air-cooled Fury is slightly cheaper than the 980 Ti.

The air-cooled Fury caught my attention; I'll probably get one and water cool it to fit into my NCASE M1. I've started thinking about a waterblock, or whether I can reuse my existing universal block with proper mounting brackets.
*
About the pricing though, I read on some forum that someone on Reddit said he would eat his shoe or sock if the Fury X cost less than USD650... and AMD priced it at USD649... so... whistling.gif
JohnLai
post Jun 17 2015, 10:52 AM

QUOTE(ruffstuff @ Jun 17 2015, 10:49 AM)
Suddenly I'm so excited about going the FreeSync route instead of G-Sync. The perfect G-Sync/FreeSync monitor is still not available in Malaysia, so whichever comes first, I'll go for that.

That small form factor of the air-cooled Fury really caught my attention, and of course the price. thumbup.gif
*
Both FreeSync and G-Sync monitors really cost a bomb.
I wouldn't recommend buying one yet. There are still some issues associated with variable refresh rate on both.
JohnLai
post Jun 17 2015, 03:45 PM

QUOTE(Acid_RuleZz @ Jun 17 2015, 03:10 PM)
Enough with GameWorks, guys; it's AMD's, Nvidia's and the developers' fault.

Yeah, the FC4 4K benchmark looks good, but I take it with a grain of salt because it's not from trusted reviewers.

user posted image
*
Take this with a grain of salt. For all we know, AMD may have taken the FPS reading inside a Far Cry 4 village/town house (less stuff to render) instead of during actual gameplay, shooting or hunting in the jungle. (A cherry-picked result; AMD has a track record of doing this.)

JohnLai
post Jun 17 2015, 03:54 PM

QUOTE(terradrive @ Jun 17 2015, 03:52 PM)
I don't think it's a cherry-picked result, more like a cherry-picked game.

user posted image
290X performs really well for Far Cry 4 4K at Ultra settings.
*
So... the fact that AMD uses FC4 as its 4K benchmark, with results of 54 and 43 FPS... means other games will get less than 30 FPS? doh.gif
JohnLai
post Jun 17 2015, 04:00 PM

QUOTE(terradrive @ Jun 17 2015, 03:58 PM)
4K is too overhyped; I'd rather go for 144Hz 1440p than 4K.
*
Speaking of 144Hz, both Nvidia and AMD GPUs have problems idling properly when 144Hz is in use.
Reducing the refresh rate to 120Hz will fix the GPU idling issue, but that isn't a real solution.
JohnLai
post Jun 17 2015, 04:06 PM

QUOTE(cstkl1 @ Jun 17 2015, 04:02 PM)
Err, I found it happens on altered VBIOSes. Reference doesn't seem to be affected.
*
Are you certain? It happens on my official beta VBIOS for the Sapphire 260X (to fix the 2D-clock horizontal flickering issue) with my 144Hz 1080p monitor.

Even the GTX 970 exhibits the same issue, where it can't drop to its proper idle state. hmm.gif
JohnLai
post Jun 20 2015, 11:49 AM

QUOTE(cstkl1 @ Jun 20 2015, 11:17 AM)
TDP, dude.

A lot of people don't seem to understand TDP. There are a lot of noobs in the PC Gaming Community (Malaysia) FB group.

Nvidia measures TDP at 85C with their own respectable dB sound level. That's why the clock rates are set so low, with massive OC headroom reserved;
until they see better yields with lower temps, then they up it. Just like the Titan Black vs the 780 Ti. It's not for fun.

The Fury X, in comparison, should sit at 50C. Need to see the air-cooled version.

Temps affect the TDP value.
*
Have you seen Nvidia GPU results with water cooling yet?
You will be surprised by the water-cooled GTX 980's temperatures.

In the pcper.com article on EVGA's Hydro Copper GTX 980 water block testing, the GPU ran at a base clock of 1407MHz with an average temperature of only 40°C.
Most of the time was spent at the 1582MHz boost clock.
The reviewer said the only thing limiting further overclocking is software-limited voltage regulation.

If the Fury X is aimed at competing with the GTX 980 Ti, I think Nvidia will probably follow suit with a similar 'official' water-cooled edition.
Two can play at that game.
For instance, the EVGA GTX 980 Ti Hybrid. smile.gif

Notice that most benchmarks claim the Fury X beats the GTX 980 Ti at its stock clock.

Too bad Nvidia will probably artificially limit the maximum allowable voltage.


