
 AMD Radeon™ Discussion V12, Latest - 14.12 | WHQL - 14.12

stringfellow
post Jun 17 2015, 01:30 AM

Interesting point brought up on the Overclock.net forum, based on this claim from the presentation slides:

user posted image

Meaning that the dual Fiji + HBM card (coming in fall 2015) is the world's fastest graphics card, i.e. a dual-GPU card on a single PCB, à la the R9 295X2. Which means a single-Fiji HBM card is NOT the world's fastest graphics card?

Also, they fail to mention VRAM size. HBM1 is limited to 4GB of VRAM, and as much as anyone wants to spin that 4GB is good enough for 4K, it isn't. GTAV, one of the games that presentation claims delivers the best 4K experience on the Fury cards, eats more than 4GB of memory at Very High settings alone: 4.5-5GB depending on which settings are enabled. If they want to claim the "best 4K experience", running out of VRAM and stuttering all over shouldn't be part of that experience. Consider this rebuttal moot if Fury comes with 6 or 8GB of VRAM.
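Quick back-of-envelope on why 4GB gets tight at 4K (a rough sketch only; the buffer counts and the ~3GiB asset working set are my own illustrative assumptions, not measured figures from GTAV or any driver):

```python
# Back-of-envelope VRAM estimate at 4K. All game-side numbers are
# illustrative assumptions, not measurements from GTAV or any driver.

W, H = 3840, 2160
pixels = W * H                      # 8,294,400 pixels per full-res buffer

def mib(nbytes):
    return nbytes / (1024 ** 2)

# A deferred renderer keeps several full-res buffers alive at once
# (G-buffer layers, depth, HDR colour, post-processing chain).
BYTES_PER_PIXEL = 4                 # e.g. RGBA8 or D24S8
NUM_TARGETS = 8                     # assumed G-buffer + post chain
render_targets = pixels * BYTES_PER_PIXEL * NUM_TARGETS

MSAA = 4                            # 4x MSAA multiplies colour + depth
msaa_extra = pixels * BYTES_PER_PIXEL * 2 * MSAA

# Textures and geometry are the big consumer; ~3 GiB is a plausible
# assumed working set for a 2015 open-world title at high settings.
assets = 3 * 1024 ** 3

total = render_targets + msaa_extra + assets
print(f"render targets  : {mib(render_targets):6.0f} MiB")
print(f"4x MSAA extra   : {mib(msaa_extra):6.0f} MiB")
print(f"assets (assumed): {mib(assets):6.0f} MiB")
print(f"total           : {mib(total):6.0f} MiB vs 4096 MiB of HBM1")
```

Even before OS and driver overhead, that leaves very little headroom, which squares with the 4.5-5GB figures reported for GTAV at Very High.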

I like the Project Quantum form factor though. It fits living-room aesthetics, very minimalistic. They mentioned it being equipped with dual Fiji; that should be ample performance to pair with a 4K UHD living-room TV over HDMI 2.0 at a 60Hz refresh rate.
stringfellow
post Jun 17 2015, 01:44 AM

275W was mentioned for the Fury X during the banter between Koduri, that engineer, and the other guy.
stringfellow
post Jun 17 2015, 10:18 AM

QUOTE(Human10 @ Jun 17 2015, 09:20 AM)
You are too paranoid.

It's natural that the Fury X alone can't claim the title of most powerful card, since there is a card featuring two of them; of course the "most powerful" title goes to the latter.
*
You are too forgiving and accepting. It's NATURAL to question all the hyperbole and marketing claims on these slides. Of course two is better than one. The point of the matter is that it took them TWO Fijis to claim that, instead of one. What's stopping them from putting a "world's fastest single GPU" claim right next to the Fury X on that same slide, instead of hanging "world's fastest graphics card" on the dual Fiji?

CFX implementation on anything AMD is still poor, so even if the dual card does claim the title of world's fastest graphics card, having to deal with CFX peculiarities like micro-stuttering and missing CFX profiles does not ensure "the best 4K experience" as they claimed. It's like having the fastest Ferrari around but handing it to an 18-year-old who just passed his driving test. A single-GPU card is still the metric to look for to avoid CFX complications. rolleyes.gif

You can be defensive about folks questioning AMD all you want, but it doesn't help their cause when they're not being true to their own claims. Or in this case, skirting around with vague claims and statements.

Personally, THIS was the only good thing to come out of the presentation yesterday:



It shows forward thinking and innovation on AMD's part: thinking outside the box about how their HBM implementation can shrink the PC into a form factor that was not possible before. Almost like a PC version of that dustbin-looking Mac Pro.

This post has been edited by stringfellow: Jun 17 2015, 10:31 AM
stringfellow
post Jun 17 2015, 03:12 PM

QUOTE(Acid_RuleZz @ Jun 17 2015, 03:10 PM)
Enough with GameWorks, guys; it's all AMD's, Nvidia's and the developers' fault.

Yeah, the FC4 4K benchmark looks good, but I take it with a grain of salt because it's not from trusted reviewers.

user posted image
*
I know you said enough with GameWorks, but... is that figure with or without GameWorks on? tongue.gif

And I guess the question of VRAM is answered: 4GB of HBM only. Unless they figure out a way to compress textures or flush texture data fast enough to compensate for that frame buffer size, expect the all-too-familiar symptoms of running out of video RAM at 4K. After all, this is targeted at 4K users, right?

user posted image
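To put "flush the texture data fast enough" in perspective, a quick calculation (the 512MiB swap size is an assumed example; ~15.75GB/s is the theoretical PCIe 3.0 x16 peak):

```python
# How long does paging textures back into VRAM take over the bus?
# The 512 MiB swap size is an assumed example, not a measured figure.

PCIE3_X16_BPS = 15.75e9            # ~15.75 GB/s theoretical peak, PCIe 3.0 x16
swap_bytes = 512 * 1024 ** 2       # assume 512 MiB must be re-uploaded

transfer_ms = swap_bytes / PCIE3_X16_BPS * 1000
frame_budget_ms = 1000 / 60        # 16.7 ms per frame at 60 fps

print(f"transfer time : {transfer_ms:.1f} ms")          # ~34 ms
print(f"frame budget  : {frame_budget_ms:.1f} ms at 60 fps")
print(f"frames missed : {transfer_ms / frame_budget_ms:.1f}")
```

Even at the bus's theoretical peak, one big re-upload swallows about two whole frames at 60fps, and that's the hitch you feel.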

This post has been edited by stringfellow: Jun 17 2015, 03:17 PM
stringfellow
post Jun 17 2015, 03:32 PM

QUOTE(Acid_RuleZz @ Jun 17 2015, 03:25 PM)
No idea; that's why I'll wait for trusted reviewers. Can't you? tongue.gif
*
When people can't even take what AMD themselves say with confidence... tongue.gif


stringfellow
post Jun 17 2015, 03:47 PM

QUOTE(Acid_RuleZz @ Jun 17 2015, 03:40 PM)
Haha, just a little skeptical because they didn't show the numbers at E3 last night.

*
Not to mention that they used FC4, which has no built-in benchmark. Discrepancies against other published benchmark articles are a given unless they bench similar runs at similar scenes in the game. They should've used a game like GTAV with its preset benchmark.
stringfellow
post Jun 17 2015, 04:04 PM

QUOTE(Unseen83 @ Jun 17 2015, 03:55 PM)
Not a bad price, I say, for NEW HBM memory. I guess people expecting a high price for HBM were a bit wrong, as Nvidia charges the same price for its GDDR5-memory GPUs.

[attachmentid=4490018] [attachmentid=4490019]
*
Fury falling flat on its face in the 5K bench is a sign of it running on fumes, or out of video memory. Looks like 4K users will have to tread the fine line of adjusting individual settings (instead of one-click Very High or Ultra) in their games: definitely no fancy AA (not required at such a high res anyway) or AO, and even tweaking draw distance and crowd density in GTAV, for example.
stringfellow
post Jun 22 2015, 01:46 PM

DisplayPort-to-HDMI 2.0 active adapters are extremely finicky to work with and add a cost of USD100 MINIMUM on top of your graphics card purchase.

I was interested in Project Quantum for my living room; not anymore. A shame, since the Nano and Quantum are perfect for this scenario, but AMD decided to cheap out on the HDMI specification.
stringfellow
post Jun 22 2015, 03:53 PM

The problem with HDMI 1.4 versus HDMI 2.0 for 4K60 is that on TV sets here it's either 4K30 over HDMI 1.4 or 4K60 over HDMI 2.0, nothing in between. Unless there's a 55-inch-or-larger PC monitor running 4K60 that I can use as a living-room TV replacement, this still sinks the decision for anyone who saw the Nano and Project Quantum as a viable HTPC solution, let down by AMD's lack of foresight.
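The timing math backs this up. This sketch assumes the standard CTA-861 4K raster (4400×2250 total pixels including blanking) at 8-bit RGB, with the spec TMDS clock ceilings of 340MHz for HDMI 1.4 and 600MHz for HDMI 2.0:

```python
# Why HDMI 1.4 tops out at 4K30 while HDMI 2.0 manages 4K60.
# Assumes the standard CTA-861 4K raster: 4400 x 2250 total pixels
# (3840 x 2160 active plus blanking) at 8-bit RGB.

TOTAL_W, TOTAL_H = 4400, 2250
HDMI_14_MAX_MHZ = 340      # HDMI 1.4 TMDS clock ceiling
HDMI_20_MAX_MHZ = 600      # HDMI 2.0 TMDS clock ceiling

def pixel_clock_mhz(refresh_hz):
    return TOTAL_W * TOTAL_H * refresh_hz / 1e6

for hz in (30, 60):
    clk = pixel_clock_mhz(hz)
    ok14 = "fits" if clk <= HDMI_14_MAX_MHZ else "too fast"
    ok20 = "fits" if clk <= HDMI_20_MAX_MHZ else "too fast"
    print(f"4K{hz}: {clk:.0f} MHz -> HDMI 1.4 {ok14}, HDMI 2.0 {ok20}")
```

4K60 needs a 594MHz pixel clock, so full 8-bit RGB is physically out of reach for HDMI 1.4; it's a spec ceiling, not a driver issue.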
stringfellow
post Jun 22 2015, 09:21 PM

QUOTE(ruffstuff @ Jun 22 2015, 09:14 PM)
I remember AMD demoed FreeSync over HDMI at Computex. Not sure if it was 1.4 or 2.0.

It depends on the scaler and software. I do hope we can have a true 24p (24Hz) experience for watching movies, and VRR when playing game consoles over HDMI on LCD TVs, in the future.
*
I've already delegated 4K movie watching to my recently purchased Shield Pro console (which has HDMI 2.0 for uncompromised 4K60) after finding out about the lack of HDMI 2.0 on the Fury lineup. I need the device connected to my TV to be as inconspicuous as possible, not a huge honking tower or a bulky cube.
stringfellow
post Jun 22 2015, 11:46 PM

Medium. You don't position a flagship GPU to run on Medium.

This post has been edited by stringfellow: Jun 22 2015, 11:49 PM
stringfellow
post Jun 23 2015, 12:24 AM

Look, if you're buying flagship, you run it at maximum settings. If this is what they got after trying to run it as such, and it can only manage 60fps at Medium, then what's the point? You don't market your flagship running a game on mediocre settings. It's like marketing a movie theatre as IMAX but running the movie in standard definition.

I can run games on dual 4K monitors at Low, but I'm not going to advertise that as a plus point; in fact, I'd be embarrassed to proclaim it.
stringfellow
post Jun 23 2015, 12:37 AM

QUOTE(Unseen83 @ Jun 23 2015, 12:31 AM)
Okay, you have a point... but maybe it's just to show the AMD flagship's capability of running it flawlessly at 5K 60fps, albeit with just Medium settings. As everyone knows, this GPU has no issues with 4K gaming (which is what I'm planning to use it for when I receive the GPU).
*
Or they set it to Medium to avoid hitting the limits of the 4GB VRAM and the stutter that follows; anything higher than Medium breaches that limit.

To each his own. I do not run my games on anything lower than High. Having a UHD display but running it at Low or Medium settings is a waste of that display's capability, to me.

This post has been edited by stringfellow: Jun 23 2015, 12:52 AM
stringfellow
post Jun 23 2015, 12:46 AM

QUOTE(kizwan @ Jun 23 2015, 12:40 AM)
Why embarrassed, running Medium at 5K with a single card? I think you're missing the point there. It's just to show what a single Fury X can do at 5K in the Dragon Age game, not claiming any title. rclxub.gif
*
Sure, it's not claiming any title, so by the way you word it, it's nothing special either. So what's the big deal then?

I'd still say they are purposely AVOIDING any settings higher than Medium to hide the fact that they're running out of VRAM (and the ensuing consequences of VRAM starvation). At 1440p, Dragon Age: Inquisition is already eating up 4GB of video memory.

user posted image

stringfellow
post Jun 23 2015, 12:52 AM

Check the settings screenshot again: it's running at 5K30, NOT 5K60. The reviewer merely mentions that it "feels like running at 60+fps". Which is it? There's also the fact that they mentioned The Witcher 3 at the start of the article and then went about correcting it to Dragon Age: Inquisition without fixing that first mention. That shows how suspect their reporting method is, to me.

user posted image

This post has been edited by stringfellow: Jun 23 2015, 12:53 AM
stringfellow
post Jun 23 2015, 01:14 AM

Then technically it is NOT 60fps; it's merely running above 30fps smoothly enough to prompt that "felt like 60+fps" comment. I don't have to be an AMD fan to like what they're doing, or hate Nvidia, to show appreciation for what they did right. For the same reason, you don't have to be an AMD user here to post what you feel they did wrong. Heck, if I were AMD (or Nvidia), I'd rather listen to constructive criticism to improve than think I did great (Medium at 4K? Seriously.) when it is actually nothing to shout about.

Which would you rather have: people praising you for something mediocre, or people criticizing you for something great? I'd take the latter.

For posterity's sake, I just disabled SLI on my rig and ran a single card at 4K Medium in Dragon Age: Inquisition. I get 60fps (possibly higher, since my monitor only goes up to 60Hz) as recorded by the ShadowPlay FPS counter. I don't feel so special here either. Granted, I'm running a Titan X, but since the Fury X's competitor, the 980 Ti, is a Titan X killer, 60fps at 4K Medium in DA:I is normal. The point to take here: it is normal to get 30-60fps at 4K and above running DA:I on Medium on flagship cards. It's probably astonishing to AMD users here since none of their previous flagship cards were able to do so before the arrival of Fury. But it has been that way with the competitor's cards, so it's nothing much to shout about really. *shrugs*

This post has been edited by stringfellow: Jun 23 2015, 01:19 AM
stringfellow
post Jun 23 2015, 02:08 AM

QUOTE(kizwan @ Jun 23 2015, 01:45 AM)
"The first thing that probably slapped you in the face is the close to 6GB VRAM usage at 4K while our VRAM test cards the R9-290s in Crossfire only have 4GB (mirrored) . This is probably due to a bug on the driver side, so it should be taken with a grain of salt."

Read more at HardwarePal: Dragon Age Inquisition Benchmark Mantle Vs DirectX http://www.hardwarepal.com/?p=7793

Take it with a grain of salt. The reviewer doesn't mention stuttering (which is what you get when the card runs out of VRAM) when testing at 4K with the 290 or the 970. The graph is meaningless to me since the reviewer didn't go into detail on it. For the 290, they should have two graphs, one for DX11 and one for Mantle. Like they mentioned in the article, take it with a grain of salt.
It's not even a review; it's only one page with 3 screenshots. Everything shown there is 5K @30Hz and, like you said, he/she basically said "It was liquid smooth to my eyes, with the graphics being set to 'Medium' at 5K. It felt like 60FPS+". Sorry, I don't see why you need to fuss about this. And no one is going to run 5K, or even 4K, with a single GPU. I also checked other reviews of Dragon Age: Inquisition; they also don't mention stuttering when playing the game at 4K at High or Ultra with either a 295X2 or 980 SLI.
*
I didn't say it was a review. It was a post with 3 screenshots. But which is it? First they say The Witcher 3, then show screenshots of DA:I? The settings screenshot says 5K at 30Hz, then they mention "feels like 60+"? If the post is meant to generate a feel-good vibe for those justifying a Fury X purchase, maybe it works for those already invested in or committed to buying one. Perhaps it serves as justification of purchase, since the person who first posted the link has himself put money down on a Fury X from a local reseller, and posting the link makes his purchase feel justified, maybe? Those sitting on the fence, money in hand but still researching, would, I reckon, prefer an unbiased, trustworthy, proofread article with actual proof over 3 screenshots and a "feels like 60+fps" write-up. They're going to be spending RM2,800-2,900 on this, so proof of such performance would help them make the right decision on whether to buy, rather than fanboyism alone. I was in the market for this myself for a Fury SFF build, so I'd prefer and appreciate an unbiased article over a loosely cobbled-together one like this.

There are no reviews of DA:I with frame-rating data that would shed more light on stuttering, other than a smattering of user posts in forums. That doesn't mean there isn't any. I looked up the usual suspects like TPU, MaximumPC and PCPer, since they're the ones that usually publish frame-rating graphs; nothing there either. Still, that doesn't mean the stuttering doesn't exist.
stringfellow
post Jun 23 2015, 02:45 AM

QUOTE(kizwan @ Jun 23 2015, 02:19 AM)
"AMD user here", yeah, nice trolling.  doh.gif If you can't discuss nicely here without inflaming people here, why bother posting here. According to anandtech review, 290X is only a couple FPS behind 980 in DA: Inquisition @4K @high & ultra settings. If 980 can run @60FPS with medium settings @4K, 290X will be able too. Well, your argument is faulty since you're testing @4K, not 5K.
*
I wasn't trolling; that was an observation. In fact, it was my experience with the past AMD flagship card that it can't do 4K properly at all. I wouldn't have spent RM16,000 on a CrossFire R9 290X rig for 4K if I hadn't given AMD a try. If I were trolling, that would be an expensive price tag to put on it. A very expensive AMD rig built just to troll? Try again.

If you can't take constructive criticism here, why bother posting here? Sure, I'll go ahead and buy the Dell 5K monitor tomorrow just to correct my faulty "argument", just like how I built a CrossFire AMD rig just to "troll" people here. rolleyes.gif

I post from experience: the experience of using an AMD R9 290X and being unable to run 4K satisfactorily without serious repercussions for the rig's lifespan and/or its power draw and heat output. I wasn't arguing just for the sake of arguing. Just because I went Team Green doesn't mean I never considered Team Red viable; I wouldn't have built a rig that meticulously if I hadn't. Try not to make blanket statements or immediately label someone as "trolling" just because he has an Nvidia rig in his sig. It doesn't help, and it only shows how bitter AMD diehards can be. icon_rolleyes.gif
stringfellow
post Jun 23 2015, 03:01 AM

QUOTE(kizwan @ Jun 23 2015, 02:42 AM)
That's the problem right there. Unseen already said in his first post that it's DA: Inquisition. The (little) title wrongly says it's Witcher 3, yes, but if you actually read the short article, it specifically mentions DA: Inquisition. Anyone would be able to tell there's a typo in the title. It's not really an issue.

Like I said, it's not a review, or even an article, to be honest. He/she basically wrote down what he/she saw or felt at the AMD event. There is no biased article/review here. No one in their right mind is going to take this short news piece too seriously, but it's worth sharing here or anywhere else. So there is no reason to fuss about this, really. doh.gif

If the game really were maxing out 4GB of VRAM, then 980 SLI wouldn't be able to run DA: Inquisition at 70FPS or 54FPS at High and Ultra respectively at 4K. No way.
*
LOL, calm down man. You are the one who kept calling this a "review"; nowhere in my previous post did I state it was one. Chill out. AMD cards already run hot; don't tell me the users are hot-tempered too? tongue.gif I was questioning the integrity of the article (with its multiple typos), not Unseen. Read properly and calmly; anger can distort one's conclusions.

My point was that the article, taken as it is, is a simple Twitter-style post that generates hype and excitement without properly stating its parameters. It sure does create excitement here, but it doesn't help potential buyers or fence-sitters decide whether they should jump in or go to the other team. It's just hype. I agree no one is going to take it seriously, but apparently it is you who is taking seriously the fact that I took the article lightly (by questioning its validity). tongue.gif

Even if a game maxes out 4GB of VRAM it will still run; it just has to swap textures and graphics data in and out when the memory buffer runs dry. That cycle of swapping textures and data is what contributes to the stutter. From what little I know lah. Correct me if I'm wrong.
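As a toy illustration of that swap cycle (all numbers made up for the example), a handful of paging stalls barely moves the average fps but is exactly what you feel as stutter:

```python
# Toy frame-time model: steady 16 ms frames, except every 60th frame a
# texture page-in stalls the GPU. All numbers are made up for illustration.

import statistics

STEADY_MS = 16.0       # assumed normal frame time (~60 fps)
STALL_MS = 50.0        # assumed frame time when a page-in hits
STALL_EVERY = 60       # assume roughly one stall per second

frames = [STALL_MS if i % STALL_EVERY == 0 else STEADY_MS
          for i in range(1, 601)]          # ten seconds' worth of frames

avg_fps = 1000 / statistics.mean(frames)
print(f"average fps  : {avg_fps:.1f}")     # still looks like ~60 on a chart
print(f"worst frame  : {max(frames):.0f} ms ({1000 / max(frames):.0f} fps)")
print(f"stall frames : {sum(f > 20 for f in frames)} of {len(frames)}")
```

That's also why a "feels like 60+fps" eyeball test or a bare fps average hides exactly these spikes, and why frame-rating graphs matter.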
stringfellow
post Jun 23 2015, 03:12 AM

QUOTE(kizwan @ Jun 23 2015, 02:57 AM)
Again, you failed to read and understand properly. Seriously. I came to the conclusion you're trolling based on what you wrote, not because you have an Nvidia rig. doh.gif

"It's probably astonishing to AMD users here since none of their previous flagship cards were able to do so before the arrival of Fury. But it has been that way with the competitor's cards, so it's nothing much to shout about really. *shrugs*"

You're basically inflaming people right there. I have no problem with your previous posts, but when you start to drag the discussion down with the line above, it's a problem.

I can take constructive criticism. Inflaming, on the other hand, is not constructive. That is trolling.
*
I would be trolling if my conclusions were drawn from reading other people's experiences rather than from my own experience using one. How can I be trolling when I had been using one in the first place? Sensitive much?

I was in the unique position of having experienced, owned and used both camps' flagship GPUs during their lifetimes. I draw my conclusions from those experiences; I do not base my opinion on what people post or what I read. How can I be trolling when I have used a flagship AMD card and drawn my conclusions from that use? I have to like the AMD flagship, otherwise I'm trolling? Wow, even I don't draw the line there.

I don't cherry-pick my games to favor one team or the other. My choice of games spans a broad spectrum of genres, from AMD-favored games like Crysis 3 and Battlefield 4 (the main reason why I built this: https://forum.lowyat.net/index.php?showtopic=3466728&hl= ) to Nvidia-favored ones like Assassin's Creed Unity and Dying Light. It is from my experience with these games that I concluded that the last generation of AMD's flagship, the 290X, was inadequate for 4K compared to its counterpart, hence why I wrote: "It's probably astonishing to AMD users here since none of their previous flagship cards were able to do so before the arrival of Fury. But it has been that way with the competitor's cards, so it's nothing much to shout about really. *shrugs*" Nothing between the lines; there is no line. It's a straight-up observation from my own experience owning and using pools of cards from the two camps. *shrugs* <----- an innocent shrug, not a sarcastic one. Have to put a disclaimer, otherwise you'd "read between these lines" again.

This post has been edited by stringfellow: Jun 23 2015, 03:22 AM
