 AMD Radeon™ Discussion V15, Radeon Software 20.10.1

SSJBen
post Aug 15 2017, 06:45 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(abu.shofwan @ Aug 15 2017, 10:57 AM)
How does the RPM thing come into play? Will it impact gaming performance?

Just read this piece and was wondering the above

https://www.notebookcheck.net/PS4-Pro-could...s.241410.0.html
*
This is what happens when people who know nothing about graphics engineering write about how it works.
It's like Trump telling people how to play Dota. Imagine it and see how stupid it sounds.

Here's a short gist on WHY FP16 doesn't matter in the grand scheme of things, written by people who know their shit:

QUOTE
Taking advantage of this feature, in turn, requires several things. It requires API support and it requires compiler support, but above all it requires code that explicitly asks for FP16 data types. The reason why that matters is two-fold: virtually no existing programs use FP16s, and not everything that is FP32 is suitable for FP16. In the compute world especially, precisions are picked for a reason, and compute users can be quite fussy on the matter. Which is why fast FP64-capable GPUs are a whole market unto themselves. That said, there are whole categories of compute tasks where the high precision isn’t necessary; deep learning is the poster child right now, and for Vega Instinct AMD is practically banking on it.

As for gaming, the situation is more complex still. While FP16 operations can be used for games (and in fact are somewhat common in the mobile space), in the PC space they are virtually never used. When PC GPUs made the jump to unified shaders in 2006/2007, the decision was made to do everything at FP32 since that’s what vertex shaders typically required to begin with, and it’s only recently that anyone has bothered to look back. So while there is some long-term potential here for Vega’s fast FP16 math to become relevant for gaming, at the moment it doesn’t do much outside of a couple of benchmarks and some AMD developer relations enhanced software. Vega will, for the present, live and die in the gaming space primarily based on its FP32 performance.

The biggest obstacle for AMD here in the long-term is in fact NVIDIA. NVIDIA also supports native FP16 operations, however unlike AMD, they restrict it to their dedicated compute GPUs (GP100 & GV100). GP104, by comparison, offers a painful 1/64th native FP16 rate, making it just useful enough for compatibility/development purposes, but not fast enough for real-world use. So for AMD there’s a real risk of developers not bothering with FP16 support when 70% of all GPUs sold similarly don’t support it. It will be an uphill battle, but one that can significantly improve AMD’s performance if they can win it, and even more so if NVIDIA chooses not to budge on their position.

~ Anandtech



To answer your question: if you think RPM (Rapid Packed Math, AMD's fast FP16 feature) will suddenly improve performance for all games, then you'd be sorely mistaken and in for a rude awakening.
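
To make the precision point above concrete, here's a minimal Python sketch (assuming NumPy is installed; the example values are just illustrative) of why not every FP32 number survives being stored as FP16, which is why compute code has to ask for FP16 explicitly rather than having it swapped in silently:

CODE
import numpy as np

# FP16 has a 10-bit mantissa (~3 decimal digits of precision) and a max
# finite value of 65504, so FP32 values can lose precision or overflow
# outright when stored as half precision.
values_fp32 = np.array([1.0001, 1024.05, 70000.0], dtype=np.float32)
values_fp16 = values_fp32.astype(np.float16)

for v32, v16 in zip(values_fp32, values_fp16):
    print(f"FP32 {v32:<10g} -> FP16 {float(v16):g}")
# 1.0001  -> 1      (spacing between FP16 values near 1.0 is ~0.001)
# 1024.05 -> 1024   (spacing near 1024 is 1.0)
# 70000   -> inf    (above the FP16 maximum of 65504)
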
SSJBen
post Jul 19 2019, 04:52 PM

Modules that have a fairly high success rate of being OC'd to 3600 MHz (widely available in Malaysia) -

G.Skill Trident Z/Royal/RGB - 16GTZR(X)*
Corsair Dominator Platinum - CMD16GX4M2B3000C15

*Note: G.Skill sells a Ryzen-specific module, but there are enough success reports to say the regular non-Ryzen modules work too.
SSJBen
post Oct 29 2020, 01:24 AM

Those are some mighty lofty claims AMD. Hope you guys can walk the talk.
SSJBen
post Oct 29 2020, 01:36 AM

First things first though, please don't release the cards with a broken driver that takes 6 months to fix ffs.... again.

SSJBen
post Oct 29 2020, 02:04 AM

QUOTE(nrw @ Oct 29 2020, 12:51 AM)
No1 wanna talk about how the cache is starting to limit RDNA2 in 4k and above? Check the 2k and 4k slides.
How will AMD possibly support developers for InfinityCache if they can't even support OEM/ODM's with APU's? Forget about that in the near feauture.
SMA sounds great but really scarce on details.

Wonder what prices in here will be like.
The 6800 is priced too high. the 6800XT is fine. the 6900XT likewise.
*
Infinity Cache is not new; dev tools have supported the concept for over a decade now. The Xbox 360 had what was called eDRAM and the Xbox One/One S had eSRAM, and both were AMD GPUs. They're the same concept as Infinity Cache, just that the latter is much larger.
SSJBen
post Oct 30 2020, 04:09 PM

I think, by and large, AMD should only be catching up to Nvidia with Big Navi, not beating them as they claim. That's quite an achievement on its own, as they've been nowhere close to Nvidia for two generations now.

Beating Nvidia is a tough ordeal, especially when AMD doesn't yet have its own AI upscaler. The more important task for them is to ensure the 6000 series cards "just work" out of the box. No dumb bullshit of jumping through hoops with drivers and weird registry tweaks just to get the card stable enough to watch YouTube. That, IMO, is much more important than "beating" Nvidia.
SSJBen
post Oct 30 2020, 11:21 PM

QUOTE(latios507 @ Oct 30 2020, 04:59 PM)
Again, the fact that u need DLSS to run high fidelity games @ 4k >60FPS/8k 60fps instead of native resolution, demonstrated that games can't run a smooth frame rate without DLSS. If DLSS is widely adopted, then yes DLSS is good. But since it's only available in selected title, it has limited use.

I don't see what's with the hype when it's just a upsampling, similar concept to how high end Smart TV processors works. I've seen the videos and imo, I'd pick native resolution anyday e.g. 4K than getting a 1080p up-sampled by x4 to get 4K.

If ur still hung up on DLSS, get a Nvidia card and just be done with it. It's a good GPU. Good luck if u can find one out there.

*
The fact that you compared DLSS to upscaling techniques used in 4k smart TVs tells me that you're actually rather ignorant about the topic of upscaling as a whole. You're oversimplifying it.

UHD TVs generally use:
- Bicubic
- Nearest Neighbor
- Bilinear

Any of these three ancient techniques are used on mainstream UHD TVs, while some of the highest-end TVs have more elaborate in-house techniques that rival Lanczos and Jinc (which are themselves far from cutting edge, having been around for over a decade now). However, all of these upscaling techniques are pixel based (except for Bicubic and Nearest Neighbor, which are texture based). The upscaler takes the image as-is and processes it for the final output resolution. This means the upscaler does NOT KNOW what image it is receiving before it upscales, and errors will often occur, especially in the chroma range (see the sketch below). You'd see issues with variable sharpness (which is annoying) from scene to scene, aliasing on high-geometry scenes, and dithering artifacts in backgrounds with a lot of noise.
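
Here's a minimal sketch (plain NumPy, 2x factor and grayscale assumed) of what pixel-based upscaling means in practice: a bilinear filter just blends the nearest source pixels and has no idea what the image actually contains, which is exactly why hard edges blur and content-dependent errors creep in:

CODE
import numpy as np

def bilinear_upscale_2x(img: np.ndarray) -> np.ndarray:
    """Upscale a grayscale image 2x with bilinear interpolation.
    Each output pixel is a weighted average of its nearest source
    pixels -- the filter never 'knows' what the content is."""
    h, w = img.shape
    out = np.empty((h * 2, w * 2), dtype=np.float32)
    for y in range(h * 2):
        for x in range(w * 2):
            sy, sx = y / 2.0, x / 2.0          # map back into the source grid
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            wy, wx = sy - y0, sx - x0
            top = img[y0, x0] * (1 - wx) + img[y0, x1] * wx
            bot = img[y1, x0] * (1 - wx) + img[y1, x1] * wx
            out[y, x] = top * (1 - wy) + bot * wy
    return out

# A hard edge just gets averaged into a blur:
edge = np.array([[0, 0, 255, 255]], dtype=np.float32)
print(bilinear_upscale_2x(edge))   # the 0 -> 255 step turns into 0, 127.5, 255
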

DLSS 1.0 was lackluster because it was trained on only a single perfect frame per game, meaning always the highest-quality image possible. That creates an issue because the AI doesn't know what the image looks like when the frame isn't perfect, i.e. at a lower resolution. DLSS 2.0 fixed this: it is presented with both a high-quality and a low-quality image, and in real time it upscales the lower-quality image toward the higher one using the data derived from the former.

With this, DLSS is NOT pixel-based upscaling; it is IMAGE based. That's the key difference. The frames are already trained on the supercomputer Nvidia leverages, and the results are passed back to the end user through the drivers. The information about the image is already there beforehand. Since both the luma and chroma ranges are pre-calculated, errors do not occur.

I am indeed hung up on DLSS because it works and it's like black magic. Native res is not always better, because you're wasting GPU power on resolution differences you can't perceive as readily as you would higher framerates (which mean less stutter and lower latency) and better in-game visual effects (including the likes of ray tracing). It is true that more games need to be supported, and I'm sure they will be, because we're not hitting native 8K60 anytime soon. We're reaching the limits of silicon process fabrication, and there are only so many more transistors you can pack onto a single die.

SSJBen
post Nov 26 2020, 10:27 AM

Lol RDNA2 died before it even made an appearance. Nice job.

SSJBen
post Dec 1 2020, 08:47 PM

QUOTE(wolf1991 @ Dec 1 2020, 04:45 PM)
lol. AMD quite bad in setting the price of their aib card this round but it's a good thing that they are in the market. If the graphic market was monopolized by Nvidia, the price will be much more higher as there are no competition. Intel wise they are going downhill as they lost the crown to previous underdog AMD. Better Nvidia do something to retain the market share, if not who know one day will become like Intel now.
*
laugh.gif laugh.gif laugh.gif

Intel ain't done lmao. You make it sound like AMD has never beaten Intel before and like Intel has never bounced back.

AMD isn't all guns and roses either, alright. They sweep a lot of the bullshit that comes with their Zen and RDNA products under the carpet, just to keep up a facade that tricks people like you into saying "Intel is done".
SSJBen
post Dec 1 2020, 11:08 PM

QUOTE(wolf1991 @ Dec 1 2020, 09:52 PM)
Don't take me wrong. I just saying thing doesn't look good for Intel as of now. I never mention anything like Intel is done and I never said they didn't produce a good product.

Lol. Please look at the thing objectively and don't be such a fan boy and twisting my comment in a wrong way. I am more than happy that there are competition and not monopolized market. As a consumer, whoever make a good product with reasonable price then I will go to buy and use it.
*
Fan boy? How? Been on a 3900x and 5900x, haven't bought anything Intel. I'm as objective as can be.

You just made it sound like Intel has no way to climb back up, and that's what I found very odd.
SSJBen
post Dec 2 2020, 01:34 PM

QUOTE(wolf1991 @ Dec 2 2020, 12:30 PM)
Seem like you perceive my message wrongly. I just mentioning it look bad for intel now. I never mention they have no way to climb back, don't get it why you so fired up for my opinion.

I found it very odd, the way you act are just like a intel fanboy yet you claim that you are using amd processor. Lol, maybe I am wrong but no point to discuss on this further, I just want to make myself clear that I am no fanboy to anyone and I wish more competition coming so consumer get a better product.
*
On second thought, perhaps I perceived your message wrongly. My apologies.

But Intel's CPUs not looking attractive is down to the media and mindless sheep blowing their lack of advancement out of proportion. People forget, or don't know, that as ancient as Intel's Skylake architecture is, it's backed by a very mature platform. There are far fewer problems on their side versus what you see with Zen 3 (and 2), even if the latter performs better for the most part.
SSJBen
post Dec 2 2020, 03:58 PM

QUOTE(wolf1991 @ Dec 2 2020, 03:45 PM)
Ya I agree on reliability part based on personal experience. I have been using Intel product since long time ago from my last 8 years old Dell xps 15 laptop till today still can do normal document stuff but just not able to play game.

I have another one years old gaming desktop with i5 Intel which I use it for gaming in 4k TV n no problem so far. I just own a ryzen 7 cpu for a new desktop that I just build last month so cannot comment much on it reliability but it price to performance ratio was good for me.

I am not gonna be fanboy to either Intel or amd. I will just go over to the 1 with better price performance ratio unless it was limited by CPU slot on motherboard etc etc.
*
It's not about reliability or longevity; AMD has good reliability too.

I'm talking about stability of the platform. AMD has issues with its AGESA-based BIOS, so there are a lot of minor quibbles which, while not system-breaking, are very annoying when they all add up. There are plenty of people suffering from WHEA errors; not that Intel doesn't have that issue too, but theirs are a lot less pronounced and more specific, which makes problems easy to pinpoint and fix, whereas AMD is kinda all over the place.
SSJBen
post Dec 8 2020, 10:53 PM

Not a bad showing for the 6900XT, but it definitely didn't live up to the promise of beating the 3090. On paper (going by MSRP, that is), it makes no sense to go for the 3090.

But as we know, AMD's launch for RDNA2 has been a shit show.
SSJBen
post Aug 11 2021, 03:26 PM

The 6600XT really is a stupid card.

It's (mostly) a PS5 GPU that costs more than a PS5 and is just as unavailable, yet performs worse than the console.
SSJBen
post Nov 4 2022, 01:42 PM

If the 7900XTX can actually compete with the 4090 at that price point, Nvidia looks very stupid right now. Time will tell.

You guys shouldn't dismiss FSR, btw. 2.0 is actually pretty good right now; not quite up to DLSS level yet, but it's not far behind (and I bet most people wouldn't be able to tell the difference without a side-by-side comparison). Besides that, FSR can be implemented quite easily compared to DLSS.
SSJBen
post Dec 12 2022, 10:41 PM

Lol, GG to the 4080 unless ngreedia lowers the price.

The 7900XTX is not only faster, but much cheaper.
SSJBen
post Dec 13 2022, 01:16 AM

QUOTE(Acid_RuleZz @ Dec 12 2022, 10:53 PM)
Nah, most will still buy the 4080 i'm sure of that. 🥴
*
Actually, most won't buy anything. smile.gif
SSJBen
post Dec 14 2022, 03:02 PM

The 7900XTX can be had for RM5.2k; the cheapest 4080 is RM6.2k. An RM1k difference for just 25% better RT performance? It's obvious which card looks stupid here. The only saving grace for the 4080 is that it has DLSS 3.
SSJBen
post Dec 14 2022, 04:48 PM

QUOTE(Duckies @ Dec 14 2022, 04:31 PM)
But then 7900 XTX has higher power consumption..about 100W for gaming. Not sure how much that translates to our electric bill in Malaysia. If you are concern on the electric bill on a long term..maybe that makes a difference?  hmm.gif
*
[image: power consumption chart showing a ~52 W gaming difference]

A 52 W difference and you go and say 100 W....? rclxub.gif

I don't get the whole "I'm worried about power consumption" thing.
SSJBen
post Dec 14 2022, 05:11 PM

QUOTE(Duckies @ Dec 14 2022, 04:58 PM)
Ah I was looking at the third party one like Asus and XFX. Not exactly 100w but close to. For me I do not care about the power consumption thing but I do not know about others. There's this Youtube video I watched that calculates like 1 day 2 hours gaming which translate like 50 Euro per year for electric bill  sweat.gif

[attachmentid=11344912]
[attachmentid=11344917]
*
Is it fair to compare Asus/XFX third-party cards to Nvidia's FE card?

Here is the MSI Gaming X 4080, which is equivalent to the TUF 4080:

[image: MSI Gaming X 4080 power consumption chart]

Compare that to the 7900XTX TUF; it ain't close to 100 W now, is it?

Also, there's quite a big difference between electricity tariffs in the EU and MY, so you can't use their calculations here (rough numbers below). Cukur PMX says no increase in tariffs next year la.
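
As a very rough sketch of what that ~52 W delta costs locally (the RM0.516/kWh rate and 2 hours of gaming a day are assumptions, not figures from this thread; plug in your own TNB tier):

CODE
# Rough annual cost of a ~52 W power draw difference at 2 hours of gaming a day.
# The tariff below is an assumed TNB domestic tier -- substitute your own rate.
watt_difference = 52            # W, per the chart above
hours_per_day = 2
days_per_year = 365
tariff_rm_per_kwh = 0.516       # assumption; check your actual bill

kwh_per_year = watt_difference / 1000 * hours_per_day * days_per_year
cost_rm = kwh_per_year * tariff_rm_per_kwh
print(f"{kwh_per_year:.1f} kWh/year -> RM {cost_rm:.2f}/year")
# ~38 kWh/year, i.e. roughly RM 20 a year at this rate -- nowhere near the EU figure
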
