 AMD Radeon™ Discussion V12, Latest - 14.12 | WHQL - 14.12

ruffstuff
post Jan 13 2015, 09:09 PM

QUOTE(S4PH @ Jan 6 2015, 06:18 PM)
im waiting for the 20nm  thumbup.gif
*
No 20nm GPU in 2015. Probably not ever. They may skip to a different node.
ruffstuff
post Jan 13 2015, 09:19 PM

QUOTE(chocobo7779 @ Jan 6 2015, 01:02 PM)
Well, let's kickstart this thread with this news...

http://wccftech.com/amd-fiji-xt-r9-390x-zauba/

Another leak of the 390X?  brows.gif
*
A new leak suggests Fiji will be a 300W TDP card. If that's true, it's obviously going to be 28nm. The TDP alone is shocking.
ruffstuff
post Mar 11 2015, 04:56 PM

QUOTE(Unseen83 @ Mar 11 2015, 10:33 AM)
hmm.gif i exp with gaming no issues that i think i need a Freesync...  eh feel like they made up excuses to sell LCD new tech
*
It's been an issue ever since the LCD was invented. People are so used to 60Hz/fixed refresh rates that they never thought it needed fixing. Until Nvidia came up with G-Sync, the industry was simply too comfortable with fixed refresh rates.

A lot of things can be solved with VRR, because media sources are recorded or filmed at different frame rates while displays run at one fixed refresh rate, and that mismatch needs to be addressed. We don't want a fixed refresh rate that means capped fps for games and duplicated, uneven frames for movies, like 25fps content played back at 60Hz.
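To put numbers on the 25fps-at-60Hz case, here is a quick sketch (my own illustration, nothing official) that counts how many refreshes each source frame stays on screen at a fixed refresh rate:

```python
# Rough illustration: how 25fps content maps onto a fixed refresh rate.
# Integer maths only, so there's no floating-point drift in the cadence.

def repeats_per_frame(content_fps, refresh_hz, n_frames=10):
    """Count how many consecutive refreshes show each of the first n source frames."""
    counts = [0] * n_frames
    refresh = 0
    while True:
        # Which source frame is current at this refresh tick.
        frame = (refresh * content_fps) // refresh_hz
        if frame >= n_frames:
            break
        counts[frame] += 1
        refresh += 1
    return counts

print(repeats_per_frame(25, 60))  # [3, 2, 3, 2, 2, 3, 2, 3, 2, 2] -> uneven cadence = judder
print(repeats_per_frame(25, 50))  # [2, 2, 2, 2, 2, 2, 2, 2, 2, 2] -> even cadence, which is what VRR effectively gives you
```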

VRR on the monitor is the first step towards fixing all of these issues. Hopefully it will come to consumer electronics too (LCD TVs, set-top boxes and whatnot).

I'd like to see this fixed first before the push to 4K/5K/8K.

This post has been edited by ruffstuff: Mar 11 2015, 04:56 PM
ruffstuff
post Mar 20 2015, 03:50 PM

QUOTE(terradrive @ Mar 20 2015, 07:42 AM)
I rarely use V-sync, hate the mouse lag with it.

But then even without V-sync the tearing isn't that much if the fps is high enough. Above 40 fps is good enough.

When the fps drops below 30fps is where the tearing looked bad.
*
Below 30fps that's not a tearing artifact, it's called stuttering.

Tearing is what you get when the graphics card pushes frames faster than your refresh rate.

This post has been edited by ruffstuff: Mar 20 2015, 03:52 PM
ruffstuff
post Mar 20 2015, 03:51 PM

QUOTE(Unseen83 @ Mar 20 2015, 05:09 AM)
OMG!!! so we all need Freesync !!!!

[attachmentid=4379332]

seriously... i almost Never see this happen on my Gaming... a small minor screen tearing during game.. but vsync help remove it.. and  i am just afraid next driver update made all my gaming EXP to be  tearing... like the example top just cause they want to sell new tech in LCD...  hmm.gif
*
It is very hard to explain VRR, or to show it in videos and such. You have to experience it yourself; then you'll know the difference.
ruffstuff
post Mar 28 2015, 02:56 PM


G-Sync and FreeSync tested via oscilloscope. The important part is how each technology handles VRR below the panel's refresh rate floor.

It looks like G-Sync deals with low fps more intelligently than FreeSync. In a nutshell, FreeSync simply stops doing its job below the panel's minimum refresh rate, leaving you with the classic stuttering and tearing. G-Sync, meanwhile, repeats frames (doubling or tripling them) and matches the result exactly to the refresh rate. Genius.
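The low-fps trick can be sketched roughly like this. This is a hypothetical illustration of the idea described above, not Nvidia's actual firmware; the 40-144Hz window and the "aim for 1.5x the minimum" target are my own assumptions:

```python
import math

# Hypothetical sketch of frame repetition below the VRR window:
# repeat each rendered frame enough times that the resulting refresh rate
# lands back inside the panel's window.

def refresh_for_fps(fps, vrr_min=40, vrr_max=144):
    """Return (multiplier, refresh_hz); each rendered frame is scanned out `multiplier` times."""
    if fps >= vrr_min:
        return 1, min(fps, vrr_max)    # inside the window: refresh simply tracks the frame rate
    target = 1.5 * vrr_min             # assumption: aim comfortably inside the window, not at its edge
    multiplier = math.ceil(target / fps)
    return multiplier, fps * multiplier

print(refresh_for_fps(30))  # (2, 60) -> 30fps scanned out at 60Hz, each frame shown twice
print(refresh_for_fps(20))  # (3, 60) -> 20fps scanned out at 60Hz, each frame shown three times
```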

This post has been edited by ruffstuff: Mar 28 2015, 03:03 PM
ruffstuff
post Mar 28 2015, 08:06 PM

QUOTE(Unseen83 @ Mar 28 2015, 07:35 PM)
hmm.gif  Question is Different price cost for G-sync and Freesync...  RM300-400 different.. ? hmm
*
If you're already using Nvidia, switching over to ATI is even costlier.
ruffstuff
post Mar 28 2015, 08:32 PM

QUOTE(Unseen83 @ Mar 28 2015, 08:13 PM)
dudy that goes same any brand if you change brand gpu if not faulty...  i talk about cost of the two brand charging their customer/fans. freesync(amd) GSync(nvidia)  both do same job just each hav it own advantage/disadvantage.. but Price of each tech is hugely different...  wink.gif
*
I'm not saying you need to switch to Nvidia for G-Sync.
ruffstuff
post Mar 28 2015, 08:42 PM

Another drawback of FreeSync, due to the narrow VRR window on this particular monitor.

Although AMD claims the default refresh range for Adaptive-Sync is 9-240Hz, that's not what you get in practice, because it comes back to the limitations of the panel itself.

This problem isn't specific to AMD FreeSync, though; it's down to the panel itself.
ruffstuff
post Mar 30 2015, 11:57 AM

QUOTE(Acid_RuleZz @ Mar 28 2015, 11:33 PM)
Yeah kinda disappointed, i was expecting 30hz minimum for 34um67/29um67. I guess i'll wait for the technology to mature and get significantly cheaper.
Both is legit idealtech rep innm, if in doubt, go for zhenwei.
*
Reading through the VESA white paper for Adaptive-Sync, it seems the standard does include that frame doubling/tripling feature for when the fps drops below the VRR window. But to implement it, the panel itself needs a frame buffer, and that is the missing piece in FreeSync. Scaler manufacturers mostly won't include a frame buffer in their scalers for cost reasons; if they did, the price gap between FreeSync and G-Sync would narrow.

The G-Sync module, as we know, does have a frame buffer/memory chips.
ruffstuff
post Apr 17 2015, 11:41 PM

QUOTE(Unseen83 @ Apr 17 2015, 03:39 PM)


AMD Freesync or G-Sync

according to AMD, not requiring a proprietary module, no licensing fees, it being open source, using DisplayPort, is compatible with all standard monitor features, and has a refresh rate range of 9-240Hz. G-Sync, on the other hand, needs a proprietary module, charges a license fee, isn't open source and has a refresh rate range of 30-144Hz.
[attachmentid=4414994]

http://wccftech.com/amd-freesync-nvidia-gsync-verdict/

http://www.rockpapershotgun.com/2015/04/09...ync-amd-nvidia/
*
The refresh rate window is panel specific. The Adaptive-Sync spec allows 9-240Hz, but no panel manufacturer is going to let a panel refresh as low as 10Hz.

At the moment there's no FreeSync monitor that can do VRR below 30Hz, as far as I'm aware.
ruffstuff
post Apr 18 2015, 04:30 PM

QUOTE(Unseen83 @ Apr 17 2015, 11:55 PM)
VRR ?  hmm.gif  below 30hz... ? eh i thought ppl want LCD monitor with at least 60hz ?   so G-sync got VRR that below 30hz? (watever it is)  worth the Extra RM300-500 ?  sorry i am not LCD monitor expert.. but  when im gaming on 4k 60hz lcd i barely notice any Screen tearing so on.. hmm maybe i got really bad Eye sight hmm.gif
*
FreeSync monitors have a refresh rate window starting from around 40Hz. Within the variable refresh rate (VRR) window, FreeSync works flawlessly. But things get really different between FreeSync and G-Sync once the frame rate drops below the VRR window. When fps drops below 40, you get the classic stutter of any other non-VRR monitor. G-Sync instead implements frame repetition (frame duplicating/doubling/tripling) when the fps drops below the VRR window. For example, at 30fps a G-Sync monitor will refresh the panel at 60Hz while the G-Sync module doubles each frame to match the refresh rate; at 20fps the panel still refreshes at 60Hz and the module triples each frame. The G-Sync module keeps the panel refreshing inside the VRR window at any fps, without odd frames that mismatch the refresh rate.

It is still unknown why FreeSync can't do frame doubling, given that the VESA Adaptive-Sync (DP 1.2a+) white paper describes frame doubling when fps drops below the VRR window too. My speculation is that panel scalers lack the ASIC/frame buffer needed to achieve what G-Sync achieves. I do think this could be done in the GPU's own buffer; maybe it's just a lack of mature drivers.

However, if the fps shoots above the VRR window, FreeSync gives you the option of v-sync on or v-sync off, while Nvidia G-Sync forces v-sync on (capping fps) at the panel's maximum refresh rate.
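Pulling the three cases together, here is a rough summary in code of the behaviour described above for a hypothetical 40-144Hz panel; the function and values are made up for illustration, not taken from either vendor's driver:

```python
# Rough summary of the behaviour described in this thread for a hypothetical 40-144Hz panel.
# Illustration only, not vendor code.

def panel_behaviour(fps, tech, vrr_min=40, vrr_max=144, vsync=True):
    if fps < vrr_min:
        if tech == "gsync":
            return "frame repetition keeps the panel inside its VRR window"
        return "VRR gives up: classic stutter/tearing like a fixed-refresh monitor"
    if fps <= vrr_max:
        return "VRR: panel refresh tracks the frame rate"
    # above the window
    if tech == "gsync":
        return f"fps capped at {vrr_max}Hz (v-sync forced on)"
    return f"fps capped at {vrr_max}Hz" if vsync else "uncapped fps with tearing (v-sync off)"

for fps in (25, 90, 200):
    print(fps, "gsync    ->", panel_behaviour(fps, "gsync"))
    print(fps, "freesync ->", panel_behaviour(fps, "freesync", vsync=False))
```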

List of FreeSync panels and their VRR windows: [image]

This post has been edited by ruffstuff: Apr 18 2015, 04:37 PM
ruffstuff
post Apr 18 2015, 06:50 PM

QUOTE(Unseen83 @ Apr 18 2015, 05:16 PM)
hmm.gif  so let say im using Samsung UE590 4K it's minimum refresh rate is - ?  so it's should be okay right.. as Freesync  need 9-240hz to be very effective, eh im still confuse...
*
The Samsung UE590's final spec hasn't been revealed yet, but I'd bet on a minimum of 40Hz and a maximum of 60Hz.

The 9-240Hz figure is the universal refresh rate envelope of the spec, not the panel's refresh rate. FreeSync works within the panel's own refresh range: the GPU negotiates the VRR window with the panel beforehand, and AMD claims this makes it superior to Nvidia G-Sync, with zero added latency.

There is no panel that can run at 9-240Hz at the moment.
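The negotiation itself is basically an intersection of ranges. A minimal sketch of the idea, with hypothetical panel numbers (this is not AMD's driver code):

```python
# Minimal sketch of the "negotiate the VRR window up front" idea.
# The panel advertises its own min/max; the spec's 9-240Hz envelope is only
# the outer bound the standard allows.

SPEC_MIN_HZ, SPEC_MAX_HZ = 9, 240          # Adaptive-Sync envelope from the spec

def negotiate_vrr_window(panel_min_hz, panel_max_hz):
    """Working VRR window = intersection of the spec envelope and the panel's own range."""
    low = max(SPEC_MIN_HZ, panel_min_hz)
    high = min(SPEC_MAX_HZ, panel_max_hz)
    return (low, high) if low < high else None   # None -> no usable VRR range

print(negotiate_vrr_window(40, 60))    # (40, 60)  -> e.g. a 4K 60Hz FreeSync panel
print(negotiate_vrr_window(48, 144))   # (48, 144) -> e.g. a 1440p 144Hz panel
```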
ruffstuff
post Apr 19 2015, 04:39 PM

QUOTE(Unseen83 @ Apr 19 2015, 02:02 AM)
hmm.gif  so which AMD Gpu ur using ?  well let say my FreeSync  LCD minimum 40Hz.. max 60Hz..  but im using R9 290x on crossfire setting...  if my Gpu manage to get Fps average 50Fps...  my understanding i should get to used Freesync.. ? ?  e.g. on GTA V in 4K resolution i do get 50-60Fps... and if i am using Freesync Panel i should be able to enjoy the game in this so call Freesync tech .. right ?
*
Should be no problem as long as the framerate stays within the VRR window. But I would like to see a 30Hz minimum for 4K panels. I'm not really concerned about the maximum refresh rate, because current-gen cards can hardly output more than 60fps at 4K anyway. The minimum refresh rate matters because you'll encounter frame-rate drops far more often at 4K resolution.
ruffstuff
post Apr 23 2015, 08:11 AM

QUOTE(Unseen83 @ Apr 22 2015, 09:12 PM)
hey i play GTA V i exp good 50FPS + Average in 4K with my R9 290X Trix/Vapor Crossfire  so im guessing if i own Freesync LCD i would Greatly Benefit from the tech right  hmm.gif  ?  check out review comparison of GTX980 Sli/ Titan X/ R9 295X2  on 1440p R9 295x2 may get  last of it MORE Expensive competitor.. but on 4K it get First... x not bad for Old timer ?

http://www.hardocp.com/article/2015/04/20/...mance_preview/3

[attachmentid=4422579]
*
Hopefully, but FreeSync doesn't work with CrossFire. Yet.
ruffstuff
post May 21 2015, 12:03 PM

QUOTE(Acid_RuleZz @ May 20 2015, 02:35 PM)
So, the rumored Fiji XT will be price similarly to Titan-X and categorize as "Titan tier" instead of R9-290x successor. Quite disappointed but somewhat understandable.
*
I wonder what AMD's new mainstream and high-performance line-up will be if Fiji is an enthusiast card rather than an affordable performance card. How would they scale the Fiji chip down? The new Tonga chip is not a performance chip; can they build a new performance part based on Tonga?

Or will AMD users have to stick with a Hawaii rebadge for the mainstream performance card?

P/S: I have a strong feeling that Fiji will not use the R9 390X naming scheme. It could be R10 XXX.

This post has been edited by ruffstuff: May 21 2015, 12:05 PM
ruffstuff
post Jun 3 2015, 11:41 AM

QUOTE(Unseen83 @ Jun 3 2015, 09:39 AM)
so.. prototype r9 390x is slower than GTX 980 Ti...

http://wccftech.com/amd-radeon-r9-fury-fij...wer-gtx-980-ti/
*
And it is not the R9 390X. It will be called Radeon Fury.
ruffstuff
post Jun 3 2015, 05:56 PM

QUOTE(Acid_RuleZz @ Jun 3 2015, 02:39 PM)
You didn't state it in you previous post. tongue.gif
Ya mang, the card feel abit "rushed". Maybe they have another one coming.
*
Nope. The full-fat Maxwell core was completed even before the introduction of the 980. Before the 980 was released, people expected the GM200 core to be the 980, but no, what we got was a crippled version rather than the full GM200. Nvidia didn't feel the need to release the full Maxwell core at launch because it wasn't necessary. Nvidia's stupid move was branding GM200 as the Titan X, which is kind of confusing this time around. The Titan X is missing the feature that usually distinguishes a Titan from a gaming-centric card: it doesn't have the fp64/double-precision hardware. That strongly suggests this chip was supposed to be the 980 at launch. Well, at least they made a Ti out of it.
ruffstuff
post Jun 3 2015, 06:02 PM

QUOTE(Gregyong @ Jun 3 2015, 04:44 PM)
that's the whole point of HBM.
to stack the dies one on top of the other and they're on the same die as the GPU itself.
the plus size is easier cooling and more compact size, though the downside would be that we wont be seeing AIB partners doing fun things like launching similar GPU card with lower or more memory (ie. the 6950 1GB....my baby) or GDDR5 R7 240s (compared to the OEM ddr3 versions)
On a side note, I would really want to see AMD sell their Fiji at USD500-600 range, which would/should render all other cards to cost less.......I miss the good old days of the HD4800, HD5800 and HD6900 pricings
*
The downside of the card is the chip itself. HBM is great, but what's the point of a small-footprint card if it ends up with a huge radiator hanging alongside it? If AMD can make a high-performance card with that footprint without needing watercooling, it's a clear winner, even if the performance is slightly below the 980. I hope the non-XT Fury will be like this: no watercooling needed, but still maintaining the small footprint.
ruffstuff
post Jun 3 2015, 07:17 PM

QUOTE(terradrive @ Jun 3 2015, 07:14 PM)
It's definitely possible, but clockspeeds will be lowered a lot.

Just like R9 290/290X can be hugely downclocked mem wise and saved tons of power consumption as well as the released heat. It's really cool if ran the R9 290 at 1000Mhz core and 1000Mhz memory with -65mv. With is can just run fine with 2 fan small coolers.
*
If possible: slightly better performance than the 290X, without the watercooling, and keeping the small footprint. Still a good deal.
