AMD Radeon™ Discussion V12, Latest - 14.12 | WHQL - 14.12

QUOTE(S4PH @ Jan 6 2015, 06:18 PM)
No 20nm GPU for 2015, probably ever. They may skip to a different node.
#1 | Jan 13 2015, 09:09 PM
Senior Member | 3,345 posts | Joined: Jan 2003
#2 | Jan 13 2015, 09:19 PM
QUOTE(chocobo7779 @ Jan 6 2015, 01:02 PM)
Well, let's kickstart this thread with this news... http://wccftech.com/amd-fiji-xt-r9-390x-zauba/ Another leak of the 390X?

A new leak suggests Fiji will have a 300W TDP. If that's true, it's obviously 28nm. The TDP is just...
#3 | Mar 11 2015, 04:56 PM
QUOTE(Unseen83 @ Mar 11 2015, 10:33 AM)

A lot of things can be solved with VRR: media sources are recorded or filmed at different frame rates while displays run at different fixed refresh rates, so this needs to be addressed. We don't want fixed refresh rates, capped fps for games, and duplicated or odd frames for movies that run at 25fps on a 60Hz panel. VRR on monitors is the first step toward fixing all of this. Hopefully it will come to consumer electronics too (LCD TVs, set-top boxes and so on). I'd like to see this fixed before the move to 4K/5K/8K.

This post has been edited by ruffstuff: Mar 11 2015, 04:56 PM
#4 | Mar 20 2015, 03:50 PM
QUOTE(terradrive @ Mar 20 2015, 07:42 AM)
I rarely use V-sync; I hate the mouse lag with it. But even without V-sync the tearing isn't that bad if the fps is high enough. Above 40 fps is good enough. When the fps drops below 30 is where the tearing looks bad.

What you see below 30fps is not a tearing artifact; it's called stuttering. Tearing is what happens when the graphics card pushes frames faster than your refresh rate.

This post has been edited by ruffstuff: Mar 20 2015, 03:52 PM
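The tearing-versus-stuttering distinction in this post can be summed up as a rule of thumb. A minimal sketch in Python; the thresholds (60Hz panel, ~30fps stutter floor) are illustrative assumptions taken from the posts, not measured behaviour:

```python
def frame_artifact(fps, refresh_hz=60, vsync=False):
    """Rough classification of the visible artifact, per the posts above.

    Illustrative assumptions only: with V-sync off, tearing appears when
    the GPU delivers frames faster than the panel refreshes; sustained
    fps below ~30 reads as stutter.
    """
    if vsync:
        # V-sync removes tearing but adds input lag; below the refresh
        # rate it also quantizes frame pacing, which reads as stutter.
        return "stutter" if fps < refresh_hz else "smooth"
    if fps > refresh_hz:
        return "tearing"        # new frames arrive mid-scanout
    if fps < 30:
        return "stutter"
    return "mostly smooth"

print(frame_artifact(90))              # tearing
print(frame_artifact(25))              # stutter
print(frame_artifact(45, vsync=True))  # stutter
```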
#5 | Mar 20 2015, 03:51 PM
QUOTE(Unseen83 @ Mar 20 2015, 05:09 AM)
OMG!!! So we all need FreeSync!!! [attachmentid=4379332] Seriously... I almost never see this happen in my gaming, just minor screen tearing during games, and V-sync helps remove it. I'm just afraid the next driver update will make all my games tear like the example above, just because they want to sell new LCD tech...

It is very hard to explain VRR, or to show it in videos. You have to experience it yourself; then you'll know the difference.
#6 | Mar 28 2015, 02:56 PM
G-Sync and FreeSync tested via oscilloscope. What matters is how each technology handles VRR below the panel's minimum refresh rate, and it looks like G-Sync deals with low fps more intelligently than FreeSync. In a nutshell, FreeSync simply stops working below the panel's refresh floor, leaving you with the classic stuttering and tearing. G-Sync, meanwhile, repeats frames (doubling or tripling them) so they match the refresh rate exactly. Genius.

This post has been edited by ruffstuff: Mar 28 2015, 03:03 PM
#7 | Mar 28 2015, 08:06 PM
#8 | Mar 28 2015, 08:32 PM
QUOTE(Unseen83 @ Mar 28 2015, 08:13 PM)
Dude, that's the same for any brand if you switch GPUs and it's not faulty... I'm talking about the cost the two brands charge their customers/fans. FreeSync (AMD) and G-Sync (Nvidia) both do the same job, each with its own advantages and disadvantages, but the price of each tech is hugely different...

I'm not saying you need to change to Nvidia for G-Sync.
#9 | Mar 28 2015, 08:42 PM
Another drawback of FreeSync: the narrow VRR window on this particular monitor. AMD claims the default refresh range for Adaptive-Sync is 9-240Hz, but that's not what you get in practice, because it comes back to the limits of the panel itself. This isn't a FreeSync-specific problem; it's the panel.
#10 | Mar 30 2015, 11:57 AM
QUOTE(Acid_RuleZz @ Mar 28 2015, 11:33 PM)
Yeah, kinda disappointed. I was expecting a 30Hz minimum for the 34UM67/29UM67. I guess I'll wait for the technology to mature and get significantly cheaper.

Reading through the VESA white paper for Adaptive-Sync, it seems the standard does include a frame doubling/tripling feature for when fps drops below the VRR window. But to implement it, the display itself needs a frame buffer, and that is the missing piece in FreeSync monitors: scaler manufacturers mostly won't include a frame buffer in their scalers, for cost reasons. If they did, the price gap between FreeSync and G-Sync would narrow. The G-Sync module, as we know, does have a frame buffer/memory chips.
#11 | Apr 17 2015, 11:41 PM
QUOTE(Unseen83 @ Apr 17 2015, 03:39 PM)
AMD FreeSync or G-Sync? According to AMD, FreeSync requires no proprietary module, has no licensing fees, is an open standard, uses DisplayPort, is compatible with all standard monitor features, and has a refresh rate range of 9-240Hz. G-Sync, on the other hand, needs a proprietary module, charges a license fee, isn't open, and has a refresh rate range of 30-144Hz. [attachmentid=4414994] http://wccftech.com/amd-freesync-nvidia-gsync-verdict/ http://www.rockpapershotgun.com/2015/04/09...ync-amd-nvidia/

At this moment there's no FreeSync monitor that can do VRR below 30Hz that I'm aware of.
#12 | Apr 18 2015, 04:30 PM
QUOTE(Unseen83 @ Apr 17 2015, 11:55 PM)
VRR?

FreeSync monitors have a refresh rate window starting from 40Hz. Within the variable refresh rate (VRR) window, FreeSync works flawlessly, but things get really different between FreeSync and G-Sync when the frame rate drops below it.

When fps drops below 40, FreeSync gives you the classic stutter of any non-VRR monitor. G-Sync instead repeats frames (duplicating, doubling, or tripling them) when the fps drops below the VRR window. For example, at 30fps a G-Sync monitor refreshes the panel at 60Hz while the G-Sync module buffers each frame and shows it twice to match the refresh rate; at 20fps the panel still refreshes at 60Hz and the module shows each frame three times. The G-Sync module keeps the panel refreshing inside the VRR window at any fps, with no odd frames that mismatch the refresh rate.

It is still unknown why FreeSync can't do frame doubling, given that the VESA Adaptive-Sync (DP 1.2a+) white paper describes frame doubling when fps drops below the VRR window. My speculation: the panel scaler lacks the ASIC/frame buffer needed to do what G-Sync does. I do think this could be done in the GPU's own buffer; maybe the drivers just aren't mature yet.

However, if fps shoots above the VRR window, FreeSync gives you the choice of V-sync on or V-sync off, while Nvidia G-Sync forces V-sync on (capping fps) at the panel's maximum refresh rate.

List of FreeSync panels and their VRR windows: [image]

This post has been edited by ruffstuff: Apr 18 2015, 04:37 PM
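The doubling/tripling behaviour described in this post can be sketched as a tiny calculation. This is an illustrative model only: the 40Hz window floor, 144Hz cap, and 60Hz repeat target are assumptions taken from the post's examples, not Nvidia's documented algorithm.

```python
import math

def lfc_refresh(fps, vrr_min=40, vrr_max=144, target=60):
    """Return (panel_refresh_hz, times_each_frame_is_shown) for a given fps.

    Inside the VRR window the panel follows the frame rate 1:1. Below it,
    each frame is repeated 2x, 3x, ... so the effective refresh lands back
    on the target (matching the post's 30fps -> 60Hz and 20fps -> 60Hz
    examples; the actual target logic in hardware is an assumption here).
    """
    if fps > vrr_max:
        return vrr_max, 1          # capped at the top, v-sync style
    if fps >= vrr_min:
        return fps, 1              # inside the window: 1:1
    repeat = math.ceil(target / fps)
    return fps * repeat, repeat

print(lfc_refresh(30))   # (60, 2): each frame shown twice
print(lfc_refresh(20))   # (60, 3): each frame shown three times
print(lfc_refresh(100))  # (100, 1)
```

Note the design point the oscilloscope tests exposed: this repetition needs somewhere to hold the last frame, which is why the posts above tie it to the G-Sync module's frame buffer.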
#13 | Apr 18 2015, 06:50 PM
QUOTE(Unseen83 @ Apr 18 2015, 05:16 PM)

The 9-240Hz figure is the specification's overall range, not any real panel's refresh range. FreeSync works within the panel's own refresh range: the GPU negotiates the VRR window with the panel beforehand, and AMD claims this approach is superior, with zero latency compared to Nvidia G-Sync. There is no panel that can run at 9-240Hz at this moment.
#14 | Apr 19 2015, 04:39 PM
QUOTE(Unseen83 @ Apr 19 2015, 02:02 AM)
#15 | Apr 23 2015, 08:11 AM
QUOTE(Unseen83 @ Apr 22 2015, 09:12 PM)
Hey, I play GTA V and get a good 50+ fps average at 4K with my R9 290X Tri-X/Vapor-X CrossFire, so I'm guessing if I owned a FreeSync LCD I would greatly benefit from the tech, right?

Hopefully, but FreeSync doesn't work with CrossFire. Yet. http://www.hardocp.com/article/2015/04/20/...mance_preview/3 [attachmentid=4422579]
#16 | May 21 2015, 12:03 PM
QUOTE(Acid_RuleZz @ May 20 2015, 02:35 PM)

So, the rumored Fiji XT will be priced similarly to the Titan X and positioned as "Titan tier" instead of as the R9 290X's successor. Quite disappointing, but somewhat understandable. I wonder what AMD's new mainstream and high-performance lineup will be if Fiji is an enthusiast card rather than an affordable performance card. How can they scale the Fiji chip down? The new Tonga chip is not a performance chip; can they build a new performance part based on Tonga? Or will AMD users have to stick with Hawaii rebadges for mainstream performance cards?

P/S: I have a strong feeling that Fiji will not use the R9 390X naming scheme. It could be R10 XXX.

This post has been edited by ruffstuff: May 21 2015, 12:05 PM
#17 | Jun 3 2015, 11:41 AM
QUOTE(Unseen83 @ Jun 3 2015, 09:39 AM)
So... the prototype R9 390X is slower than the GTX 980 Ti... http://wccftech.com/amd-radeon-r9-fury-fij...wer-gtx-980-ti/

And it is not the R9 390X. It will be called the Radeon Fury.
#18 | Jun 3 2015, 05:56 PM
QUOTE(Acid_RuleZz @ Jun 3 2015, 02:39 PM)
You didn't state it in your previous post.

Nope. The full-fat Maxwell core was completed ever since the introduction of the 980. Before the 980 launched, people expected the GM200 core to be the 980; but no, they got a cut-down chip instead. Nvidia didn't feel the need to release the full Maxwell core at launch because it wasn't necessary. The stupid move by Nvidia is branding GM200 as the Titan X, which is kind of confusing this time around: the Titan X is missing the compute features that used to set Titans apart from gaming-centric cards (it has no real FP64 double-precision throughput), which strongly suggests this chip was supposed to be the 980 at launch. Well, at least they made a Ti out of it.

Yeah man, the card feels a bit "rushed". Maybe they have another one coming.
#19 | Jun 3 2015, 06:02 PM
QUOTE(Gregyong @ Jun 3 2015, 04:44 PM)
That's the whole point of HBM: stacking the dies one on top of the other, on the same package as the GPU itself. The plus side is easier cooling and a more compact size, though the downside is we won't see AIB partners doing fun things like launching the same GPU with less or more memory (i.e. the 6950 1GB... my baby) or GDDR5 R7 240s (compared to the OEM DDR3 versions). On a side note, I would really like to see AMD sell Fiji in the USD 500-600 range, which would/should push all other cards' prices down... I miss the good old days of HD4800, HD5800 and HD6900 pricing.

The downside of the card is the chip itself. HBM is great, but what is the point of a small-footprint card if you end up with a huge radiator alongside it? If AMD can make a high-performance card with that footprint without needing water cooling, it's a clear winner, even if performance is slightly below the 980. I hope the non-XT Fury will be like this: no water cooling needed, but still keeping the small footprint.
#20 | Jun 3 2015, 07:17 PM
QUOTE(terradrive @ Jun 3 2015, 07:14 PM)
It's definitely possible, but clock speeds would be lowered a lot. The R9 290/290X can be hugely downclocked (memory included) to save tons of power and cut the heat output. It's really cool when run at 1000MHz core and 1000MHz memory with -65mV; at that point it runs just fine with small dual-fan coolers.

If possible: slightly better performance than the 290X, without the water cooling, and keeping the footprint. Still a good deal.
Topic Closed