NVIDIA GeForce Community V14
Feb 9 2015, 11:11 AM
Junior Member
196 posts Joined: Jan 2010 From: Kuala Lumpur |
Dammit... we need more maturity here in the forum. Anyway, my previous card was an Asus GTX 460. It served me for three years plus, if I'm not mistaken, and still worked wonderfully until I upgraded.
Feb 9 2015, 11:16 AM
Senior Member
1,943 posts Joined: Apr 2005 |
By the time the graphics card breaks down, it's time for an upgrade anyway haha.
Even the normal warranty period is like 2 to 3 years now.
Feb 9 2015, 11:16 AM
Junior Member
91 posts Joined: Jan 2010 |
My 9500GT is still working in my movie rig.
Feb 9 2015, 11:32 AM
#646
Senior Member
4,451 posts Joined: Jan 2003 |
QUOTE(marfccy @ Feb 8 2015, 03:34 AM)
tbh both are similar, they're some sort of adaptive V-Sync. Just that FreeSync leaves the GPU to do the processing, compared to Nvidia's G-Sync module, which is an external unit in terms of actual usage, from what I read. Prolly explains why Nvidia charges extra for the module.

FreeSync
Pros: uses VESA standards; presumably cheaper than G-Sync because no additional module is required.
Cons: this implementation may still have tearing. A more accurate description of the issue, which I hate about the FreeSync version:

QUOTE
For FreeSync (and the Adaptive Sync standard as a whole), when a game renders at a frame rate above or below this VRR window, the V-Sync setting is enforced. That means on a 60 Hz panel, if your game runs at 70 FPS, then you will have the option to enable or disable V-Sync; you can either force a 60 FPS top limit or allow 70 FPS with screen tearing. If your game runs under the 40 Hz bottom limit, say at 30 FPS, you get the same option: V-Sync on or V-Sync off. With it off, you would get tearing but optimal input/display latency; with it on, you would reintroduce frame judder when you cross between V-Sync steps.

G-Sync
Pros: has no tearing.
Cons: high cost due to the G-Sync module added onto G-Sync monitors. This, however, may not apply to laptops, which rumours say use a variant closer to the FreeSync method.

A more detailed description of why G-Sync is possibly the superior option:

QUOTE
And you'd be wrong, G-Sync does handle this better because at no point do you need to enable V-Sync to avoid tearing above max refresh, thus undoing much of the benefit of this tech to begin with. At higher than monitor refresh rates, the monitor continues to update at its max refresh rate with no input lag and no tearing, because the G-Sync module with the onboard DRAM (y'know, the same magic stuff AMD and their fanboys thought was extraneous and unnecessary) actively compares, holds, and renders each frame, using the DRAM as a lookaside buffer. So, for any frames above what the monitor can refresh, the G-Sync module holds, compares, and chooses to display new, hold, or throw out. I guess all that pricey, proprietary hardware was justified after all! But yes, it is nice to see AMD finally show us something with FreeSync beyond slide decks, fixed-refresh windmills and empty promises; but now that they have shown it, they have also confirmed what many of us expected: G-Sync isn't going anywhere, it's better than FreeSync, and whatever premium Nvidia is charging will be justified.

QUOTE
G-Sync above max refresh has the option to hold a frame and wait for the next frame after the monitor refresh, effectively reducing the actual frame rate output of the monitor, because the G-Sync module is dynamically changing and controlling the monitor's refresh rate. Unlike typical refresh and V-Sync, where the GPU is still slave to the monitor's refresh and the next frame must be rendered, regardless of how old it is, based on the timing of the monitor's refresh. So in a worst case scenario, with uneven frame times, you might see 6-7 ms of input lag/latency on a 120 Hz monitor (8.33 ms between frame refreshes).

QUOTE
The onboard DRAM on the G-Sync PCB acts as a lookaside buffer that allows the G-Sync module to compare, hold, and send frames to the display. 768MB is capable of storing a whole lot of 4-20MB frames. All that awesome knowledge Tom Petersen dumped on us in interviews on this very site is starting to pay off! Guess that G-Sync module and DRAM weren't extraneous and unnecessary after all!

QUOTE
There are several PROBLEMS which G-Sync fixes.
Screen tearing - happens if you can't sync monitor and GPU (because normal monitors update at a FIXED interval)
LAG - V-Sync is used to fix screen tearing, but this causes lag because there's a delay caused by buffering the GPU output to match the monitor's update cycle
STUTTER - happens if you have V-Sync on but are outputting below the refresh rate
*So you can fix screen tearing by enabling V-Sync but then get lag (or lag and stutter), or disable V-Sync but get screen tearing. G-SYNC fixes all these issues AT THE SAME TIME. The only real issue is staying above 30 FPS, which isn't a huge deal, and even that will be fixed with newer PANELS (a panel limitation, not a G-Sync limitation).
**Above max? I believe this is where G-Sync is superior, but it's hard to confirm. It's my understanding that G-Sync's added fast memory acting as a "lookaside" buffer allows G-Sync to stay enabled when the GPU can output more than the monitor's maximum. Thus the monitor still updates ONLY as directed by the GPU software, which keeps things SMOOTH. So you can basically stay capped at 60 FPS on a 60 Hz monitor this way. FreeSync, however, as a limitation of the monitor hardware (no proprietary hardware with a lookaside buffer), seems forced to DISABLE the asynchronous method and go back to the monitor's normal fixed update. This is going to be a HUGE ISSUE for FreeSync. So you have to stay in the RANGE to make it work (i.e. above 40 FPS and below 60 FPS). Not as big a deal for 30 FPS up to 144 FPS. So basically G-Sync seems to work almost perfectly and FreeSync is problematic, especially on 60 Hz monitors. (Worse is playing a game with a 60 FPS average on a 60 Hz monitor: you'd keep going above and below the 60 FPS mark, meaning FreeSync would turn on and off.) Don't flame me if this is incorrect, but I've done a lot of reading and I think it's correct. (Also not sure how AMD can fix this without introducing a proprietary scaler with a lookaside buffer like Nvidia's solution.)

So overall it sounds to me like G-Sync costs more but gets it done right. Because to me, what's the point of high frame rates if you're going to have tearing? Tearing is very noticeable to me and is the main reason I currently use V-Sync with triple buffering to counter it. I just don't think the FreeSync camp's supporters are making enough valid arguments to prove that FreeSync is better than G-Sync (other than perhaps the cheaper cost).

The rest of the ongoing FreeSync vs G-Sync debate:
http://www.pcper.com/news/Graphics-Cards/C...eeSync-Monitors

This post has been edited by Moogle Stiltzkin: Feb 9 2015, 12:16 PM
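To make the above/below-window behaviour in those quotes concrete, here's a rough Python sketch of the logic as I understand it. The 40-144 Hz window, the example frame rates and the outcome descriptions are my own illustrative assumptions, not vendor specs, so treat it as a mental model rather than how either driver is actually implemented.

CODE
# Rough model of the VRR behaviour described in the quotes above.
# The 40-144 Hz window and the outcome strings are illustrative
# assumptions, not FreeSync/G-Sync specifications.

VRR_MIN_HZ = 40      # assumed lower bound of the panel's VRR window
VRR_MAX_HZ = 144     # assumed upper bound (panel max refresh)

def freesync_behaviour(fps, vsync_on):
    """Adaptive-Sync per the quoted article: outside the window the
    plain V-Sync setting takes over."""
    if VRR_MIN_HZ <= fps <= VRR_MAX_HZ:
        return "VRR active: refresh follows the GPU, no tearing, minimal lag"
    if vsync_on:
        return "V-Sync enforced: capped at panel refresh, extra latency/judder"
    return "V-Sync off: uncapped frame rate but visible tearing"

def gsync_behaviour(fps):
    """G-Sync per the quoted posts: the module's lookaside buffer holds,
    compares and re-displays frames instead of falling back to V-Sync."""
    if fps < VRR_MIN_HZ:
        return "module re-displays the last frame, panel never drops out of sync"
    if fps > VRR_MAX_HZ:
        return "module holds/drops surplus frames, panel stays at max refresh, no tearing"
    return "VRR active: refresh follows the GPU, no tearing, minimal lag"

for fps in (30, 70, 200):
    print(f"{fps:3d} fps | FreeSync: {freesync_behaviour(fps, vsync_on=True)}")
    print(f"{fps:3d} fps | G-Sync:   {gsync_behaviour(fps)}")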
Feb 9 2015, 11:45 AM
#648
Senior Member
4,451 posts Joined: Jan 2003 |
QUOTE(Acid_RuleZz @ Feb 9 2015, 11:22 AM)

To name a few....

Wolfenstein: The New Order (especially this game. Low VRAM at high settings causes the texture pop-in issue, a very obvious and annoying effect where textures are blurred while loading, only to suddenly pop in once loaded. It becomes disorienting.)

Shadow of Mordor (if you think 4GB VRAM is enough, well, this game needs 6GB VRAM to max out its ultra settings.

QUOTE
If you want to play Shadow of Mordor on ultra then you're going to need a pretty high end card as apparently there is a 6GB VRAM requirement, meaning that users may need a very specific GPU in order to play at full detail settings. To make things worse, that's the recommended requirement for 1080p, rather than 4K, a resolution that you would expect to eat up that much VRAM. Flagship cards from both Nvidia and AMD currently only ship with 4GB of VRAM, with the exception of the GTX Titans and some specific custom made GTX 980 and AMD R9 290X cards.
http://www.kitguru.net/gaming/matthew-wils...ultra-settings/ )

Why players should be upset about having 3.5GB rather than the promised 4GB VRAM:
http://www.guru3d.com/news-story/middle-ea...ess-test,7.html

QUOTE
Conclusion
Our product reviews in the past few months and their conclusions are no different after everything that has happened in the past few days; the product still performs the same as what we have shown you because, hey... it is in fact the same product. The clusterfuck that Nvidia dropped here is simple: they did not inform the media or their customers about the memory partitioning and the challenges it brings. Overall you will have a hard time pushing any card over 3.5 GB of graphics memory usage with any game unless you do some freaky stuff. The ones that do pass 3.5 GB are mostly poor console ports or situations where you game in Ultra HD or DSR Ultra HD rendering. In that situation I cannot guarantee that your overall experience will be trouble free; however, we have a hard time detecting and replicating the stuttering issues some people have mentioned.

The Bottom Line
Utilizing graphics memory beyond 3.5 GB can result in performance issues as the card needs to manage some really weird stuff in memory; it's nearly load-balancing. But the fact remains it seems to be handling that well, and it's hard to detect and replicate oddities. If you unequivocally refuse to accept the situation at hand, you really should return your card and pick a Radeon R9 290X or GeForce GTX 980. However, if you decide to upgrade to a GTX 980, you will be spending more money and thus rewarding Nvidia for it. Until further notice our recommendation on the GeForce GTX 970 stands as it was; for the money it is an excellent performer. But it should have been called a 3.5 GB card with a 512MB L3 GDDR5 cache buffer.

The solution Nvidia pursued is complex and not very graceful, IMHO. Nvidia needed to slow down the performance of the GeForce GTX 970, and the root cause of all this discussion was disabling that one L2 cluster with its ROPs. Nvidia could also have opted for other solutions: release a 3GB card and disable the entire ROP/L2 block and two 32-bit memory controllers. You'd have a very nice 3GB card and people would have known what they actually purchased. Even better, to avoid the L2 cache issue entirely, leave it enabled, leave the ROPs intact, and if you need your product to perform worse than, say, the GTX 980, disable an extra cluster of shader processors, twelve instead of thirteen. Simply enable twelve or thirteen shader clusters, lower voltages, and core/boost clock frequencies. Set a cap on voltage to limit overclocking. Good for power efficiency as well. We hope never to see a graphics card configured like this again, as it would get toasted by the media for what Nvidia did here. It's simply not the right thing to do.

Last note: right now Nvidia is in full damage control mode. We submitted questions on this topic early in the week to Nvidia US, specifically Jonah Alben, SVP of GPU Engineering. On Monday Nvidia suggested a phone call with him; however, due to appointments we asked for a Q&A session over email. To date, neither he nor anyone from the US HQ has responded to these questions for Guru3D.com specifically. Really, to date we have yet to receive even a single word of information from Nvidia on this topic. We do slowly wonder why certain US press is always so prioritized and cherry picked... Nvidia?

So if the market has 4GB VRAM cards, why settle for 3.5GB? Especially considering some of the graphics-intensive games we know are very demanding on VRAM. More so for gamers such as myself who buy the highest-end graphics card expecting to be able to play at ultra settings. This is where VRAM becomes important for those ultra settings to be playable.

And this is not yet even considering that people who bought the 970 bought it thinking it was 4GB, only to find out later it's effectively 3.5GB. Which is blatant false advertising, and lawyers are already contemplating a lawsuit against Nvidia because of it.

This post has been edited by Moogle Stiltzkin: Feb 9 2015, 11:55 AM
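To illustrate why that last 512MB matters in practice, here's a toy Python model of the 970's split memory pool. The ~196 GB/s and ~28 GB/s figures are rough numbers taken from the press coverage of the issue, and the blended-average formula is my own simplification; the real-world impact depends heavily on access patterns.

CODE
# Toy model of the GTX 970's partitioned VRAM as discussed above.
# Bandwidth figures are rough numbers from press coverage; the blend
# is a simplification for illustration only.

FAST_SEGMENT_GB = 3.5   # full-speed segment
SLOW_SEGMENT_GB = 0.5   # the contested last 512 MB
FAST_BW_GBPS = 196.0    # approx. bandwidth of the 3.5 GB segment
SLOW_BW_GBPS = 28.0     # approx. bandwidth of the 0.5 GB segment

def blended_bandwidth(vram_used_gb):
    """Crude weighted-average bandwidth for a given VRAM footprint."""
    fast = min(vram_used_gb, FAST_SEGMENT_GB)
    slow = min(max(vram_used_gb - FAST_SEGMENT_GB, 0.0), SLOW_SEGMENT_GB)
    if fast + slow == 0.0:
        return FAST_BW_GBPS
    # the more data spills past 3.5 GB, the lower the average gets
    return (fast * FAST_BW_GBPS + slow * SLOW_BW_GBPS) / (fast + slow)

for used in (3.0, 3.5, 3.8, 4.0):
    print(f"{used:.1f} GB in use -> ~{blended_bandwidth(used):.0f} GB/s blended")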
Feb 9 2015, 01:08 PM
Senior Member
562 posts Joined: Nov 2009 |
QUOTE(Moogle Stiltzkin @ Feb 9 2015, 11:45 AM)
To name a few.... Wolfenstein: The New Order (especially this game. Low VRAM at high settings causes the texture pop-in issue, a very obvious and annoying effect where textures are blurred while loading, only to suddenly pop in once loaded. It becomes disorienting.) Shadow of Mordor (if you think 4GB VRAM is enough, well, this game needs 6GB VRAM to max out its ultra settings. http://www.kitguru.net/gaming/matthew-wils...ultra-settings/ ) Why players should be upset about having 3.5GB rather than the promised 4GB VRAM: http://www.guru3d.com/news-story/middle-ea...ess-test,7.html So if the market has 4GB VRAM cards, why settle for 3.5GB? Especially considering some of the graphics-intensive games we know are very demanding on VRAM. More so for gamers such as myself who buy the highest-end graphics card expecting to be able to play at ultra settings. This is where VRAM becomes important for those ultra settings to be playable. And this is not yet even considering that people who bought the 970 bought it thinking it was 4GB, only to find out later it's effectively 3.5GB. Which is blatant false advertising, and lawyers are already contemplating a lawsuit against Nvidia because of it.

The 970 is not the highest end; the Titan is. Never settle for less if you're really serious about going ultra on everything; better to save up and get the best. Shadow of Mordor comes with preset settings for different GFX cards; it auto-detects the best setting. If you change those, you are going beyond the card's limit. I even played AC4: Black Flag on ultra.

Back to the topic of misinformation: yes, I believe NVIDIA is at fault for not doing proper tests, or for intentionally misleading consumers. I see this case like a car recall; they should just recall all the 970s and come out with a new hardware config that actually reflects the 4GB.
Feb 9 2015, 01:20 PM
All Stars
18,672 posts Joined: Jan 2003 From: Penang |
QUOTE(zizi393 @ Feb 9 2015, 01:08 PM)
The 970 is not the highest end; the Titan is. Never settle for less if you're really serious about going ultra on everything; better to save up and get the best. Shadow of Mordor comes with preset settings for different GFX cards; it auto-detects the best setting. If you change those, you are going beyond the card's limit. I even played AC4: Black Flag on ultra.
Back to the topic of misinformation: yes, I believe NVIDIA is at fault for not doing proper tests, or for intentionally misleading consumers. I see this case like a car recall; they should just recall all the 970s and come out with a new hardware config that actually reflects the 4GB.

Not only that, nVidia also claimed that customers are happy with the GTX 970, hence the low 5% return rate (due to the 3.5GB VRAM). This again is misinformation and somewhat of a lie, because nVidia did not tell manufacturers/retailers to accept returns. Even without officially letting customers return the card, a return rate of 5% still reflects quite badly on nVidia.
Feb 9 2015, 02:20 PM
#651
Senior Member
4,451 posts Joined: Jan 2003 |
QUOTE(zizi393 @ Feb 9 2015, 01:08 PM)
The 970 is not the highest end; the Titan is. Never settle for less if you're really serious about going ultra on everything; better to save up and get the best. Shadow of Mordor comes with preset settings for different GFX cards; it auto-detects the best setting. If you change those, you are going beyond the card's limit. I even played AC4: Black Flag on ultra.
Back to the topic of misinformation: yes, I believe NVIDIA is at fault for not doing proper tests, or for intentionally misleading consumers. I see this case like a car recall; they should just recall all the 970s and come out with a new hardware config that actually reflects the 4GB.

Yes, the 980 is the highest end, but the 970 is not bad either. So I would argue ultra settings at 1080p on a 24'' screen should be possible. For 27'' and above, a GTX 980 or even an SLI solution would definitely be recommended instead. So what I said about ultra settings being possible on this card is valid, but what I left out was the monitor size and resolution, which obviously would have to be lower in order to achieve that ultra setting.

The fact remains most people bought it as a 4GB VRAM product, which it isn't. Nvidia is now exposed to a public class action lawsuit, and rightfully so.

*Update, just to reclarify....

QUOTE
Let me clearly state this: the GTX 970 is not an Ultra HD card, it has never been marketed as such, and we never recommended even a GTX 980 for Ultra HD gaming either. So if you start looking at that resolution and zoom in, then of course you are bound to run into performance issues, but so does the GTX 980. These cards are still too weak for such a resolution combined with proper image quality settings. Remember, Ultra HD = 4x 1080P. Let me quote myself from my GTX 970 conclusion: "it is a little beast for Full HD and WQHD gaming combined with the best image quality settings", and within that context I really think it is valid to stick to a maximum of 2560x1440, as 1080P and 1440P are the real domain for these cards. Face it, if you planned to game at Ultra HD, you would not buy a GeForce GTX 970.

So when I say ultra, I don't mean Ultra HD RESOLUTION!!! Because obviously the GTX 970 and even the GTX 980 aren't Ultra HD cards (which is ironic, because that's the best you can get atm, e.g. the GTX 980). What I meant was ultra graphics settings in-game at an HD resolution of 1080p, or at 1920x1200 (which is what I use). That seems reasonable to me.

Here is a GTX 980 vs GTX 970 comparison with max graphics settings in Shadow of Mordor and Dragon Age: Inquisition:
http://www.extremetech.com/extreme/198223-...emory-problem/2
*Surprisingly, neither card managed to reach 60fps at max settings.

GTX 980 real-world testing (is 4GB VRAM enough? Games tested: DA:I and Shadow of Mordor):

QUOTE
My EVGA GTX980 also shows a slowdown on the Nai benchmark: RUN 1 RUN 2
I've not noticed any frame spikes when going over 3.5GB in games, but I've mainly been playing DA:I, which seems to cap out at 3GB. I just tried SoM (1920x1200 with everything on Ultra including textures) and it mainly hovered around 3.3GB, but there were no obvious hitches when it occasionally hit 4GB.

Find out more on the debate here:
http://www.reddit.com/r/pcgaming/comments/...allocation_bug/

For those curious how the 970 performs at Ultra HD resolution, just read here:
http://www.guru3d.com/news-story/middle-ea...tress-test.html

Bottom line: of the latest graphics cards on the market atm, from both the AMD and Nvidia camps, neither properly handles Ultra HD as a single-GPU solution. I didn't look at SLI/CrossFire yet, but most people don't run those expensive setups. Also, 4GB of VRAM is barely enough for some of the latest games, as Shadow of Mordor and Wolfenstein: The New Order have proven; we will be entering the realm of 6-8GB VRAM requirements soon.

But considering that HBM is around the corner, maybe that will help alleviate some of the memory issues somewhat?

This post has been edited by Moogle Stiltzkin: Feb 9 2015, 03:35 PM
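And just to show what "Ultra HD = 4x 1080P" means for memory, here's a quick back-of-the-envelope Python calculation of raw colour-buffer sizes. The 4 bytes per pixel and triple buffering are simplifying assumptions; textures, shadow maps and other render targets in a real game dwarf these numbers, which is why the VRAM requirements above climb so fast.

CODE
# Back-of-the-envelope colour-buffer sizes per resolution.
# 4 bytes/pixel and triple buffering are simplifying assumptions.

RESOLUTIONS = {
    "1080p (1920x1080)":    (1920, 1080),
    "1920x1200":            (1920, 1200),
    "1440p (2560x1440)":    (2560, 1440),
    "Ultra HD (3840x2160)": (3840, 2160),
}

BYTES_PER_PIXEL = 4   # 32-bit colour
BUFFER_COUNT = 3      # e.g. triple buffering, as mentioned earlier in the thread

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    mb = pixels * BYTES_PER_PIXEL * BUFFER_COUNT / (1024 ** 2)
    print(f"{name:>22}: {pixels / 1e6:4.1f} MP, ~{mb:4.0f} MB for colour buffers alone")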
Feb 9 2015, 03:24 PM
Senior Member
4,522 posts Joined: Apr 2006 |
QUOTE(zizi393 @ Feb 9 2015, 01:08 PM)
The 970 is not the highest end; the Titan is. Never settle for less if you're really serious about going ultra on everything; better to save up and get the best. Shadow of Mordor comes with preset settings for different GFX cards; it auto-detects the best setting. If you change those, you are going beyond the card's limit. I even played AC4: Black Flag on ultra.
Back to the topic of misinformation: yes, I believe NVIDIA is at fault for not doing proper tests, or for intentionally misleading consumers. I see this case like a car recall; they should just recall all the 970s and come out with a new hardware config that actually reflects the 4GB.

You are only looking at the surface. It's not uncommon for the most recent PC titles to be so terrible and unoptimized that even if you have a 980/Titan Black/780 Ti or an SLI config, you still can't run the game at its maximum settings.

Games auto-detecting the "best setting" has NEVER been accurate. I turn SoM on with a 970 SLI config and it detects 720p with medium settings for me on a 1440p screen. If you call that "best", then wow... SoM isn't the only game that does it; pretty much every single game I've played in the last 6 years has had this issue.
Feb 9 2015, 03:43 PM
#654
Senior Member
4,451 posts Joined: Jan 2003 |
QUOTE(marfccy @ Feb 9 2015, 03:38 PM)

Yeah, but the cost.... even I don't have a G-Sync monitor, unfortunately.

I use a Dell U2413 24'' GB-r LED AH-IPS widescreen
http://www.tftcentral.co.uk/reviews/dell_u2413.htm
mostly because I watch anime and play games. It would suck for me to have a TN panel. A pro gamer would be more biased towards a TN panel with a higher 120-144Hz refresh rate coupled with a G-Sync module. Can't say I blame them, but we each have our own priorities.

So it's either an AH-IPS or a TN G-Sync high-refresh-rate monitor, both for roughly the same price (both expensive, and yet you're still sacrificing one thing for another).

This post has been edited by Moogle Stiltzkin: Feb 9 2015, 03:45 PM
Feb 9 2015, 04:02 PM
Senior Member
4,254 posts Joined: Nov 2011 |
QUOTE(Moogle Stiltzkin @ Feb 9 2015, 03:43 PM)
Yeah, but the cost.... even I don't have a G-Sync monitor, unfortunately. I use a Dell U2413 24'' GB-r LED AH-IPS widescreen http://www.tftcentral.co.uk/reviews/dell_u2413.htm mostly because I watch anime and play games. It would suck for me to have a TN panel. A pro gamer would be more biased towards a TN panel with a higher 120-144Hz refresh rate coupled with a G-Sync module. Can't say I blame them, but we each have our own priorities. So it's either an AH-IPS or a TN G-Sync high-refresh-rate monitor, both for roughly the same price (both expensive, and yet you're still sacrificing one thing for another).

Well, remember what they said: high quality IPS panels don't exactly come cheap, not to mention the response times are unacceptable for gamers.

Frankly, G-Sync to me is good, but for a very specific purpose. Won't be getting it at all unless the monitor + module itself becomes affordable.
Feb 9 2015, 04:39 PM
#656
Senior Member
4,451 posts Joined: Jan 2003 |
QUOTE(marfccy @ Feb 9 2015, 04:02 PM)
Well, remember what they said: high quality IPS panels don't exactly come cheap, not to mention the response times are unacceptable for gamers. Frankly, G-Sync to me is good, but for a very specific purpose. Won't be getting it at all unless the monitor + module itself becomes affordable.

Good point, but I wonder when that will be. There's already a big question mark over how much those new HBM GPUs are going to cost, and how much VRAM they will have. All this new GPU/monitor stuff is $_$

This post has been edited by Moogle Stiltzkin: Feb 9 2015, 04:40 PM
Feb 9 2015, 05:08 PM
Senior Member
1,943 posts Joined: Apr 2005 |
QUOTE(Moogle Stiltzkin @ Feb 9 2015, 04:39 PM)
Good point, but I wonder when that will be. There's already a big question mark over how much those new HBM GPUs are going to cost, and how much VRAM they will have. All this new GPU/monitor stuff is $_$

4K TVs are getting cheaper and cheaper, but 1440p and 4K monitors are still expensive zzzz
Feb 9 2015, 05:15 PM
Elite
6,799 posts Joined: Jan 2003 |
QUOTE(terradrive @ Feb 9 2015, 05:08 PM)
4K TVs are getting cheaper and cheaper, but 1440p and 4K monitors are still expensive zzzz

Those 4K TVs are only 30Hz, dude.

Moogle Stiltzkin: Nvidia can bring down the cost of the G-Sync module by finalizing it as an ASIC rather than an FPGA.

SSJBen: The lawsuit will fail, because ultimately the card was priced way lower, and the option could be given for a trade-up to a 980 (which a lot of those people can't afford) or a downgrade to a possible 960 Ti with the full 4GB. If you could win against this... then what about Apple's 8GB/16GB/32GB/64GB devices, and the SSD/HDD makers, etc., with their loosely used terminology? Dual cards get away with advertising 8GB on the 295X2, for example, and 12GB on the Titan X, which is true... because it is on the cards. So I really don't see how this will play out, because ultimately the price was way lower than a 980 and that is reflected in its performance figures. The card works as it should for the price people paid for it.

For SoM... seriously, at 1440p, 4GB isn't enough. Already tested on a 980 system vs mine. Even Dying Light is much smoother, even though the texture streaming is capped at 3614MB only.

This post has been edited by cstkl1: Feb 9 2015, 05:28 PM
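On the 30Hz point, the frame-time arithmetic alone shows why a 30Hz 4K TV is a poor gaming panel next to 60/120/144Hz monitors; a tiny worked example (pure arithmetic, nothing assumed beyond the refresh rates themselves):

CODE
# Time between refreshes at common refresh rates.
for hz in (30, 60, 120, 144):
    print(f"{hz:3d} Hz -> {1000 / hz:5.2f} ms between refreshes")
# 30 Hz leaves ~33 ms between updates -- double the 60 Hz gap and four
# times the ~8.3 ms of the 120 Hz example quoted earlier in the thread.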
Topic Closed