You DON'T NEED 100+ fps when you have G-Sync running. That's the beauty of G-Sync: it synchronizes and sequences low frame rates, so slow frames are kept in the G-Sync module buffer and pushed out at the correct interval, and the end result looks smooth on the display. If you strive for 100+ fps at 4K, or even at max settings at 2K, no single graphics card in existence can produce that on modern games.
My personal settings: at 2K I enable SMAA at most and still get above 60fps; Doom regularly stays above 80fps and drops to the low 70s when the action gets intense. Now, 70-80fps isn't shitty or out of sync on a 100Hz or 144Hz display with G-Sync, because G-Sync sequences those frames through the G-Sync module buffer and displays them with near-indistinguishable frame-times.

The reason most folks are TARGETING close to 100Hz or 144Hz is that they want the game to perform as close as possible to what the display panel is natively rated for. The problem is that you then have to drop either quality settings or resolution. Which is why most folks use this as a reason not to invest in high-end graphics cards until <insert whatever year>, but as modern game engines progress, more complexity (geometric detail, textures, etc.) gets added, and you're back to square one: your most powerful card once again cannot reach your precious 100-144Hz framerate. G-Sync eliminates the need to chase that 100-144Hz target; you can be at 40fps one moment and 110fps the next, and G-Sync syncs it all back to the display panel's refresh.
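To see why a sub-refresh framerate feels smooth with variable refresh but juddery without it, here's a toy model I sketched up (my own simplification, not how the G-Sync hardware is actually implemented; it ignores buffering details and assumes render times longer than one refresh interval). With fixed 60Hz + V-Sync, a finished frame waits for the next vblank, so a perfectly steady 24ms render time gets presented at alternating ~16.7ms/~33.3ms intervals. With variable refresh, the panel scans out each frame the moment it's ready:

```python
import math

REFRESH_HZ = 60
VBLANK = 1000.0 / REFRESH_HZ  # ~16.67 ms between fixed scanouts

def present_fixed(render_times_ms):
    """Fixed refresh + V-Sync: each finished frame waits for the NEXT vblank."""
    done, out = 0.0, []
    for t in render_times_ms:
        done += t                                      # frame finishes rendering
        out.append(math.ceil(done / VBLANK) * VBLANK)  # shown at next vblank
    return out

def present_vrr(render_times_ms):
    """Variable refresh (G-Sync style): panel scans out as soon as a frame is ready."""
    done, out = 0.0, []
    for t in render_times_ms:
        done += t
        out.append(done)  # shown immediately
    return out

def intervals(stamps):
    """Time between consecutive presented frames, in ms."""
    return [round(b - a, 2) for a, b in zip(stamps, stamps[1:])]

steady_24ms = [24.0] * 6  # a rock-steady ~41 fps
print("fixed 60Hz:", intervals(present_fixed(steady_24ms)))  # alternates 16.67 / 33.33
print("VRR:       ", intervals(present_vrr(steady_24ms)))    # steady 24.0
```

Even with identical render times, the fixed-refresh panel shows some frames twice and others once, which is the stutter you perceive; the VRR panel's presentation cadence simply matches the GPU's.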
So, even back on my previous 4K60 G-Sync panel, I could game at 35fps on the most punishing GTAV settings on a single card (a 980Ti, before I sold it), and it still looked as smooth as if it were running at the 60fps the panel is rated for. I have since moved this rig to a 4K TV with a 60Hz refresh rate (and no G-Sync); I strive for 60fps when I can, but now I can see it when it drops to 35fps: the stutter and jerkiness.
In short,
On a 60Hz panel, cap @ 59.7 fps.
On a 144Hz panel, cap @ 141-143 fps.
Reasoning: NVCP (NVIDIA Control Panel) automatically enables V-Sync on every computer restart. As soon as you hit 60.0/144.0 fps, traditional V-Sync kicks in and G-Sync disables, introducing input lag. You want G-Sync running 100% of the time, and the only way to accomplish this is to make sure you never hit 60.0 or 144.0 fps.
You WANT to game at a lower frame rate (to a certain point) so that G-Sync covers your stutter at those framerates. You DO NOT want framerates HIGHER than your G-Sync panel's refresh rate, because V-Sync will automatically kick in and introduce input lag.
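In practice you'd set the cap with the game's own limiter or a tool like RTSS, but the core of any frame limiter is just a deadline-based sleep. Here's a minimal sketch (my own illustration, not any real tool's code; `render_frame` is a hypothetical callback, and the 141fps default comes from the quoted advice above):

```python
import time

def run_capped(render_frame, cap_fps=141.0, frames=500):
    """Run the render loop, sleeping so it never exceeds cap_fps.

    Keeping the cap just below the panel's refresh rate is what keeps
    the framerate from ever hitting 144.0, so G-Sync stays engaged.
    """
    frame_budget = 1.0 / cap_fps            # ~7.09 ms per frame at 141 fps
    next_deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        next_deadline += frame_budget
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)           # wait out the rest of the budget
        else:
            # rendering fell behind the cap; reset the deadline instead of
            # rushing frames to "catch up"
            next_deadline = time.perf_counter()
```

The accumulated-deadline approach (rather than a flat `sleep(frame_budget)` each frame) keeps the average rate at the cap even when individual frames render at slightly different speeds.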
My own preference: at 3440x1440 I can live with SMAA. At 4K I don't even need any AA; the pixels are so fine and small that such high pixel density gives you free AA. Purposely killing your system with TSSAA or maximum AA, with no appreciable gain on high-pixel-density displays, is just making an excuse that the current crop of graphics cards isn't capable enough at these resolutions.
This post has been edited by stringfellow: Jun 16 2016, 03:41 PM