NVIDIA GeForce Community V14
ruffstuff
post Mar 20 2015, 09:48 PM

QUOTE(Moogle Stiltzkin @ Mar 20 2015, 08:06 PM)
True, but the articles I read already displayed the price as set, and there is a clear difference in pricing.

Anyhow, there was one area I overlooked: how FreeSync and G-Sync each handle the minimum refresh rate.
[image: FreeSync vs G-Sync minimum refresh rate comparison]
A much more in-depth G-Sync vs FreeSync article:
http://www.pcper.com/reviews/Displays/AMD-...utside-VRR-Wind
But simply put:

Above the maximum refresh rate
FreeSync and G-Sync behave the same, except G-Sync is locked to vsync on, whereas FreeSync lets the user decide. So the question is whether a high fps is enough to keep tearing from being noticeable. We need reviews. Personally I'm sensitive to tearing, even little bits.

Below the minimum refresh rate

FreeSync lets you enable or disable vsync; I've heard disabling vsync is better in this situation. G-Sync, on the other hand, uses a different method than vsync, but whether it's superior I don't know. Still reading up on this.
*
FreeSync negotiates the refresh rate range beforehand. Thus, you have to set the min and max refresh rate, assuming your graphics card is capable of that min and max fps. G-Sync, by contrast, handshakes across the whole refresh rate range, and that happens all the time.

AMD claims their method has no latency hit.
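As a rough picture of what happens below the window: PCPer's testing suggests the G-Sync module redraws the previous frame often enough to keep the panel inside its supported range. A minimal sketch of that idea (my own illustration of a simple repeat-count rule, not NVIDIA's actual logic; the 40-144 Hz panel is hypothetical):

[code]
# Minimal sketch of the below-window behaviour PCPer describes for
# G-Sync (my illustration, not NVIDIA's actual algorithm): when fps
# falls under the panel's VRR minimum, redraw each frame enough times
# that the effective scan rate lands back inside the window.

def panel_scan_rate(fps: float, vrr_min: float, vrr_max: float) -> float:
    """Rate (Hz) the panel is actually driven at for a given game fps."""
    if fps >= vrr_min:
        # Inside the VRR window: the panel refreshes in step with the game.
        return min(fps, vrr_max)
    # Below the window: repeat each frame n times so fps * n >= vrr_min.
    n = 2
    while fps * n < vrr_min:
        n += 1
    return fps * n

# Hypothetical 40-144 Hz panel: a 35 fps game gets each frame drawn twice,
# so the panel scans at 70 Hz instead of dropping out of the window.
print(panel_scan_rate(35, 40, 144))  # 70.0
[/code]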
ruffstuff
post Mar 23 2015, 08:33 PM

I don't think there will be a 980 Ti. The Titan X is essentially a 980 Ti with Titan branding and pricing. It is nothing like the previous Titan; the missing double precision says it is really a 980 Ti.
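For context, the double-precision gap is easy to put numbers on. A quick back-of-envelope comparison (commonly quoted FP32 throughput figures, approximate; the arithmetic is mine):

[code]
# Rough numbers behind the double-precision point (publicly quoted
# FP32 throughput, approximate; my arithmetic). The original GK110
# Titan ran FP64 at 1/3 of FP32; GM200's Titan X only does 1/32.
gtx_titan_fp32 = 4.5   # TFLOPS, GTX Titan (GK110)
titan_x_fp32   = 6.6   # TFLOPS, Titan X (GM200)

print(f"GTX Titan FP64: {gtx_titan_fp32 / 3:.2f} TFLOPS")   # ~1.50
print(f"Titan X   FP64: {titan_x_fp32 / 32:.2f} TFLOPS")    # ~0.21
[/code]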
ruffstuff
post Mar 26 2015, 01:01 PM

QUOTE(cstkl1 @ Mar 26 2015, 11:33 AM)
Doubt it. Nvidia controls the Titans, this time even more: all Titans come from them.

Bro, cyntrix takes CS very seriously. Hmm, should write to them.

Yeah, but this time Zotac HQ was the one delaying, AFAIK, because of the APAC thing. We should see their cards soon.

I have had more early-adopter regret than buy-it-later regret, so I'm not going to fall for that again.

So in my case, I'm waiting it out for the two units.

On another note, Acer has started selling their IPS 1440p 144Hz monitor worldwide.
*
Acer Predator XB270HU is coming to Malaysia in April.
ruffstuff
post Apr 2 2015, 12:20 PM

QUOTE(cstkl1 @ Apr 1 2015, 04:54 PM)
Lol. Everything I predicted about FreeSync was true.

AMD lied, and is now blaming scaler/panel limitations.

As I said, Nvidia said G-Sync wasn't easy. They just took a backseat because they knew AMD would fail.
*
If you read through the Adaptive-Sync white paper, it has a similar approach to G-Sync when frames drop below the panel's VRR window.
I believe FreeSync is only part of the Adaptive-Sync specification, not a full implementation of the spec. I think they don't want to force panel manufacturers to produce a new scaler with a frame buffer, which would eventually narrow the cost gap between FreeSync and G-Sync.
ruffstuff
post Apr 2 2015, 04:16 PM

QUOTE(cstkl1 @ Apr 2 2015, 03:05 PM)
Dude, no no no. To make this simple for you:

Adaptive-Sync is just variable refresh rate, which is what scaler companies implemented within the VESA spec extension.
That's all. Just a range for VRR. Going below and above it has nothing to do with the spec, as that's a gaming issue. It's the same for static monitors.

The implementation of it for use with AMD GPUs is FreeSync.
The current issue is the implementation for use with gaming. Adding a frame buffer and tuning each panel's voltages is not in the VESA extension spec; it's a FreeSync issue. AMD is not involved at the hardware level of the scaler/monitor, aka zero $$$ as usual. Scaler manufacturers just did it for their gaming-series monitors because it was within the eDP spec. They are not going to get involved with additional hardware for gaming. AMD has to do it via the driver or start making their own module.

G-Sync is the combination of both at the hardware level, embedded into the panel.

AMD never realized why Nvidia went this route. Now they know. Of course Nvidia knew from day one of the eDP spec that it wasn't suitable for gaming via driver intervention. Dude, they have more people working on this than AMD's R&D and most probably explored this route thoroughly. Haven't you realized the Swift was the first G-Sync monitor for a reason? Their certification process for a monitor is very stringent. AMD, however, is happy as long as the monitor is compliant with the eDP spec.

Did you know it's very rare for Nvidia to talk about AMD? This time they did. And here I am laughing: I told you so. AMD in the end blamed the scaler. Then Nvidia explained why G-Sync is different from FreeSync below the VRR window. AMD clearly didn't put enough manpower into this to realize the flaws for gaming.

Hence why Nvidia from day one labeled G-Sync monitors as gaming monitors. ULMB is an Nvidia feature as well. Now back to laughing at AMD.

Btw, hmm, Nvidia is up to something with the Titan X. It seems nerfed in SLI in a lot of games. A lot of X99 users are saying X79 had better scaling in 3-4 way. The issue started with the 980, and it seems the Titan X inherited the same trend.

Nerfed in prep for the 390X??

Current flaws of G-Sync: with SLI, DSR is disabled. ULMB won't work with G-Sync. And the flickering thing on spikes from zero and back up again, though I heard all VRR monitors suffer from this.
*
I have nothing to argue with in what you wrote. But this is what I have to say:

Nvidia knows that eDP alone cannot do what G-Sync does outside the VRR window. Thus, I was skeptical about people claiming FreeSync is the same thing as G-Sync. It is true within the VRR window, but the story changes outside it. This is where G-Sync shines.

The PCPER guys discovered this fact (weird that Nvidia did not actually announce it, even though Tom Petersen did hint at it in a recent interview). However, I later found out that the VESA Adaptive-Sync spec takes a similar approach to G-Sync outside the VRR window.

Let me just quote the VESA white paper: "If the game's framerate drops below the refresh rate of the display (eg. during a short period of intensive action), then the new frame will not be ready in time for the display's blanking interval, and the previous frame is repeated on the display."

From my understanding, this is what G-Sync does, based on what we've gathered from PCPER's testing. This is the Adaptive-Sync spec. It is weird enough that FreeSync didn't do this in the test.

Thus I came to the conclusion (my thoughts) that FreeSync is not a full adaptation of Adaptive-Sync; it is just part of the Adaptive-Sync standard. I think you may have caught that mobile G-Sync thing where there is no G-Sync module on the panel itself. I guess this mobile G-Sync works the same as current FreeSync: they make use of the Adaptive-Sync spec, but once it goes outside the VRR window, it stutters/tears.

Also, a discussion on Tek Syndicate also agreed that FreeSync is not a full implementation of Adaptive-Sync.

The white paper link: http://www.vesa.org/wp-content/uploads/201...aper-140620.pdf
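To make that white-paper sentence concrete, here's a toy model (my own illustration, not real scaler firmware) of "the previous frame is repeated on the display":

[code]
# Toy model of the white-paper sentence (my illustration, not scaler
# firmware): at each refresh deadline, scan out the new frame if it
# arrived in time, otherwise repeat the previous one.
def next_scanout(new_frame, previous_frame):
    """new_frame is None when the GPU missed the blanking deadline."""
    return new_frame if new_frame is not None else previous_frame

# A game delivering frames at half the panel's minimum rate: every
# other deadline misses, so every other refresh repeats the old frame.
deadlines = ["f0", None, "f1", None, "f2", None]  # hypothetical timeline
shown, last = [], "blank"
for f in deadlines:
    last = next_scanout(f, last)
    shown.append(last)
print(shown)  # ['f0', 'f0', 'f1', 'f1', 'f2', 'f2']
[/code]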


ruffstuff
post Apr 2 2015, 04:53 PM

QUOTE(cstkl1 @ Apr 2 2015, 04:35 PM)
Again, it's not a VESA issue. It's the implementation of driver support with the GPU. I did read that spec, and it's just a showcase slide of a tech that depends on a theoretical driver implementation on the GPU side. There are no such drivers atm.

Scaler companies, GPU manufacturers... each is doing one half of the coin, with only one side really putting money into it.

That's why I am telling ya: Adaptive-Sync is complete. The implementation in gaming is a FreeSync issue. That's what they meant. How it handles below the VRR window is a FreeSync issue. There will be no hardware buffer from the scaler side.

Nvidia is known to offload things to dedicated hardware rather than make them a driver feature. Same thing they did with frame pacing, which, btw, AMD didn't even know was an issue until people started testing for it.

Mobile Adaptive-Sync: check what Nvidia said about it. If a set of driver instructions were ideal for gaming, we would have seen it by now, as 99% of gaming notebooks out there are powered by Nvidia. People are just guessing based on the alpha driver, which is just a mirror implementation of FreeSync. I doubt Nvidia will crack that buffer issue via driver and release it only for AMD to copy. I think Nvidia will release it at the hardware level, built into their GPU. This is my assumption, seeing how they do business and protect their investment.

There's too much fanboyism on the AMD side. Too many people believe in their world of cheap-stuff ideal gaming while being hoodwinked by slow driver updates, bad reference cooling, terrible power consumption...

Look at spending dough for support: you'd rather pay more to a company that will put that extra dough to good use.

Btw, there's no better forum than OCN atm. Everybody is there.
*
I never said it is a VESA issue. It is a FreeSync issue.

Agreed. The fanboyism in the AMD camp is the worst. They are so defensive about the product. I, on the other hand, will still say what's bad about Nvidia.
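On the frame-pacing point in the quote above: what reviewers were measuring is roughly frame-time jitter. A minimal sketch of that metric (my own illustration with made-up numbers; tools like FCAT capture at the display output instead):

[code]
# Rough frame-pacing check (my illustration with made-up frame times;
# FCAT measures at the display output, this just looks at deltas).
import statistics

frame_times_ms = [16.7, 16.9, 33.1, 8.4, 16.8, 17.0, 33.0, 8.5]

deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
print(f"mean frame time  : {statistics.mean(frame_times_ms):.1f} ms")
print(f"frame-time jitter: {statistics.mean(deltas):.1f} ms")
# Evenly paced output has jitter near zero; the alternating ~33/8 ms
# pattern here is the micro-stutter signature frame pacing addressed.
[/code]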

This post has been edited by ruffstuff: Apr 2 2015, 04:54 PM
ruffstuff
post Apr 2 2015, 05:12 PM

QUOTE(Unseen83 @ Apr 2 2015, 05:09 PM)
Really? AMD fanboyism is the worst? Okay, but when you post FreeSync things on the AMD thread everyone seems to ignore you; it's just me and Acid who reply (I talked about the price difference, while Acid kind of agreed with your point and the review on YouTube). Or which posts did I miss where the worst AMD fanboyism was being defensive? Eh, I don't really see the point of FreeSync anyway; I'm playing my humble games on my humble 4K with no issues.
*
It is not in the LYN AMD thread. I just pointed out the hard facts for AMD users, to share the information.
It is everywhere else outside LYN: in tech forums, the AMD camp is being too defensive and simply accuses reviewers of being biased even when hard, cold facts are put in front of them.
ruffstuff
post Jun 26 2015, 10:06 AM

QUOTE(llk @ Jun 26 2015, 10:04 AM)
Got this yesterday
*
How much is a reference 980 Ti now?
ruffstuff
post Jun 30 2015, 09:10 AM

QUOTE(vincetee06 @ Jun 30 2015, 07:59 AM)
Hello guys, I'm stuck between an UltraWide monitor and a 4K G-Sync one. If you own a 980 Ti, which one would you prefer for gaming?
*
1440p IPS G-Sync. ASUS PG279Q.
ruffstuff
post Jun 30 2015, 03:18 PM

QUOTE(stringfellow @ Jun 30 2015, 10:41 AM)
Acer 4K 60Hz G-Sync if you need it now, Asus PG278AQ 4K 60Hz IPS G-Sync if you can wait for it to release.

Put it this way: if it were me and I'd never seen 4K before, an UltraWide would be okay. But once I've seen 4K, it'll be hard for me to go back to anything of lower resolution, even if it's 144Hz. It's no small feat to get a game to run at 144 frames per second, and it may require you to drop game settings from Ultra or Very High to lower, and personally I don't game any lower than those two settings. Personal preference, of course.

Also personal preference: if you had to go immersive gaming, an UltraWide is better than putting three monitors together for Surround. Personally, bezels on monitors take away immersion for me.

Most important of all, whichever monitor you choose, get G-Sync.

Most important of all, whichever monitor you choose, get G-sync.
*
Is 90Hz 4K IPS G-Sync too much to ask? I hope PC monitors can get 4K at 90Hz as standard in the future.

My preference right now is to game above 60fps, so my choice is a 1440p IPS monitor.
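For what it's worth, the 90Hz wish runs straight into cable bandwidth. A rough check (my arithmetic, assuming ~20% blanking overhead and 24-bit colour) against DisplayPort 1.2's ~17.28 Gbit/s data rate:

[code]
# Why 4K 90 Hz wasn't standard in 2015 (back-of-envelope, my arithmetic;
# the ~20% blanking overhead is an assumption, colour depth is 24-bit).
width, height, hz, bpp = 3840, 2160, 90, 24
blanking_overhead = 1.2

gbit_s = width * height * hz * bpp * blanking_overhead / 1e9
print(f"needed : {gbit_s:.1f} Gbit/s")        # ~21.5
print("DP 1.2 : 17.28 Gbit/s after 8b/10b")   # hence 4K capped at 60 Hz
[/code]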
