
 NVIDIA GeForce Community V14

cstkl1
post Mar 19 2015, 12:22 PM


QUOTE(empire23 @ Mar 18 2015, 09:43 PM)
You're generally right.

Generally the use of DP, or FP64, is quite limited in the sense that games do not need the precision offered by a 64-bit floating-point number.

Very advanced physics packages, finite element analysis, numerical modelling, Monte Carlo simulations and mesh simulation make up the primary users of high-precision floating-point math, in addition to GPGPU. The only form of usable graphics that would use such precision for most rendering would be ray tracing (standard or recursive), and of course your usual 3D rendering packages.

As for implementation in games, you're almost always better off using multiple passes/samples/offsets of an FP32 calculation. FP64 can be used, but it's seriously rare. That's why Nvidia chose to go with a ratio that meant the GTX 980 could only churn out 100 GFLOPS whereas the R9 290 could churn out 600. They decided that as a gaming-centric product line they were always better off focusing on FP32.
*
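A minimal sketch (not from either poster, just an illustration) of the FP32 vs FP64 gap described in the quote above: summing a small step many times drifts visibly in single precision while double precision stays essentially exact. The function name and step value are arbitrary.

```python
import numpy as np

def accumulate(dtype, step=0.1, iterations=1_000_000):
    """Sum `step` repeatedly at the given floating-point precision."""
    total = dtype(0.0)
    step = dtype(step)
    for _ in range(iterations):
        total += step
    return float(total)

if __name__ == "__main__":
    exact = 0.1 * 1_000_000            # 100000.0, the mathematically exact sum
    fp32 = accumulate(np.float32)      # drifts visibly away from 100000.0
    fp64 = accumulate(np.float64)      # agrees with the exact sum to many digits
    print(f"exact={exact}  fp32={fp32}  fp64={fp64}")
```

The same trade-off is why games stack multiple FP32 passes instead of paying the FP64 throughput penalty.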
Eh, you took up being a mod again? Lol, thought you were done with LYN.

The guy got out and was dragged back in...

Thanks bro.

cstkl1
post Mar 19 2015, 12:24 PM


QUOTE(Str33tBoY @ Mar 19 2015, 11:04 AM)
Size is not a problem...
So the Palit JetStream is better than the Gainward Phantom & Leadtek Hurricane?
Btw...
How about Galax?
The EXOC also seems nice...
*
Unless things have changed a lot in the last few years:

Palit = Gainward.
cstkl1
post Mar 19 2015, 03:29 PM


QUOTE(Skidd Chung @ Mar 19 2015, 02:34 PM)
Palit = Galaxy, Yuan, Gainward, Vvikoo, XpertVision and Daytona

Biggest graphic card vendor in the world.
*
Duh.

PCB.

Palit JetStream = Gainward Hurricane


cstkl1
post Mar 19 2015, 03:36 PM


QUOTE(Minecrafter @ Mar 19 2015, 03:33 PM)
Gainward Hurricane or Leadtek Hurricane?
*
Lol, Phantom.

These naming schemes nowadays...

Why can't they just do it like the EVGA/Asus schemes? Consistent, and we know the difference.


cstkl1
post Mar 20 2015, 10:32 AM


You cannot use Adaptive-Sync, as it's an extension of DP 1.2 which Nvidia won't support unless it becomes a mandatory spec in 1.3.

Also, the Titan X cards are overclocking nuts: at stock they reach 1400, and just now 1500 on an ASIC of 87.

AFAIK nobody can overvolt yet; awaiting updates to Precision and Afterburner.

As for the fan profile: all the reviewers are idiots running the default 55% max fan while overclocking, and those same idiots never tested the VRAM usage of SOM. At 3840x1440 ultra it is 7.2GB. First time SOM ran without a hitch.

Can't wait for my cards to test Watch Dogs at 8x MSAA supersampling with SweetFX and mods.
cstkl1
post Mar 20 2015, 01:30 PM


QUOTE(Moogle Stiltzkin @ Mar 20 2015, 01:17 PM)
I understand it's optional, but Nvidia will earn their fanbase's ire if they purposely force users to use G-Sync as their only option.

Fans may have ignored the lack of Mantle because there is DirectX 12, but I don't think they will easily ignore this when it comes to a lack of FreeSync support, not when a G-Sync monitor costs $50 more.
What do you think about MFAA x2 or x4? I heard it's equivalent in quality to MSAA x2 and x4 but with less of an FPS performance cost.
http://www.extremetech.com/gaming/194629-a...-frame-rate-hit
*
No point in me talking about that, as SLI cannot use it.
Same thing: no DSR with G-Sync.


cstkl1
post Mar 20 2015, 03:37 PM


QUOTE(marfccy @ Mar 20 2015, 03:30 PM)
TechPowerUp gave a small conclusion on that part; so far it ate close to 6GB of VRAM.

Then COD AW ate about 7+GB?
[chart image]
*
Those dumbasses are totally wrong. Based on SOM, it's 6GB at 1440p, and so is Wolfenstein. FC4 is 5GB+.

So far the highest is ACU at 8x MSAA: 11.2GB. Watch Dogs should be the same.

These reviewers don't game, they just run a few minutes of benchies.

Actual VRAM usage... one should game and see how it fluctuates and whether it causes things like hitching.

This post has been edited by cstkl1: Mar 20 2015, 03:38 PM
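For what it's worth, logging actual in-game VRAM the way described above (watching how it fluctuates rather than running a few minutes of benchies) can be done by polling nvidia-smi. A rough sketch, assuming nvidia-smi is on PATH and GPU 0 is the card of interest:

```python
import subprocess
import time

def query_vram_mib(gpu_index=0):
    """Ask nvidia-smi for (used, total) VRAM in MiB on the given GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", "-i", str(gpu_index),
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = (int(v) for v in out.strip().split(","))
    return used, total

if __name__ == "__main__":
    peak = 0
    try:
        while True:                      # poll once a second while you game
            used, total = query_vram_mib()
            peak = max(peak, used)
            print(f"VRAM {used}/{total} MiB  (peak so far {peak} MiB)")
            time.sleep(1.0)
    except KeyboardInterrupt:
        print(f"Peak VRAM observed: {peak} MiB")
```

Note that allocated VRAM isn't the same as VRAM the game strictly needs, which is part of why the review numbers disagree so much.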
cstkl1
post Mar 20 2015, 04:26 PM


QUOTE(marfccy @ Mar 20 2015, 04:02 PM)
Could be. I remember I couldn't run COD AW due to some settings;

it'll instantly crash.
Dumbass is quite harsh, they're reviewers after all; there should be some truth to what they're reviewing.

But I do agree, I think SOM takes more VRAM than just 5GB,

though other reviewers seem to be stuck in the 5-6GB range.
*
They are all BS, based on those few games. The Titan X thread already confirmed 7.2GB at 1440p for SOM, among others.

Example: Evolve @ 1440p. I get 100-120fps all maxed out, VRAM at 5.3GB. Go check what the reviews say about 980 SLI.

Dying Light is 5.5GB, but that's because the engine's texture settings are GPU-type and VRAM dependent: it customises and optimises when it detects your GPU, so max VRAM usage doesn't represent anything.

They are dumbasses and misleading, especially since the thermal throttling is based on the default 55% max fan and the overclocking to date is with stock voltage.

Dude, if you saw, as I do, how misleading these reviews are, or what a poor job they do...

Let's see: one of the most important features was whether DisplayPort 1.3 was on the card. Nobody talked about it, and we had to wait till the Nvidia spec came out.

Hence why I'm frustrated. In fact the owners' club is where actual performance numbers are being discussed. Same thing during OG/TB.

This post has been edited by cstkl1: Mar 20 2015, 04:26 PM
cstkl1
post Mar 20 2015, 04:40 PM


QUOTE(Moogle Stiltzkin @ Mar 20 2015, 04:06 PM)
Sums up FreeSync I think
[image]

FreeSync + vsync will cap FPS to the refresh rate your monitor supports, e.g. a 60Hz monitor will cap at 60fps.

FreeSync + vsync disabled: no cap, so fps can be much higher, but there is a possibility of screen tearing.
So the way I see it, if you have a 144Hz monitor, then you can have vsync on and it would only cap at 144fps, rather than 60 if you had gotten a 60Hz monitor.

But then you'd be getting a TN panel monitor rather than, say, an AH-IPS. So unless you take your gaming seriously, not everyone's going to be willing to make that trade-off.
Overall, FreeSync to me personally seems like a success: vsync without stutter or any performance hit (the exception being that your monitor refresh rate is the bottleneck now, which is acceptable, because you can buy a 144Hz or 60Hz monitor according to your preferences).

This is how FreeSync works on a 144Hz monitor with vsync on/off:
[image]

Seems like leaving vsync on makes sense, because it has zero tearing and the performance difference is negligible.
You can read the full review of FreeSync here:
http://www.techspot.com/review/978-amd-freesync/
PS:
So I guess when buying a FreeSync monitor, you'd be looking not only at the maximum refresh rate, but also the minimum.
*
No idea why you're posting Adaptive-Sync tech in this thread.
Anyway, those slides are just the green turned red, with the Nvidia logo turned into an AMD one.

FreeSync works within a certain range. Above the max refresh rate it goes back to vsync.

As for why IPS vs TN and the low input lag, in layman's terms: combined with a high-DPI mouse and motion blur disabled, the texture/fidelity of the graphics isn't distorted when your movement is way faster than the environment. The benefit: you're shooting the guy with very high accuracy even when you're jumping down, thrown back, etc. Try playing Evolve.

Adaptive-Sync, as of now, has a higher minimum refresh rate than G-Sync.

As for the IPS benefit and difference, you need to game on it for an extended period to tell.

This post has been edited by cstkl1: Mar 20 2015, 04:42 PM
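A toy sketch (my own simplification, not AMD's or Nvidia's implementation; the window limits here are made up) of the range behaviour described above: inside the panel's VRR window the refresh tracks the frame rate, above it you fall back to vsync at max refresh (or tear with vsync off), and below it variable refresh stops helping.

```python
def effective_refresh(fps, vrr_min=40, vrr_max=144, vsync=True):
    """Refresh rate the panel ends up at for a given frame rate (toy model)."""
    if fps > vrr_max:
        # Above the window: vsync on caps you at the max refresh,
        # vsync off runs uncapped and tearing can come back.
        return vrr_max if vsync else fps
    if fps < vrr_min:
        # Below the window the panel can't refresh that slowly, so behaviour
        # falls back to whatever the implementation does (this model just clamps).
        return vrr_min
    return fps  # inside the window: refresh simply follows the frame rate

for fps in (30, 45, 90, 144, 200):
    print(f"{fps:3d} fps -> panel at {effective_refresh(fps)} Hz")
```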
cstkl1
post Mar 20 2015, 04:48 PM


QUOTE(Moogle Stiltzkin @ Mar 20 2015, 04:41 PM)
I think it's relevant. Do Nvidia users want to spend more on G-Sync rather than FreeSync, which is much cheaper? Hence the discussion :/
*
Nvidia is not supporting it, and Adaptive-Sync carries a premium as well, just not as much as G-Sync.

There are no monitors out, and recommending one, or even considering one, at this point is kind of foolish. We don't know what the impact of the scaler drivers on the monitor is.

AMD themselves haven't launched any Adaptive-Sync drivers AFAIK.

In that sense G-Sync is working and doing great, with ULMB as well.
Btw, 120Hz IPS panels are not 3D monitors.

Btw, AFAIK Adaptive-Sync doesn't have ULMB, so it will be interesting what the desktop experience is going to be like.

This post has been edited by cstkl1: Mar 20 2015, 04:49 PM
cstkl1
post Mar 20 2015, 05:03 PM


QUOTE(Moogle Stiltzkin @ Mar 20 2015, 04:41 PM)
I think it's relevant. Do Nvidia users want to spend more on G-Sync rather than FreeSync, which is much cheaper? Hence the discussion :/

PS: I've used Nvidia GPUs for a long while now. Now I wonder if I should consider ATI for my next GPU... depends on what Nvidia is going to do about FreeSync support. Mantle wasn't that big a deal to me because DX12 was on the horizon, but FreeSync is a big deal.
I don't play Evolve but I did watch YouTube. None of the FPS genre has interested me in a long while, got bored of that genre lelz. Blizzard's Overwatch looks interesting though.
Not sure I fully understand FreeSync.

Seems like if it goes above the monitor refresh rate, it will automatically use vsync if you had that enabled?

But what about if it's below the minimum refresh rate? Because the reviews say it doesn't work too well under the refresh rate, so is there an option to disable it when it dips below? So two different settings: above, vsync enabled, but below it would be disabled. I wonder if that is the case.
I'm no fanboy, but looking at the facts... there are indeed legitimate questions here.
By the way, my monitor, a Dell U2413, says it has DisplayPort 1.2a. Does that mean it can support FreeSync?
*
http://www.anandtech.com/show/7582/nvidia-gsync-review/2

Google is your friend: no, Adaptive-Sync is not the standard. This is why I said there's no point talking about it until it is here.

As for your monitor, you need a scaler update. Adaptive-Sync is not free. The name implied just that, and people thought the idea was free. Nothing is free here, folks. Scaler companies will charge a premium.

Dude, you need to do a lot of googling before posting, when all the answers are right there.

The 1.3 spec is out and it's on a field run. Adaptive-Sync is a premium feature. As for future proofing... come on, 4K IPS 120Hz is on the horizon. Read more about G-Sync and then you'll see why I'm sceptical about Adaptive-Sync. I would caution anyone not to buy Adaptive-Sync until it matures. Scaler companies and AMD don't have the resources that Nvidia and Asus had to make G-Sync work.

This post has been edited by cstkl1: Mar 20 2015, 05:07 PM
cstkl1
post Mar 20 2015, 05:12 PM


QUOTE(Moogle Stiltzkin @ Mar 20 2015, 05:06 PM)
Oh, so that was the catch.
*
Edited my reply above. It's highly unlikely you will get it.
Also, seriously, Adaptive-Sync's benefit to personal performance is only in FPS games.

So I think you don't need it. For the rest of the genres it's just fidelity and won't affect gameplay or user performance. Btw, that's my opinion, but as usual everybody has the right to do whatever they want.

This post has been edited by cstkl1: Mar 20 2015, 05:13 PM
cstkl1
post Mar 26 2015, 10:00 AM


QUOTE(ReverseDark @ Mar 26 2015, 09:29 AM)
Cyntrix and Heavy Arm have the Zotac Titan X at 3999, but I canceled mine due to Cyntrix's ego service nature.
*
??? You ordered via Cyntrix or Heavy Arm? AFAIK the issue is with Zotac and Nvidia over APAC allocation.

Personally I prefer Zotac, as I normally get high-ASIC cards from them. Asus has given me dud ASICs many times.
This post has been edited by cstkl1: Mar 26 2015, 10:03 AM
cstkl1
post Mar 26 2015, 10:19 AM


QUOTE(ReverseDark @ Mar 26 2015, 10:06 AM)
I ordered from Cyntrix. I don't mind waiting for the card since they gave me assurance they will absorb the 3% GST, and I felt good about it, but when I asked for a pre-order/deposit receipt they didn't seem to get what I was talking about lol, and gave me the quotation instead. What I don't like about them is that they keep boasting about how their Titan X is priced at 3999 and how expensive other brands are, and ask me to try my luck buying a Titan X at 3999.
*
Well, lol, the GST thing. Still not sure whether it's based on order/invoice date or customs arrival date.
I guess you were talking to Mr Loh.
Well, my experience with them is more about the quality of Zotac ASICs and fast RMA, so hence why I gave up my reserved Asus cards.
Too many times, dude, I got low ASICs with Asus. I need these cards to at least do 1400. Asus RMA has now gone down the drain.

So hence my decision to wait it out with Zotac. Btw, I paid a deposit.

This post has been edited by cstkl1: Mar 26 2015, 10:20 AM
cstkl1
post Mar 26 2015, 10:57 AM


QUOTE(ReverseDark @ Mar 26 2015, 10:33 AM)
I guess it is based on the invoice issue date; my car insurance got charged GST even though I paid this month.
Having been through a few generations of graphics cards, I've seen fairly good benchmark results from Zotac, so I decided to jump ship from Asus and MSI.
Too bad Cyntrix is wasting my time on this. I guess I could wait for the GM200 cut, for real this time!? Even if I'm really getting this Titan X, I might as well get it from Heavy Arm for just an extra 3% = 120.

Been through an Asus laptop RMA before, HELLHOLEEEE.
*
Not Cyntrix, bro. Zotac HK.
All the AIBs only got their cards last Friday/Monday from Nvidia.
The distros all got theirs this week, except Zotac.
APAC sales were only approved yesterday.

So it should be soon.

Anyway, there's still until next Wednesday. MSI is a no-show this time; that's the second brand I like atm.

This post has been edited by cstkl1: Mar 26 2015, 10:58 AM
cstkl1
post Mar 26 2015, 11:33 AM


QUOTE(ReverseDark @ Mar 26 2015, 11:28 AM)
I see, I see. I don't blame Cyntrix for the card allocation, but I blame their service nature instead.

MSI GTX Titan X Lightning??
*
Doubt it. Nvidia controls the Titans, this time even more. All Titans come from them.

Bro, Cyntrix takes CS very seriously. Hmm, you should write to them.

Yeah, but this time Zotac HQ was the one delaying, AFAIK because of the APAC thing. We should see their cards soon.

I have had more early-adopter regret than regret from buying later, so I'm not gonna fall for that again.

So in my case I'm waiting it out for the two units.

On another note, Acer has started selling their IPS 1440p 144Hz monitor worldwide.

This post has been edited by cstkl1: Mar 26 2015, 11:33 AM
cstkl1
post Mar 27 2015, 09:28 AM


QUOTE(ReverseDark @ Mar 27 2015, 08:32 AM)
I will wait; getting a Titan X means I need to change my monitor as well.
*
Viewnet has one unit right now, the last Asus in LYP. Go get it, since your order was canceled and your deposit refunded.

This post has been edited by cstkl1: Mar 27 2015, 09:28 AM
cstkl1
post Mar 27 2015, 10:14 AM


QUOTE(ReverseDark @ Mar 27 2015, 09:48 AM)
You mean the Titan X? Both Asus? Still waiting for the bank cut-off time to reconfirm the refund with Cyntrix.
*
Yup.

If you MEPS it before 12pm you get it between 4pm and 8pm.
cstkl1
post Apr 1 2015, 04:54 PM


Lol. Everything I predicted about FreeSync was true.

AMD lied and is now blaming scaler/panel limitations.

As I said, Nvidia said G-Sync wasn't easy. They just took a backseat as they knew AMD would fail.
cstkl1
post Apr 2 2015, 03:05 PM


QUOTE(ruffstuff @ Apr 2 2015, 12:20 PM)
If you read through the Adaptive-Sync white paper, they have a similar approach to G-Sync when frames drop below the panel's VRR window.
I believe FreeSync is only part of the Adaptive-Sync specification, not a full implementation of the spec. I think they don't want to force panel manufacturers to produce a new scaler with a frame buffer, which would eventually narrow the cost of FreeSync closer to G-Sync.
*
Dude, no, no, no. To make this simple for you:

Adaptive-Sync is just variable refresh rate, which is what the scaler companies implemented within the VESA spec extension. That's all: just a range for VRR. Going below or above that range has nothing to do with the spec, as that's a gaming issue. It's the same for static-refresh monitors.

The implementation of it for use with AMD GPUs is FreeSync.
The current issue is the implementation for use with gaming. Adding a frame buffer and tuning each panel's voltages is not in the VESA extension spec; it's a FreeSync issue. AMD is not involved at the hardware level of the scaler/monitor, aka zero $$$ as usual. The scaler manufacturers just did it for their gaming-series monitors as it was within the eDP spec. They are not going to get involved with additional hardware for gaming. AMD has to do it via drivers or start making their own module.

G-Sync is the combination of both at the hardware level, embedded into the panel.

AMD never realized why Nvidia went this route. Now they know. Of course Nvidia knew from day one of the eDP spec that it wasn't suitable for gaming with driver intervention alone. Dude, they have more people working on this than AMD R&D, and most probably explored this route thoroughly. Haven't you realized the Swift was the first G-Sync monitor for a reason? Their certification process for a monitor is very stringent. AMD, however, is fine as long as the monitor is compliant with the eDP spec.

Did you know it's very rare for Nvidia to talk about AMD? This time they did. And here's me laughing: I told you so. AMD in the end blamed the scaler. Then Nvidia explained why G-Sync is different from FreeSync below the VRR window. AMD clearly didn't put enough manpower into this to realize the flaws for gaming.

Hence why Nvidia from day one labeled G-Sync monitors as gaming monitors. ULMB is an Nvidia feature as well. Now back to laughing at AMD.

Btw, hmm, Nvidia is up to something with the Titan X. It seems nerfed in SLI in a lot of games. A lot of X99 users are saying X79 had better scaling at 3-4 way. The issue started with the 980 and it seems the Titan X has inherited the same trend...

Nerfed in prep for the 390X??

Current flaws of G-Sync: with SLI, DSR is disabled. ULMB won't work with G-Sync. And the flickering thing on spikes from zero and back up again, though I hear all VRR monitors suffer from this.

This post has been edited by cstkl1: Apr 2 2015, 03:45 PM
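A rough sketch of the below-window difference being argued in the post above, based on my reading of the public explanations at the time (not vendor code, and the window numbers are placeholders): a module with its own frame buffer can redisplay the last frame a whole number of times so the panel never has to refresh slower than its minimum.

```python
def refreshes_for_frame(fps, vrr_min=30, vrr_max=144):
    """How many panel refreshes one game frame gets, and the resulting rate (toy model)."""
    if fps >= vrr_min:
        return 1, min(fps, vrr_max)      # in or above the window: one refresh per frame
    repeats = -(-vrr_min // fps)         # ceil(vrr_min / fps): redisplay the frame
    return repeats, fps * repeats        # per-refresh rate lands back inside the window

for fps in (10, 20, 25, 45, 120):
    n, hz = refreshes_for_frame(fps)
    print(f"{fps:3d} fps -> {n} refresh(es) per frame, panel at {hz} Hz")
```

Without that buffer, a driver-only approach has to fall back to plain vsync (or tearing) once the game drops under the window.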
