NVIDIA GeForce Community V14

Moogle Stiltzkin
post Feb 12 2015, 01:12 AM

QUOTE(SSJBen @ Feb 11 2015, 11:39 PM)
No, madVR settings do not automatically detect your hardware. It errs on the safe side of things, which means it uses random dithering by default, although I think in one of the recent versions that was changed to ordered dithering.
No, I don't use smooth motion anymore now that I have ReClock set the way I want it: 24fps for movies and most modern anime nowadays, then 60fps for 1080i content. Sometimes I turn smooth motion on just for the heck of it; it works nicely on some anime (usually those with a lot of screen panning), but most of the time it's just another terrible form of motion interpolation. Never use it for movies though!
*
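As a side note on the random vs. ordered dithering mentioned in that quote: both are ways of hiding the banding that appears when madVR's high-precision output is reduced to 8 bits. A rough toy sketch of the two ideas (not madVR's actual algorithm; the NumPy code below is purely illustrative):

```python
import numpy as np

# Toy illustration only: quantize a smooth 0..1 gradient to 8-bit output.
# Plain rounding produces visible banding; ordered dithering hides it behind
# a fixed repeating pattern; random dithering hides it behind noise/grain.

bayer_2x2 = np.array([[0.00, 0.50],
                      [0.75, 0.25]])  # small ordered (Bayer) threshold pattern

def ordered_dither(img, levels=256):
    h, w = img.shape
    pattern = np.tile(bayer_2x2, (h // 2 + 1, w // 2 + 1))[:h, :w]
    return np.floor(img * (levels - 1) + pattern).astype(np.uint8)

def random_dither(img, levels=256):
    noise = np.random.uniform(0.0, 1.0, img.shape)  # fresh noise every frame
    return np.floor(img * (levels - 1) + noise).astype(np.uint8)

gradient = np.tile(np.linspace(0.0, 1.0, 1024), (64, 1))
banded  = np.round(gradient * 255).astype(np.uint8)   # no dithering: banding
ordered = ordered_dither(gradient)                    # pattern instead of bands
random_ = random_dither(gradient)                     # grain instead of bands
```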
I feel that at the very least they should add more options in the KCP configuration that explain which settings to change depending on what kind of GPU you're using hmm.gif

Because right now I need to find a guide that is explicitly for KCP, so I know which settings to modify to get the best results.

Watching the anime, the quality is fine on the highest preset, but being a techie I'll of course nitpick to get the best quality settings smile.gif

Anyway... do you have a guide you can refer me to that tweaks KCP's highest preset for even better quality hmm.gif Is copying some of the imouto guide compatible with KCP's highest preset?



QUOTE(ngkhanmein @ Feb 12 2015, 12:01 AM)
I'm also using CCleaner & I don't have any special uninstaller to remove my Nvidia driver, except for AMD. Once a new driver is released, I just install on top of the previous driver & disable the Nvidia-related services except the NVIDIA Display Driver Service, then reboot.

Bro, I'm a novice at the madVR/MPC stuff. I haven't really cleaned the registry for MPC & madVR, that's why my settings are still there. I just throw it into the recycle bin, then use CCleaner.


I heard Geek Uninstaller (freeware) is good for removing leftovers when uninstalling:
http://www.geekuninstaller.com/

But I'm pretty content with Total Uninstaller (paid software), which I know 100% works notworthy.gif


QUOTE(ngkhanmein @ Feb 12 2015, 12:01 AM)
For chroma upscaling I use NNEDI3 32 neurons, image doubling is left unticked & the rest follows the picture shown (highest settings).

I disable smooth motion. Once I open any video, dropped frames sit around 2 (it depends); if I drag the seek bar the dropped frames definitely increase. I think that's normal.

Importantly, delayed frames & presentation glitches must be 0, as far as I know.

Do any videos you tried also show the vertical lines? Maybe you can try the nightly MPC-HC.
It's not really an MPC problem. It's definitely a driver issue, even Nvidia admitted it; check the link I posted earlier for that bug report if you haven't yet.


Hm... but I think having smooth motion enabled is better... then set it to 'only if there would be judder without it' as well.



I recall imouto's guide regarding luma says that high-res and low-res sources should use different settings, hence why they create profiles based on the resolution of the source being played, so the more appropriate settings are used accordingly. From what they explained, doing it that way you can get the best quality, rather than trying to have one single general setting, which doesn't work well when trying to balance between low- and high-resolution videos.
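Just to make the profile idea concrete, here's a rough sketch (in Python, since madVR's actual profile rule syntax is different) of how settings could be keyed off the source resolution; the scaler names are the ones discussed in this thread, but the function, thresholds and choices are purely my own illustration:

```python
def pick_madvr_settings(src_height, display_height=1080):
    """Illustrative only: use heavier scaling only where upscaling actually happens."""
    if src_height >= display_height:
        # 1080p source on a 1080p screen: luma isn't upscaled at all,
        # so spend the GPU budget on chroma upscaling instead.
        return {"chroma": "NNEDI3 32 neurons", "image doubling": "off"}
    elif src_height >= 720:
        return {"chroma": "NNEDI3 16 neurons", "image doubling": "off"}
    else:
        # SD sources get upscaled the most and benefit most from doubling,
        # as long as dropped/delayed frames stay at 0.
        return {"chroma": "NNEDI3 32 neurons", "image doubling": "NNEDI3"}

print(pick_madvr_settings(1080))  # Blu-ray rip on a 1080p screen
print(pick_madvr_settings(480))   # DVD-class source
```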

This post has been edited by Moogle Stiltzkin: Feb 12 2015, 01:14 AM
Moogle Stiltzkin
post Feb 12 2015, 01:17 AM

QUOTE(ngkhanmein @ Feb 12 2015, 12:30 AM)
I'm surprised they even introduced that card.

The GTX 960 seems more relevant if you want to consider a budget card. At the least it has the latest Nvidia GPU tech in it, albeit with the lowest fps compared to the 970 and 980.

So why on earth would someone get the 750? I don't get it rclxub.gif
Moogle Stiltzkin
post Feb 12 2015, 09:15 AM

QUOTE(marfccy @ Feb 12 2015, 03:41 AM)
All this talk about madVR, ReClock, MPC-HC... I'm using MPC-HC and KCP + SVP.

It stresses my card pretty darn hard too, 58°C on normal video playback sweat.gif

And the lack of guides makes it hard for me to tweak madVR settings.
*
thumbup.gif


I still don't understand why people swear by VLC when KCP has obviously become the best easy option sweat.gif Sure, you can still tweak it further, but even the default presets are quite good already.

This post has been edited by Moogle Stiltzkin: Feb 12 2015, 09:17 AM
Moogle Stiltzkin
post Feb 12 2015, 05:04 PM

By the way, does anyone know what Nvidia is going to announce at their event in a few months' time?

They say it's some product they have been developing for the past 5 years. Is it Pascal?
Moogle Stiltzkin
post Feb 12 2015, 08:08 PM

QUOTE(ngkhanmein @ Feb 12 2015, 05:36 PM)
Nvidia Shield with Tegra X1. Confirmed to be gaming stuff, like a tablet or console.
*
Eww... it took them 5 years for that? lelz doh.gif
Moogle Stiltzkin
post Feb 14 2015, 03:29 AM

QUOTE(SSJBen @ Feb 14 2015, 02:52 AM)
Well let's get back on topic. Been trying to mod my 970 BIOS the whole night, haven't really gotten it to where I want it to be.

Anyone else modded their vBIOS willing to share their results?
*
Huh? What exactly are you modding hmm.gif

Or did you mean overclocking using those vendor firmware tools, e.g. MSI, Asus, etc.?

This post has been edited by Moogle Stiltzkin: Feb 14 2015, 03:30 AM
Moogle Stiltzkin
post Feb 20 2015, 10:11 PM

I predict a big performance boost come either Pascal or Volta in 2016-2017, so I'm waiting on that ^-^;
Moogle Stiltzkin
post Mar 5 2015, 04:47 AM

How does the Nvidia Shield Android box work exactly?

I heard it's able to stream games from your PC to the big-screen TV in the living room (so you can play on the big TV), if you have the home networking for it.

But then where exactly are the graphics being rendered? Is it the desktop PC's GPU, or the GPU on the Shield box itself?




Also, which architecture is the Titan X based on? Maxwell or Pascal?

12GB of VRAM seems cool, but for the budget-minded 6-8GB would have worked just as well hmm.gif And it's going to get more expensive when switching from GDDR5 to HBM come Pascal.

This post has been edited by Moogle Stiltzkin: Mar 5 2015, 04:50 AM
Moogle Stiltzkin
post Mar 5 2015, 09:47 PM

QUOTE(cstkl1 @ Mar 5 2015, 09:22 PM)
Malaysia confirmed at a minimum of RM3999-4999. Hope there's a no-limit skyn3t BIOS. 1500MHz on this will be uber sweet, although overclocking gains on the GTX 980 only give a real-world performance boost of 8-10%, in comparison to Kepler where it's easy to get a 20% boost in fps.

Hope Asus finalizes the refresh rate on that 4K IPS G-Sync monitor. Hoping for 120Hz.

5960X silicon lottery: barely managed to get it to 4.8GHz.
*
Too expensive for Maxwell...

I think it's more worth waiting for Pascal or even Volta.
Moogle Stiltzkin
post Mar 5 2015, 09:54 PM

QUOTE(cstkl1 @ Mar 5 2015, 09:49 PM)
Dude, the waiting game gets longer if there is a hitch.

HBM is currently limited to 4GB and AFAIK comes only from Hynix.

Really want that 4K G-Sync IPS 120Hz experience.
*
4GB? Is that going to be enough icon_question.gif ??
Moogle Stiltzkin
post Mar 6 2015, 10:45 AM

QUOTE(cstkl1 @ Mar 5 2015, 10:28 PM)
Meaning that 6GB, 8GB and 12GB VRAM densities... are not ready yet.
That's what Pascal is: HBM with unified memory (wondering now how DX12 and this tech will play along).
*
Will 6GB HBM be ready for Volta, if not Pascal? hmm.gif

But with unified memory, at most it will tap into DDR3 or DDR4; is that going to be enough for a GPU workload? How exactly does that work hmm.gif
Moogle Stiltzkin
post Mar 7 2015, 08:09 AM

QUOTE(cstkl1 @ Mar 6 2015, 11:06 AM)
Yeah, and with HBM's high speeds, unified memory will result in the same effect you're going to see on the 970's 512MB vs its 3.5GB.
*
Ooo, that sounds like unified memory isn't as great as I thought it would be.

Unless it would work better once PCs start using HBM as well? Or will that not make any difference?
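Rough back-of-the-envelope on why that 970-style split (or spilling from fast HBM into system RAM over PCIe) hurts so much: the moment a chunk of accesses lands in the slow pool, the bytes-weighted average bandwidth collapses. The numbers below are my own illustrative assumptions (roughly 1000 GB/s for an HBM-class pool, ~16 GB/s for PCIe 3.0 x16), not measured figures:

```python
def effective_bandwidth(fast_gbs, slow_gbs, fraction_in_slow):
    """Bytes-weighted harmonic mean of two memory pools' bandwidths."""
    fast_fraction = 1.0 - fraction_in_slow
    return 1.0 / (fast_fraction / fast_gbs + fraction_in_slow / slow_gbs)

print(effective_bandwidth(1000.0, 16.0, 0.00))  # 1000 GB/s: nothing spills
print(effective_bandwidth(1000.0, 16.0, 0.05))  # ~245 GB/s: just 5% spills
print(effective_bandwidth(1000.0, 16.0, 0.20))  # ~75 GB/s: 20% spills
```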
Moogle Stiltzkin
post Mar 8 2015, 03:47 PM

QUOTE(cstkl1 @ Mar 7 2015, 11:25 AM)
Not possible, as our IMC is in the CPU now.
HBM has shortened the trace lines between the IMC and the RAM stack, AFAIK.
*
Any ideas how DirectX 12 multi-GPU using different brands, e.g. Nvidia and AMD, is going to play out?

Now I think OpenGL, Mantle or both are also saying they can do just as much.

Does that mean those of us not wanting to spend much on a G-Sync monitor, but still wanting to use an Nvidia card, will end up using Nvidia + ATI in some sort of multi-GPU setup?

But it sounds as if the ATI card would then have to be the primary if you want to use FreeSync, I suspect hmm.gif


Seems like things are going to get pretty complicated rclxub.gif
Moogle Stiltzkin
post Mar 18 2015, 08:55 AM

As I suspected... huge gains from moving up from the Maxwell architecture to Pascal:
http://www.extremetech.com/gaming/201417-n...al-architecture


QUOTE
Pascal’s next improvement will be its use of HBM, or High Bandwidth Memory. Nvidia is claiming it will offer up to 32GB of RAM per GPU at 3x the memory bandwidth. That would put Pascal at close to 1TB of theoretical bandwidth depending on RAM clock — a huge leap forward for all GPUs.
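Quick sanity check on that "close to 1 TB/s" figure, assuming the baseline being tripled is a GDDR5 card like the Titan X (384-bit bus at 7 Gbps effective per pin):

```python
bus_width_bits = 384
effective_gbps = 7                                  # GDDR5 effective data rate per pin
baseline = bus_width_bits / 8 * effective_gbps      # = 336 GB/s
pascal_claim = baseline * 3                         # = 1008 GB/s, i.e. roughly 1 TB/s
print(baseline, pascal_claim)
```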



And I predict the move from Pascal to Volta the year after that (1 year, not 2... if the roadmap hasn't changed since I last saw it) is going to double the HBM performance.

*Update: seems they pushed Volta to 2018, hm doh.gif

[image: Nvidia GPU roadmap slide]

This post has been edited by Moogle Stiltzkin: Mar 18 2015, 08:59 AM
Moogle Stiltzkin
post Mar 18 2015, 09:01 AM

QUOTE(cstkl1 @ Mar 18 2015, 08:59 AM)
Watch the vid, dude. It's not gaming performance.
That slide is about NVLink, HBM, etc., so it includes multi-GPU scalability and so on.
*
Still reading, but this is what I felt would be a pretty big deal for GPUs:

QUOTE
Pascal’s next improvement will be its use of HBM, or High Bandwidth Memory. Nvidia is claiming it will offer up to 32GB of RAM per GPU at 3x the memory bandwidth. That would put Pascal at close to 1TB of theoretical bandwidth depending on RAM clock — a huge leap forward for all GPUs.


But HBM is not cheap, so how much will these new GPUs cost hmm.gif


HMC, which is even better than that, also costs even more than HBM. But Nvidia had to cancel their plans to use it for Pascal because it wasn't ready yet.


http://www.extremetech.com/computing/19772...rid-memory-cube



This post has been edited by Moogle Stiltzkin: Mar 18 2015, 09:03 AM
Moogle Stiltzkin
post Mar 18 2015, 09:06 AM

QUOTE(cstkl1 @ Mar 18 2015, 09:05 AM)
Nah, forget the reading. Jen was talking about deep learning.
There was no talk about game performance, although I did take a 30-minute nap between that and the self-driving car thingy.

The slide was about the DIGITS package and how the current limitation is 4 cards, but with NVLink it will be 8.
Watch the launch vid.
*
Okay, after my Strife game; it just started.

So what do you predict will be in Volta?

I thought it was 1 year after Pascal, but now it seems like 2? I don't think I can wait that long hmm.gif

PS: I signed up for the competition for the Nvidia card; hopefully I win so I get the upgrade sooner laugh.gif


PS: Also, from what I hear, come Pascal, multi-GPU VRAM will stack across cards, say 4+4 = 8GB total, rather than just the 4GB like before. Sounds good, but since I'm using a 1080p 24'' screen, I think I can get by on a single high-end Pascal GPU and still get ultra settings.

This post has been edited by Moogle Stiltzkin: Mar 18 2015, 09:11 AM
Moogle Stiltzkin
post Mar 18 2015, 10:06 AM

QUOTE(cstkl1 @ Mar 18 2015, 09:13 AM)
Dude, HBM etc. might even be scrapped in the future.
DDR4 is immature. Lots of paths.

Currently, for sure, the Titan X has bandwidth issues even challenging the 290X/295X2 at 4K. The 390X from the ground up is targeting 4K. I see a smackdown on the Titan at 4K. But a WCE 8GB edition?? Heat issues??

Meaning at 4K texture streaming there's a bottleneck, and having that 12GB of VRAM is going to be useless. See how it scales from 1440p to 4K vs the 980; the margin sometimes shrinks a lot. Vs the 290X2 it's way worse.

Same issue as having a 960 with 8GB.

If you ask me, the Titan X is a single card for 1440p that does OK at 4K. The 390X, if the rumors are true, is the real 4K card.
*
Uh, I thought the talk was that DDR4 was nearing its max, so they were instead pursuing other options like

Wide I/O, HBM and HMC

hmm.gif


Spotted the Titan X review:
http://www.extremetech.com/extreme/201346-...ngle-gpu-market


But it seems like it's Maxwell architecture. I'd rather spend money on the newer Pascal or Volta, which should cost far less doh.gif But then again I don't use a 4K resolution or a screen bigger than 24''.

QUOTE
The Titan X is based on Nvidia’s GM200 processor and ships with all 24 of its SMMs enabled (that’s the current term for Nvidia’s compute units). The chip has 3072 CUDA cores, and a whopping 12GB of GDDR5 memory.  To those of you concerned about a GTX 970-style problem, rest assured: There are no bifurcated memory issues here.


QUOTE
The end result of all of this pixel-pushing power is that the Titan X is meant to push 4K more effectively than any single GPU before it. Whether that’s “enough” for 4K will depend, to some extent, on what kind of image quality you consider acceptable.

12GB is sweet, but with HBM on the horizon, uh... hmm.gif Not to mention maybe 8GB would have been preferable and cheaper too.


But what the Titan X does offer is probably the first 4K-capable single GPU for people who need it right now nod.gif

This post has been edited by Moogle Stiltzkin: Mar 18 2015, 10:17 AM
Moogle Stiltzkin
post Mar 20 2015, 04:30 AM

QUOTE(cstkl1 @ Mar 18 2015, 11:06 AM)
Don't see the Titan X as a 4K GPU. If it was, Jen would have been proud to announce it. He knows there's a bandwidth issue with Maxwell on texture streaming. They slapped on the 12GB just to keep the $999 price tag. It's a useless amount, since the texture streaming seems to overwhelm the bandwidth, from what I saw at 4K.

In fact, if you look at it, the whole launch is about the middleware Nvidia is releasing for deep learning.

It's just a bigger Maxwell. Nothing new like the OG Titan.

AMD is totally gunning for 4K. I have to side with the red camp. Expecting an Nvidia smackdown derby.
*
From what I see in the benchmarks it's a 4K GPU, albeit at lower quality settings. It seems that with high settings the fps comes in under 30. Personally, I feel 45-60 is what I'm willing to accept for gaming hmm.gif More so if it's an FPS-type game.

I haven't really seen what the red camp is coming out with. I heard they will be the first to launch an HBM GPU (because they made the decision to go with HBM before Nvidia did, after realizing HMC was taking too long).

By the way, FreeSync monitors are shipping now doh.gif

According to this illustration, FreeSync looks better than no FreeSync at all:

[image: without FreeSync]

[image: with FreeSync enabled]


http://www.extremetech.com/gaming/201568-a...inally-shipping


Sounds like FreeSync prevents the stuttering that comes from using regular old v-sync, but I wonder what the FPS impact will be? G-Sync promised the benefits of v-sync without any FPS performance impact. Is that the case for FreeSync as well?

Another question also pops up: can you use FreeSync with an Nvidia GPU? Will Nvidia force users to get G-Sync monitors to benefit, or will they allow the Nvidia camp to use FreeSync as well? FreeSync monitors are $50 cheaper than G-Sync monitors.


This post has been edited by Moogle Stiltzkin: Mar 20 2015, 04:37 AM
Moogle Stiltzkin
post Mar 20 2015, 01:17 PM

QUOTE(cstkl1 @ Mar 20 2015, 10:32 AM)
You cannot use Adaptive-Sync, as it's an extension to DP 1.2 which Nvidia won't support unless it becomes a mandatory spec in 1.3.


I understand it's optional, but Nvidia will earn their fanbase's ire if they purposely force users to use G-Sync as their only option.

Fans may have ignored the lack of Mantle because there is DirectX 12, but I don't think they will as easily ignore a lack of FreeSync support, not when a G-Sync monitor costs $50 more.



QUOTE(cstkl1 @ Mar 20 2015, 10:32 AM)
Can't wait for my cards to test Watch Dogs with MSAA 8x supersampling, with SweetFX and mods.
What do you think about MFAA x2 or x4? I heard it's equivalent in quality to MSAA x2 and x4 but with less of an FPS performance cost.
http://www.extremetech.com/gaming/194629-a...-frame-rate-hit

This post has been edited by Moogle Stiltzkin: Mar 20 2015, 01:20 PM
Moogle Stiltzkin
post Mar 20 2015, 04:06 PM

Look at all my stars!!
*******
Senior Member
4,454 posts

Joined: Jan 2003
This sums up FreeSync, I think:
[image: FreeSync behavior summary]

FreeSync + v-sync: FPS is capped at the refresh rate your monitor supports, e.g. a 60Hz monitor will cap at 60fps.

FreeSync + v-sync disabled: no cap, so fps can go much higher, but there is a possibility of screen tearing.
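A rough sketch of how I understand the behaviour, putting the v-sync on/off cases and the variable refresh window together (the 48-75 Hz window is the one from the LG panel in the review quoted further down; the function itself is just my own simplification):

```python
def display_behaviour(fps, vrr_min, vrr_max, vsync_outside_range):
    """Simplified model of a FreeSync panel at a given game frame rate."""
    if vrr_min <= fps <= vrr_max:
        return "variable refresh: panel follows the game, no tearing, no cap"
    if fps > vrr_max:
        if vsync_outside_range:
            return f"v-sync cap at {vrr_max} fps"
        return "uncapped above the window: lowest lag, tearing possible"
    if vsync_outside_range:
        return "below the window: v-sync stutter kicks back in"
    return "below the window: tearing comes back"

print(display_behaviour(100, 48, 75, vsync_outside_range=True))   # capped at 75
print(display_behaviour(60, 48, 75, vsync_outside_range=False))   # smooth VRR
print(display_behaviour(40, 48, 75, vsync_outside_range=False))   # falls out of the window
```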


So the way I see it, if you have a 144Hz monitor, you can have v-sync on and it would only cap at 144fps, rather than 60 if you had gotten a 60Hz monitor.

But then you'd be getting a TN-panel monitor rather than, say, an AH-IPS. So unless you take your gaming seriously, not everyone is going to be willing to make that trade-off.



Overall, FreeSync personally seems like a success to me: v-sync without stutter or any performance hit (the exception being that your monitor's refresh rate is now the bottleneck, which is acceptable, because you can buy a 144Hz or 60Hz monitor according to your preferences).

This is how FreeSync works on a 144Hz monitor with v-sync on/off:
[chart: FreeSync on a 144Hz monitor, v-sync on vs off]

Seems like leaving v-sync on makes sense, because it has zero tearing and the performance difference is negligible.


You can read the full review of FreeSync here:
http://www.techspot.com/review/978-amd-freesync/



QUOTE
It's also important to note that as soon as in-game frame rates dip below the minimum refresh rate, the smoothness immediately vanishes. This is understandable, and is the case for both G-Sync and FreeSync displays.


QUOTE
This brings me to my one main complaint about FreeSync, which is really a complaint directed towards FreeSync display manufacturers: a 48 Hz minimum refresh rate, which is the case for LG's FreeSync monitors like the one I used for this review, is too high to get the full benefits out of technology. The minimum refresh rate should be 40 Hz at the very highest, and ideally 30 Hz to match G-Sync.
Again, this is not a downside with FreeSync itself: the specification allows refresh rates of 9 to 240 Hz. But when you have a monitor that supports just 48-75 Hz, you're cutting out nearly half of that ideal variable refresh zone (40-60 Hz), which leads to a noticeable transition from smooth gameplay to stuttering gameplay at the variable refresh boundary of 48 Hz. For the best experience, you want this transition to occur gradually, which is what you get from a lower minimum refresh.


QUOTE
That said, I should make it clear that even at 48-75 Hz, FreeSync does deliver a noticeably smoother, better gaming experience. Opting for a monitor with a 40 Hz minimum delivers an even better experience, while shifting down to 30 Hz with G-Sync sees small gains. As far as this technology is concerned, the larger the refresh range, the better.


QUOTE
But FreeSync provides an extra collection of features that G-Sync doesn't, most notably the ability to choose to have v-sync enabled or disabled for refresh rates outside the variable range.

This is especially handy for situations where frame rates are below the minimum refresh rate: G-Sync's choice to force v-sync on introduces stutter and an extra performance hit below the minimum refresh, which can be resolved on FreeSync by disabling v-sync.


QUOTE
But the main advantage, at least right now, concerns price. As FreeSync is a VESA standard, and doesn't require a proprietary chip like G-Sync does, FreeSync monitors are simply cheaper. Both of the 27" 1440p 144 Hz FreeSync monitors - the $499 Acer XG270HU and the $599 BenQ XL2730Z - are cheaper than the one G-Sync equivalent, Asus' $779 ROG PG278Q. That's a huge saving of $180-280.


hmm.gif So I guess when buying a FreeSync monitor, you'd be looking not only at the maximum refresh rate, but also the minimum.



So can anyone convince me that G-Sync is better than FreeSync in terms of pricing, performance and features? Because I can't think of a reason rclxub.gif




The way this is headed, it seems to me that Mantle, which AMD came out with first, will be put to rest once DX12 and possibly even the new OpenGL come out.

Also, G-Sync will soon die, opening the way for FreeSync (because I can't imagine anyone wanting to pay $180-$280 more for something that works no better......).


So the question remains whether Nvidia is going to force their userbase to support G-Sync by not allowing FreeSync to work on Nvidia cards, despite there being no additional cost to adding support (other than maybe making G-Sync redundant.....). It doesn't seem to be in the consumer's interest if they try to sideline FreeSync >:{

QUOTE
In an interview by PC Perspective (Via Sweclockers.com) Nvidia employees reveal that green has absolutely no plans to support FreeSync. This statement should put to rest all rumors of whether Nvidia would support DisplayPort Adaptive Sync regardless of the fact that it is a direct competitor of the G-Sync proprietary standard.

http://wccftech.com/nvidia-plans-support-f...-adaptive-sync/



This post has been edited by Moogle Stiltzkin: Mar 20 2015, 04:39 PM
