 NVIDIA GeForce Community V14

Moogle Stiltzkin
post Feb 5 2015, 06:38 PM

Does anyone know whether it's better to get Pascal in 2016, or is it worth waiting for Volta in 2017?

Moogle Stiltzkin
post Feb 5 2015, 06:41 PM

QUOTE(joellim @ Feb 5 2015, 11:26 AM)
The gigabyte offer is like you are already screwed by nvidia. Then we turn around and say, please screw me again, i wasnt screwed enough the first time. Pay once to nvidia not enough, now ask me to pay more again!
*
Well, you're not getting screwed by getting a GTX 980, but it does reward Nvidia by not costing them any business, even though they deserve to lose it after the 970 scam (they sold a GPU claiming it had 4GB of VRAM when it was effectively 3.5GB all along).
Moogle Stiltzkin
post Feb 5 2015, 11:24 PM

QUOTE(skylinelover @ Feb 5 2015, 08:16 PM)
pascal, definitely
*
Ah, thanks.

Well, on my 2012 Kepler GTX 680, the latest game I played, Dragon Age: Inquisition at ultra settings, was unplayable: 20-30 fps.

That's on a 60Hz 1920x1200 24'' AH-IPS LCD monitor, in an Ivy Bridge quad-core i7 gaming rig with 16GB of DDR3 Corsair RAM (the CPU and GPU are both water cooled, with a radiator and at least 8 fans in the rig).

This is a bummer for me, because 20-30 fps at ultra in a current game is rough for a gamer, and I'm not even using a ridiculously large screen or anything above what's considered a standard resolution today. That's why I urgently need to upgrade within the next 1-2 years.

Moogle Stiltzkin
post Feb 8 2015, 03:03 AM

QUOTE(skylinelover @ Feb 6 2015, 10:35 PM)
lol, now I can start sacrificing and play on medium, because now that I'm working I can't afford to buy a new GPU every 2 years anymore. Going from a 2-year cycle to 4 years... other commitments to bear are the main reason. So my target is Volta in 2 years' time.
*
Exactly. Besides, if we look at the performance charts, each new generation (the usual two-year product update) typically brings only a roughly 15% performance increase.

Only some products have made a big leap over the immediately preceding generation, usually when there's a new architecture.
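Just to put that rough 15% figure in perspective, here is a tiny sketch (my own numbers, purely illustrative) of how the gains compound if you skip one or two product cycles:
CODE
# Rough illustration of the ~15% per-generation figure mentioned above:
# how the uplift compounds if you skip one or more product cycles.
per_gen_gain = 0.15   # assumed average generational uplift

for generations in (1, 2, 3):
    uplift = (1 + per_gen_gain) ** generations - 1
    print(f"after {generations} generation(s): ~{uplift:.0%} faster than today")
# prints roughly 15%, 32%, 52%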


But it's not only the fps increase to look at; there are other feature sets worth noting:

- DirectX 12 (though it's unclear how soon newer games will start using it)
- G-Sync support (monitors with the G-Sync module are still too expensive; the only alternative is FreeSync on AMD cards, but I don't think that tech is as good as G-Sync)
- HBM (AMD will get this roughly a year ahead of Nvidia, because Nvidia's HMC plan for Volta didn't pan out and they switched to HBM as well. This tech vastly increases memory bandwidth, which has long been a bottleneck on GPUs. It's presumed Volta will use second-generation HBM with even higher bandwidth, possibly double, if the rumors are to be believed.)
- unified memory (sounds good; could be a game changer)
- graphics technologies that improve visuals (PhysX and GameWorks on Nvidia)


Another interesting development: they're now saying Pascal and future GPUs could pool VRAM across an SLI configuration. Previously VRAM in SLI wasn't a single pool, but from Pascal onwards it supposedly will be, with some caveats; games apparently need to be coded to support it.

Moogle Stiltzkin
post Feb 9 2015, 11:32 AM

QUOTE(marfccy @ Feb 8 2015, 03:34 AM)
tbh both are similar; they're both a form of adaptive V-Sync. It's just that FreeSync leaves the GPU to do the processing, whereas Nvidia's G-Sync module is a separate unit in the monitor.

Probably explains why Nvidia charges extra for the module
*
In terms of actual usage, from what I've read:

FreeSync

pros:
uses the VESA Adaptive-Sync standard
presumably cheaper than G-Sync because no additional module is required

cons:
this implementation may still have tearing


A more accurate description of the issue I dislike about the FreeSync version:
QUOTE
For FreeSync (and the Adaptive Sync standard as a whole), when a game renders at a frame rate above or below this VRR window, the V-Sync setting is enforced. That means on a 60 Hz panel, if your game runs at 70 FPS, then you will have the option to enable or disable V-Sync; you can either force a 60 FPS top limit or allow 70 FPS with screen tearing. If your game runs under the 40 Hz bottom limit, say at 30 FPS, you get the same option: V-Sync on or V-Sync off. With it off, you get tearing but optimal input/display latency; with it on, you reintroduce frame judder when you cross between V-Sync steps.
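To make that behaviour easier to follow, here is a small sketch of the decision logic as I understand it from the quote (the 40-60 Hz window and the function are my own assumptions, not anything from AMD):
CODE
# Sketch of the Adaptive-Sync behaviour described in the quote above,
# assuming a 60 Hz panel with a 40-60 Hz VRR window.

def freesync_behaviour(fps, vrr_min=40, vrr_max=60, vsync_on=True):
    """Describe what the display does for a given game frame rate."""
    if vrr_min <= fps <= vrr_max:
        return "variable refresh: panel follows the GPU, no tearing"
    if vsync_on:
        # outside the window the plain V-Sync rules apply again
        return "V-Sync enforced: no tearing, but capping/judder possible"
    return "V-Sync off: lowest latency, but tearing comes back"

for fps in (30, 45, 70):
    print(f"{fps} fps -> {freesync_behaviour(fps)}")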




G-Sync

pros:
no tearing

cons:
high cost, due to the G-Sync module added to G-Sync monitors. This may not apply to laptops, which rumor says use a variant closer to the FreeSync method.


A more detailed description of why G-Sync is possibly the superior option:
QUOTE
And you'd be wrong, G-Sync does handle this better because at no point do you need to enable V-Sync to avoid tearing above max refresh, thus undoing much of the benefit of this tech to begin with.

At higher than monitor refresh rates, the monitor continues to update at its max refresh rate with no input lag, no tearing because the G-Sync module with the onboard DRAM (y'know, the same magic stuffs AMD and their fanboys thought was extraneous and unnecessary) actively compares, holds, and renders each frame using it as a lookaside buffer.

So, any frames above what the monitor can refresh, the G-Sync module holds, compares, and chooses to display new, hold or throws out.

I guess all that pricey, proprietary hardware was justified after all!

But yes it is nice to see AMD finally show us something with FreeSync beyond slidedecks, fixed refresh windmills and empty promises, but now they have shown it, they have also confirmed what many of us expected:

G-Sync isn't going anywhere, it's better than FreeSync, and whatever premium Nvidia is charging will be justified.


QUOTE
G-Sync above max refresh has the option to hold a frame and wait for the next frame after monitor refresh, effectively reducing the actual frame rate output of the monitor, because the G-Sync module is dynamically changing and controlling the monitors refresh rate. Unlike typical refresh and V-sync, where the GPU is still slave to the monitor's refresh and the next frame must be rendered, regardless of how old it is, based on the timing of the monitor's refresh.

So in a worst case scenario, with uneven frametimes, you might see 6-7ms of input lag/latency on a 120Hz monitor (8.33ms between frame refresh).
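The arithmetic behind that worst-case figure is simple enough; a quick sketch (my own, just restating the quoted numbers) of the refresh interval at a few common refresh rates:
CODE
# Refresh interval per panel refresh rate; holding a frame until the next
# refresh can add at most about one of these intervals of latency.
for hz in (60, 120, 144):
    interval_ms = 1000 / hz
    print(f"{hz} Hz panel: {interval_ms:.2f} ms between refreshes")
# 120 Hz -> 8.33 ms, which is why the quoted worst case is 6-7 ms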


QUOTE
the onboard DRAM on the G-Sync PCB acts as a lookaside buffer that allows the G-Sync module to compare, hold, and send frames to display. 768MB is capable of storing a whole lot of 4-20MB frames.

All that awesome knowledge Tom Petersen dumped on us in interviews on this very site starting to pay off!

Guess that G-Sync module and DRAM wasn't extraneous and unnecessary after all!
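Out of curiosity I did the back-of-envelope math on those frame sizes (my own assumption of 4 bytes per pixel, uncompressed):
CODE
# Uncompressed RGBA8 frame sizes at common resolutions, and how many of
# them would fit in the G-Sync module's 768 MB of DRAM.
def frame_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 ** 2)

dram_mb = 768
for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "2160p": (3840, 2160)}.items():
    size = frame_mb(w, h)
    print(f"{name}: {size:.1f} MB per frame, ~{int(dram_mb // size)} frames in {dram_mb} MB")
# 1080p ~8 MB and 1440p ~14 MB line up with the quoted 4-20 MB range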



QUOTE
There are several PROBLEMS which G-Sync fixes.

Screen tearing - happens if you can't synch monitor and GPU (because normal monitors update at a FIXED interval)

LAG - VSYNC is used to fix screen tearing but this causes lag because there's a delay caused by buffering the GPU output to match the monitors update cycle

STUTTER - happens if you have VSYNC on but are outputting below the refresh rate

*So you can fix screen tearing by enabling VSYNC but then get lag (or lag and stutter) or disable VSYNC but get screen tearing.

G-SYNC fixes all these issues AT THE SAME TIME. The only real issue is staying above 30FPS which isn't a huge deal and even that will be fixed with newer PANELS (a panel limitation, not a G-Sync limitation).

**Above max?
I believe this is where G-SYNC is superior but it's hard to confirm. It's my understanding that G-Sync's added fast memory as a "lookaside" buffer allows G-Sync to stay enabled when the GPU can output more than the monitor's maximum.

Thus the monitor still updates ONLY as directed by the GPU software which keeps things SMOOTH. So you can basically stay capped at 60FPS on a 60Hz monitor this way.

FREE-SYNC however, as a limitation of the monitor hardware (no proprietary hardware with a lookaside buffer), seems forced to DISABLE the asynchronous method and go back to the normal fixed update by the monitor.

This is going to be a HUGE ISSUE for Free-Sync. So you have to stay in the RANGE to make it work (i.e. above 40FPS or below 60FPS). Not as big a deal for 30FPS up to 144FPS.

So basically G-SYNC seems to work almost perfectly and Free-Sync is problematic especially on 60Hz monitors.

(Worse is playing a game with 60FPS average on a 60Hz monitor. You'd keep going above and below the 60FPS mark meaning FREE-SYNC would turn on and off.)

Don't flame me if this is incorrect, but I've done a lot of reading and I think it's correct.

(Also not sure how AMD can fix this without introducing a proprietary scaler with lookaside buffer like NVidia's solution.)





So overall it sounds to me like G-Sync costs more but gets it done right. What's the point of high frame rates if you're going to have tearing? Tearing is very noticeable to me, and it's the main reason I currently use V-Sync with triple buffering.

I just don't think the FreeSync camp is making enough valid arguments to prove that FreeSync is better than G-Sync (other than perhaps the lower cost).



The rest of the ongoing FreeSync vs G-Sync debate:
http://www.pcper.com/news/Graphics-Cards/C...eeSync-Monitors

Moogle Stiltzkin
post Feb 9 2015, 11:45 AM

QUOTE(Acid_RuleZz @ Feb 9 2015, 11:22 AM)
Why do you want to sell, bro? Are you affected by the problem? What games do you play that use more than 3.5GB?
*
To name a few...

Wolfenstein: The New Order (especially this game; low VRAM at high settings causes texture pop-in, a very obvious and annoying effect where textures load in blurred and then suddenly pop into focus. It becomes disorienting.)

Shadow of Mordor (if you think 4GB of VRAM is enough, this game asks for 6GB of VRAM to max out its ultra settings:
QUOTE
If you want to play Shadow of Mordor on ultra then you’re going to need a pretty high end card as apparently there is a 6GB VRAM requirement, meaning that users may need a very specific GPU in order to play at full detail settings. To make things worse, that’s the recommended requirement for 1080p, rather than 4K, a resolution that you would expect to eat up that much VRAM.

Flagship cards from both Nvidia and AMD currently only ship with 4GB of VRAM, with the exception of the GTX Titans and some specific custom made GTX 980 and AMD R9 290x cards.

http://www.kitguru.net/gaming/matthew-wils...ultra-settings/ )



Why players should be upset at having 3.5GB rather than the promised 4GB of VRAM:
http://www.guru3d.com/news-story/middle-ea...ess-test,7.html

QUOTE
Conclusion

Our product reviews in the past few months and their conclusions are not any different from everything that has happened in the past few days; the product still performs similar to what we have shown you as hey .. it is in fact the same product. The clusterfuck that Nvidia dropped here is simple, they have not informed the media or their customers about the memory partitioning and the challenges they face. Overall you will have a hard time pushing any card over 3.5 GB of graphics memory usage with any game unless you do some freaky stuff. The ones that do pass 3.5 GB mostly are poor console ports or situations where you game in Ultra HD or DSR Ultra HD rendering. In that situation I cannot guarantee that your overall experience will be trouble free, however we have a hard time detecting and replicating the stuttering issues some people have mentioned.

The Bottom line

Utilizing graphics memory after 3.5 GB can result in performance issues as the card needs to manage some really weird stuff in memory, it's nearly load-balancing. But fact remains it seems to be handling that well, it’s hard to detect and replicate oddities. If you unequivocally refuse to accept the situation at hand, you really should return your card and pick a Radeon R9 290X or GeForce GTX 980. However, if you decide to upgrade to a GTX 980, you will be spending more money and thus rewarding Nvidia for it. Until further notice our recommendation on the GeForce GTX 970 stands as it was, for the money it is an excellent performer. But it should have been called a 3.5 GB card with a 512MB L3 GDDR5 cache buffer.

The solution Nvidia pursued is complex and not particularly graceful, IMHO. Nvidia needed to slow down the performance of the GeForce GTX 970, and the root cause of all this discussion was disabling that one L2 cluster with its ROPs. Nvidia also could have opted for other solutions:

- Release a 3GB card and disable the entire ROP/L2 and two 32-bit memory controller block. You'd have a very nice 3GB card and people would have known what they actually purchased.
- Even better, to avoid the L2 cache issue, leave it enabled, leave the ROPs intact and, if you need your product to perform worse than, say, the GTX 980, disable an extra cluster of shader processors, twelve instead of thirteen.
- Simply enable twelve or thirteen shader clusters, lower voltages, and core/boost clock frequencies. Set a cap on voltage to limit overclocking. Good for power efficiency as well.

We do hope to never ever see a graphics card being configured like this again, as it would get toasted by the media for what Nvidia did here. It's simply not the right thing to do. Last note, right now Nvidia is in full damage control mode. We submitted questions on this topic early in the week to Nvidia US, specifically Jonah Alben, SVP of GPU Engineering. On Monday Nvidia suggested a phone call with him, however due to appointments we asked for a Q&A session over email. To date he or anyone from the US HQ has not responded to these questions for Guru3D.com specifically. Really, to date we have yet to receive even a single word of information from Nvidia on this topic.

We slowly wonder though why certain US press is always so much prioritized and is cherry picked … Nvidia ?
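For a feel of why spilling past 3.5GB matters, here is a naive calculation using the segment bandwidths that were widely reported at the time (treat the numbers and the even-spread assumption as mine, not guru3d's; the driver actually tries to keep hot data in the fast segment):
CODE
# Naive effective-bandwidth model for the GTX 970's split memory:
# ~196 GB/s for the 3.5 GB segment, ~28 GB/s for the 0.5 GB segment
# (widely reported figures; assumptions for illustration only).
FAST_GBPS, SLOW_GBPS = 196.0, 28.0

def effective_bandwidth(vram_in_use_gb):
    """Weighted average if accesses were spread evenly over the allocation."""
    fast = min(vram_in_use_gb, 3.5)
    slow = max(vram_in_use_gb - 3.5, 0.0)
    return (fast * FAST_GBPS + slow * SLOW_GBPS) / (fast + slow)

for gb in (3.0, 3.5, 3.8, 4.0):
    print(f"{gb:.1f} GB in use -> ~{effective_bandwidth(gb):.0f} GB/s effective")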



So if the market has proper 4GB VRAM cards, why settle for 3.5GB? Especially considering some of the graphics-intensive games we know are very demanding on VRAM.

Even more so for gamers like me who buy a high-end graphics card expecting to be able to play at ultra settings; that's where VRAM becomes important for those ultra settings to be playable.

And that's before even considering that people who bought the 970 thought it was a 4GB card, only to find out later that it's effectively 3.5GB. That is blatant false advertising, and lawyers are already contemplating a lawsuit against Nvidia because of it.

Moogle Stiltzkin
post Feb 9 2015, 02:20 PM

QUOTE(zizi393 @ Feb 9 2015, 01:08 PM)
The 970 is not the highest end; the Titan is. Never settle for less if you're really serious about going ultra on everything; better to save up and get the best. Shadow of Mordor comes with preset settings for different GPUs; it auto-detects the best settings. If you change those, you're going beyond the card's limit. I even played AC4: Black Flag at ultra.

Back to the topic of misinformation: yes, I believe NVIDIA is at fault for not doing proper testing, or for intentionally misleading consumers. I believe this case is like a car recall; they should just recall all the 970s and come out with a new hardware configuration that actually reflects 4GB.
*
Yes, the 980 is the highest end, but the 970 is not bad either, so I would argue ultra settings at 1080p on a 24'' screen should be possible. For 27'' and above, a GTX 980 or even an SLI setup would definitely be recommended instead. So what I said about ultra settings being possible on this card is valid; what I left out was the monitor size and resolution, which obviously would have to be lower to achieve those ultra settings.

The fact remains that most people bought it as a 4GB VRAM product, which it isn't. Nvidia is now exposed to a class action lawsuit, and rightfully so.


*Update

Just to clarify...

QUOTE
Let me clearly state this, the GTX 970 is not an Ultra HD card, it has never been marketed as such and we never recommended even a GTX 980 for Ultra HD gaming either. So if you start looking at that resolution and zoom in, then of course you are bound to run into performance issues, but so does the GTX 980. These cards are still too weak for such a resolution combined with proper image quality settings. Remember, Ultra HD = 4x 1080P. Let me quote myself from my GTX 970 conclusions “it is a little beast for Full HD and WHQD gaming combined with the best image quality settings”, and within that context I really think it is valid to stick to a maximum of 2560x1440, as 1080P and 1440P are the real domain for these cards. Face it, if you planned to game at Ultra HD, you would not buy a GeForce GTX 970.


So when I say ultra, I don't mean Ultra HD RESOLUTION, because obviously the GTX 970 and even the GTX 980 aren't Ultra HD cards (which is ironic, because the 980 is about the best single card you can get at the moment).

What I meant was ultra in-game graphics settings at 1080p, or 1920x1200 (which is what I use). That seems reasonable to me. Or maybe ultra isn't a good term to use, considering 4K resolution is now what people call Ultra; a more appropriate term would be MAXIMUM settings at 1080p. Better?

Here is a GTX 980 vs GTX 970 comparison at max graphics settings in Shadow of Mordor and Dragon Age: Inquisition:
http://www.extremetech.com/extreme/198223-...emory-problem/2

*Surprisingly, neither card managed to hit 60fps at max settings.



GTX 980 real-world testing (is 4GB of VRAM enough? Games tested: DA:I and Shadow of Mordor):
QUOTE
My EVGA GTX980 also shows a slowdown on the Nai benchmark:
RUN 1 RUN 2
I've not noticed any frame spikes when going over 3.5GB in games but I've mainly been playing DA:I which seems to cap out at 3GB. I just tried SoM (1920x1200 with everything on Ultra including textures) and it mainly hovered around 3.3GB but there were no obvious hitches when it occasionally hit 4GB.
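If you want to do this kind of check yourself, a rough way (my own sketch, assuming the standard nvidia-smi tool is installed and on your PATH) is to poll the reported VRAM usage while the game runs:
CODE
# Poll VRAM usage every few seconds with nvidia-smi (Ctrl+C to stop).
import subprocess
import time

def vram_used_mib():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"])
    return int(out.decode().strip().splitlines()[0])  # first GPU only

while True:
    used = vram_used_mib()
    note = "  <-- past the 970's fast 3.5 GB segment" if used > 3584 else ""
    print(f"VRAM in use: {used} MiB{note}")
    time.sleep(5)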



Find out more on the debate here:
http://www.reddit.com/r/pcgaming/comments/...allocation_bug/



For those curious how the 970 performs at Ultra HD resolution, read here:
http://www.guru3d.com/news-story/middle-ea...tress-test.html







Bottom line...

The latest graphics cards on the market from both the AMD and Nvidia camps don't properly handle Ultra HD as single-GPU solutions. I haven't looked at SLI/CrossFire yet, but most people don't run those expensive setups.

Also, 4GB of VRAM is barely enough for some of the latest games, as Shadow of Mordor and Wolfenstein: The New Order have shown. We are entering the realm of 6-8GB VRAM requirements soon.

But considering HBM is around the corner, maybe that will help alleviate some of the memory issues.

Moogle Stiltzkin
post Feb 9 2015, 03:43 PM

QUOTE(marfccy @ Feb 9 2015, 03:38 PM)
Yep, that pretty much sums up what I've read so far.

G-Sync is superior due to the module itself
*
Yeah, but the cost... even I don't have a G-Sync monitor, unfortunately.

I use a Dell U2413, a 24'' GB-r LED AH-IPS widescreen:
http://www.tftcentral.co.uk/reviews/dell_u2413.htm


Mostly because I watch anime and play games, so a TN panel would suck for me. A pro gamer would lean towards a TN panel with a higher 120-144Hz refresh rate coupled with a G-Sync module. Can't say I blame them, but we each have our own priorities.

So it's either an AH-IPS monitor or a high-refresh TN G-Sync monitor, both at roughly the same price (both expensive, and yet you're still sacrificing one thing for another).

Moogle Stiltzkin
post Feb 9 2015, 04:39 PM

QUOTE(marfccy @ Feb 9 2015, 04:02 PM)
Well, remember what they said: high-quality IPS panels don't exactly come cheap.

Not to mention the response time is unacceptable for gamers.

Frankly, G-Sync to me is good, but for a very specific purpose.

I won't be getting it at all unless the monitor plus module is affordable
*
Good point, but I wonder when that will be; I'd like to know too.

There's already a big question mark over how much those new HBM GPUs are going to cost, and how much VRAM they'll have.


All the new GPU/monitor stuff is $_$

Moogle Stiltzkin
post Feb 9 2015, 05:36 PM

QUOTE(terradrive @ Feb 9 2015, 05:08 PM)
4K TVs are getting cheaper and cheaper,

but 1440p and 4K monitors are expensive zzzz
*
True, but from what I've read, whether 4K makes a difference depends on the size of your screen.

Also, most media I have is DVD, 720p or 1080p, so I don't think it will scale well to 4K resolution; the picture will be less sharp, which is counter-intuitive. 4K is definitely the future, but it doesn't help your older media, especially the lower-resolution stuff, because it has to be scaled further to fill the higher-resolution screen. Sure, you can run the panel below its native resolution, but that also reduces quality; it's generally better to view movies at a native resolution closer to the media's own resolution for a sharper image.
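The scale factors involved make the point clearer; a quick sketch (assuming a 3840x2160 panel) of how far each common source has to be stretched:
CODE
# Upscale factors needed to fill a 3840x2160 panel from common sources.
panel_w, panel_h = 3840, 2160
sources = {"DVD (720x480)": (720, 480),
           "720p": (1280, 720),
           "1080p": (1920, 1080)}

for name, (w, h) in sources.items():
    print(f"{name}: {panel_w / w:.2f}x horizontal, {panel_h / h:.2f}x vertical")
# the lower the source resolution, the more the scaler has to invent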

To me, 4K will only be feasible when 4K media becomes the norm (anime, TV series, movies) and when, for gaming, graphics cards are powerful enough for this new Ultra HD resolution. Right now GPUs aren't keeping up with 4K, especially if you want 60fps; you can forget that for now.

Another consideration: 4K Netflix requires 25Mbps UniFi or faster. I'm on 10Mbps, so that's out of the question until broadband at that kind of speed becomes more affordable (which is not going to happen anytime soon, unfortunately).

Also, if you do get 4K now, by the time 4K becomes the norm there will be more advanced 4K monitors on the market, so the future-proofing you paid for will have been wasted. I'd rather wait for the market to mature and stabilize before opting into 4K.




QUOTE(cstkl1 @ Feb 9 2015, 05:15 PM)
Moogle Stiltzkin

Nvidia can bring down the cost of the G-Sync module by finalizing it as an ASIC rather than an FPGA
*
Sounds great (though honestly I don't know what that involves, except for the price decrease, which I'm all for). Did they give a timeline for when this will happen, or is that just what we hope will happen?

Moogle Stiltzkin
post Feb 9 2015, 05:44 PM

QUOTE(cstkl1 @ Feb 9 2015, 05:38 PM)
Dude, G-Sync is working right now, fully functional and optimal...

If you were Nvidia... you'd just wait and see how FreeSync works out with a mediocre AMD driver team.

Just waiting for AMD, as usual, to blame the scaler vendors if there's an issue.
*
That is true... but from the technical explanation it still isn't convincing. Maybe if, as you say, it were right in front of me, I could judge for myself:

1. Can I live with FreeSync instead of G-Sync (or will I still be complaining about it)?
2. How much does it cost?



My GPU history:

Nvidia → ATI Radeon (something) → Nvidia → Nvidia 8800 GTS → Nvidia GTX 680



I'm also not a fanboy; I go with whoever I see as the winner. I think fanboyism is stupid, because as consumers we should always get the best performance/price product for our needs, not buy for the sake of a brand.

Though after the GTX 970 VRAM scandal, with AMD offering cards where 4GB of VRAM means exactly that, plus a price cut, I can't blame people for getting refunds and switching sides (Nvidia brought this on themselves).

Moogle Stiltzkin
post Feb 10 2015, 11:19 AM

QUOTE(S4PH @ Feb 9 2015, 11:03 PM)
AMD likes to show off brute GPU power whereas Nvidia tries to reduce power consumption; trust me, my electricity bill changed when I started using the GTX 970
*
On the Nvidia side there's a power management option in the driver.

I set mine to Adaptive: it only pushes the GPU as hard as needed, then throttles down when it doesn't have to go all out.

I set per-game profiles so the games I play use maximum performance while they're open.

To me this is a good balance between maximum performance and power savings.


Not sure what AMD does in that regard.
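If you want to see the adaptive behaviour for yourself on the Nvidia side, a quick way (my own sketch, assuming nvidia-smi is installed and on your PATH) is to sample clocks and power draw at the desktop and then again with a game running:
CODE
# One-shot sample of GPU clocks, power draw and load via nvidia-smi.
import subprocess

fields = "clocks.gr,clocks.mem,power.draw,utilization.gpu"
out = subprocess.check_output(
    ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"])
print("graphics clock, memory clock, power draw, GPU load:")
print(out.decode().strip())
# With Adaptive set, idle readings should sit near the low power state;
# a game profile set to maximum performance should push them to boost.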

Moogle Stiltzkin
post Feb 11 2015, 07:25 PM

347.52 WHQL

This driver still has a bug in MPC-HC when using madVR's error diffusion option 1.

As a temporary fix I'm using error diffusion option 2 for now.

Moogle Stiltzkin
post Feb 11 2015, 09:10 PM

QUOTE(ngkhanmein @ Feb 11 2015, 07:49 PM)
Previously I always used ordered dithering, but madVR would crash; this issue happens suddenly once in a while.

Now I'm using error diffusion - option 1
use colored noise - ticked
change dither for every frame - ticked

Bro, what kind of bug do you mean?
*
I get vertical lines. Can't watch at all; the bug is too obvious.
Moogle Stiltzkin
post Feb 11 2015, 09:33 PM

QUOTE(ngkhanmein @ Feb 11 2015, 09:19 PM)
I'm watching .mkv 720p H.265 and don't see vertical lines using madVR v0.87.13 + MPC-HC 1.7.8.20 x86 + LAVFilters 0.63.0-75, everything 32-bit.

Let me test a few more times; I'll give you feedback later.
*
My system:
Windows 8.1 Pro 64-bit
Ivy Bridge 2012 gaming rig: i7-3770 (non-K) CPU, 16GB DDR3 Corsair Platinum RAM, Nvidia GeForce GTX 680 graphics card


Latest drivers, kept up to date with the help of:
1. Windows Update
2. IObit Driver Booster 2
3. Display Driver Uninstaller (to do a proper clean install of the graphics driver)
4. the Nvidia driver from the official site (I don't use Driver Booster to update this)
5. CCleaner + CCEnhancer to clear junk and fix registry errors



My media software (all 32-bit, even though my OS is 64-bit, because madVR doesn't support 64-bit):

reclock 1.8.8.4
xySubFilter 3.1.0.705
MPC-HC 1.7.8 stable
madvr 0.87.13
lav filters 0.63.0




*Sample media tested:

Yibis 720p 8-bit, encoded with x264



Guide used to configure my media software settings for 10-bit playback, to watch anime and videos in general:
http://myanimelist.net/forum/?topicid=516729


In addition to that, under external filters in MPC-HC, I made sure the following were set to preferred, in this order:
- lav splitter
- lav video decoder
- lav audio decoder
- xysubfilter



PS: if anyone wants a similar high-end setup for video playback, KCP is an easy installer/configurator with presets that achieve more or less the same thing much more easily; it's also newbie friendly.
http://haruhichan.com/forum/showthread.php...waii-Codec-Pack


KCP has pretty much taken over from CCCP as the newbie-friendly way to set up video playback.

Moogle Stiltzkin
post Feb 11 2015, 09:48 PM

QUOTE(SSJBen @ Feb 11 2015, 09:46 PM)
I'm using option 1, no vertical scan line bug for me on any source (driver 347.25 and .52). Be it 8bit or 10bit.
*
That's weird.

The bug looks like the screenshot I posted in the spoiler: vertical lines across the picture.


Did you see the settings guide I posted? Is your configuration different from mine?

Anyway, this is the specific bug many others are reporting; I'm surprised you don't have the problem:
http://bugs.madshi.net/view.php?id=250

QUOTE
cyberbeing-
I just received a response from NVIDIA.

They were able to reproduce the issue, and have assigned the bug to their development team for investigation.



What Nvidia GPU are you using? Mine's a GTX 680.

Moogle Stiltzkin
post Feb 11 2015, 10:02 PM

QUOTE(SSJBen @ Feb 11 2015, 09:57 PM)
No, not the same. Please don't use Niyawa's config anymore, it's outdated and his settings don't work correctly after madVR 0.87.10.

Just use the latest KCP beta, enable the highest preset option and then change dithering to option 1. Also, define your monitor as an 8 bit panel under Devices > properties.

Alternatively use - https://imouto.my/madvr/
Though I don't like that he uses Lanczos for chroma and image upscaling. If you follow his guide, make sure you change it to Jinc (3 taps).

I have SLI 970s. But previously on a 560Ti, I did not have this bug either.
*
But my monitor is 10-bit, albeit via FRC.

The imouto guide sets up madVR profiles based on source resolution. I probably should do that, but I'm lazy, so I skipped it.

I also skipped the dispcalGUI 3DLUT config because unfortunately I don't own a hardware calibrator for my LED-backlit display.


QUOTE(SSJBen @ Feb 11 2015, 09:57 PM)
Just use the latest KCP beta, enable the highest preset option and then change dithering to option 1. Also, define your monitor as an 8 bit panel under Devices > properties.
Yeah, I was looking into it before I made the switch; I just wasn't sure how the KCP configuration compared to Niyawa's.


By the way, what do you think about MPC-BE? Is it worth getting over the MPC-HC nightly builds?

Moogle Stiltzkin
post Feb 11 2015, 10:20 PM

QUOTE(SSJBen @ Feb 11 2015, 10:08 PM)
I'm also using a "10-bit" monitor, but madvr only recognizes 8bit (despite it saying "& above"). It doesn't matter though, as long as you set it to 8-bit then it'll work fine.


Ah right, I wasn't aware of that.


QUOTE(SSJBen @ Feb 11 2015, 10:08 PM)
Yes, switch to KCP. Like I said, his settings does not work anymore for both madvr and even reclock (try using AC3 encoder to real time encode all your source to DD 5.1 with his reclock settings, you'll either get sync delays or no sound). madvr had A LOT of changes going from 0.86.x to 0.87. Previous settings are pretty much void by now.


A lot of my media has AC3 5.1 audio, but it's set to output as stereo, so maybe that's why I didn't notice. But I take your point; all the more reason to switch to KCP now.

QUOTE(SSJBen @ Feb 11 2015, 10:08 PM)
*EDIT*
I used MPC-BE for a while, it's okay. But one thing I didn't like was that it had a bug where if I switched to my TV, the player will somehow cause massive framerate drops WITH the exact same settings. This is despite both displays being 8 bit, so the only difference is the size and panel, both of which doesn't do anything to change media player performance. I switch back to my monitor and wham, no frame drops. Sometimes the bug will stick until I restart the computer though.


Yeah, I just read other users' input on MPC-BE; seems I'd rather stick to the regular KCP build using MPC-HC:
QUOTE
You have to use the normal KCP version. MPC-BE no longer works with changing audio in LAV Splitter since v1.4.3, when they forced on a filter called "Audio Switcher"; when this filter is on, the player will ignore those LAV Splitter settings

Just use the normal KCP version or go complain to MPC BE team. I don't care much for MPC BE.




QUOTE(SSJBen @ Feb 11 2015, 10:08 PM)
I don't know if they have fixed this bug or not, I'll have to check. But I run my MPC pretty much without menus or borders, so having a black skin or not makes no difference to me.
When I play games and watch things in MPC I also remove the borders and stuff, but normally I just leave the defaults.



Anyhow, how does KCP deal with low-resolution sources? Does their preset configuration handle that properly as well, or is that something extra I have to add myself?

From what I've read, some people suggest using madVR profiles to handle different resolution sources differently for the best result, which makes sense. The other proposed fix was to use the now less common ffdshow and set it as preferred for low-resolution media.

Moogle Stiltzkin
post Feb 11 2015, 10:54 PM

QUOTE(SSJBen @ Feb 11 2015, 10:32 PM)
KCP is basically just madvr + LAV filters + reclock + xy/vs sub rolled into one. It's made easy, their highest preset is higher than that of Niyawa's old highest madvr settings. So by default on their highest preset, it already handles low res content pretty well IMO.

I don't use ffdshow anymore, I think Jinc with 3 taps is pretty good already. NNEDI3 is even better but too bad until today, Nvidia cards cannot work properly with it.
*
QUOTE(ngkhanmein @ Feb 11 2015, 10:40 PM)
I'm using a 6+2-bit monitor. Try https://github.com/nikola/htpc-updater/releases/tag/0.8.1

Check whether you still have the issue. My suggestion is to delete everything and install again; I tried it before and it works.

I followed this method >>> http://wiki.mikejung.biz/MadVR

Maybe option "2" suits your GPU.
*
I uninstalled all my media software using Total Uninstall (which also removes leftover registry entries and files), then ran CCEnhancer/CCleaner to delete remaining files/registry entries and rebooted the PC.

Then I installed KCP with the highest preset, included ReClock, and set audio to downmix to stereo by default.

Those are the only changes I've made so far.


Tested: media plays fine, but I don't think the KCP presets are designed to adapt madVR profiles based on the resolution of the source being played. Meaning there's room for improvement here with manual custom tweaking, I suppose.

Also, under madVR options > rendering > dithering, it uses ordered dithering for the highest preset. Is that supposed to be better quality than error diffusion option 1?

Also, I tested enabling error diffusion option 1 and I still see the vertical line bug.



PS: with KCP, how do you normally update when a newer KCP is out?


Also, I just saw htpc-updater. Wow, that's very useful; I wonder if it's compatible with KCP.


Moogle Stiltzkin
post Feb 11 2015, 11:18 PM

According to ngkhanmein's guide link:
http://wiki.mikejung.biz/MadVR#MadVR_Dithe...sing_Wiki_Pages

QUOTE
MadVR Dithering and Error Diffusion

MadVR has added a few new dithering options. Error Diffusion Option 1 and Error Diffusion Option 2. Both of these new options are very demanding, and require a DX11 GPU to use. If you stick with the ordered or random options you will save yourself about 4ms of render time. If you have a GTX 970 or better then I highly recommend enabling Error Diffusion Option 1

I tested both of the options under the main dither choices and there is very little performance gain / loss if you enable or disable the options. From what I could tell there was maybe a 1ms gain / loss for render time if you check the options so only enable them if you feel like the video looks better. I have use colored noise and change dither for every frame enabled.

The GTX 970 can handle Error Diffusion Option 2 without any issues. Even combined with Smooth Motion the render times are under 15ms for 1080p content.


QUOTE
I tested both of the options under the main dither menu and there is very little performance gain / loss between Error Diffusion 1 and Error Diffusion 2, at least if you have a newer Nvidia GTX 970 or GTX 960. You must have a DX11 GPU to enable the highest two options, so if you have an older or not very powerful GPU, stick with either ordered or random dithering.

I chose to go with Error Diffusion Option 2 for Dithering, with both of the options enabled. Personally I like option 2 over option 1, but everyone has their own preferences, so choose whatever option looks better to you. In terms of performance, the fastest option is to use "None", Random, Ordered, Error Diffusion Option 2, and finally Error Diffusion Option 1. If you have a higher end GPU like the GTX 970, GTX 980, or any kind of SLI setup then you should go with Error Diffusion Option 2.

When playing a 1080p video I noticed that Error Diffusion Option 1 cost about 1ms more than Error Diffusion Option 2. By that I mean the average frame render time was about 6.10ms with option 1 enabled, and 5.10ms with option 2 enabled. While 1ms may not seem like a lot, it can mean the difference between dropped frames, or no dropped frames, so keep that in mind if you start to notice slow playback.
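The dropped-frame reasoning is just frame-budget arithmetic; a quick sketch (using the render times quoted above, which will differ on your own system) of how much headroom each option leaves:
CODE
# Compare madVR's average render time against the per-frame budget.
QUOTED_RENDER_MS = {"Error Diffusion 1": 6.10, "Error Diffusion 2": 5.10}

for fps in (23.976, 29.97, 60.0):
    budget = 1000.0 / fps
    for option, render_ms in QUOTED_RENDER_MS.items():
        headroom = budget - render_ms
        print(f"{fps:>6} fps source, {option}: {render_ms:.2f} ms render "
              f"vs {budget:.1f} ms budget ({headroom:.1f} ms headroom)")
# As long as headroom stays positive with margin, frames aren't dropped.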


So basically I should ideally use error diffusion option 2, but KCP's highest preset opts for ordered dithering instead, which doesn't sound like the best quality setting.

If that's just one example where KCP's highest preset isn't actually the best setting, how can I trust they didn't misconfigure something elsewhere?


But to be fair, I'm using a GTX 680, so maybe ordered is the best setting for my GPU. I wonder, though: does KCP switch to error diffusion option 2 when it detects a 960/970 or better, or is ordered a permanent setting for the highest preset regardless of which GPU you have? Does anyone know?


QUOTE
With smooth motion in madVR there really is no need for reclock. Also it just adds one big headache to the whole thing.
I also wonder whether ReClock is still relevant when using madVR's smooth motion.

