
 NVIDIA GeForce Community V15 (new era pascal), ALL HAIL NEW PASCAL KING GTX1080 out now

shikimori
post Jul 28 2015, 12:05 AM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(arslow @ Jul 27 2015, 10:14 PM)
Ugh, every day I'm getting less and less interested in replacing my 2500K with a Skylake platform. I guess what I'll do with my rig-overhaul budget (about 4k or so) is just get a new case and go all out on the GPU... maybe get a 1080 Ti or whatever they decide to name it, lol.

My U2412M is only 3 years old now. I promised myself not to change it till the warranty is over, but that's getting harder and harder, what with the existence of 27" IPS 144Hz WQHD monitors these days!!!
*
Go get yourself a 144Hz IPS monitor. You won't regret it, man. Games you play feel really buttery smooth, without sacrificing colour or viewing angles. I haven't experienced G-Sync or FreeSync, but 144Hz alone is really worth it, provided you have a decent GPU.

Think of it as getting that COD-esque fluid-movement feeling in non-COD games, even in RTS titles or games like Diablo 3.
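A rough way to see why the difference is so noticeable: the per-frame time budget drops from about 16.7 ms at 60Hz to about 6.9 ms at 144Hz. A quick back-of-envelope calculation in Python, purely illustrative:

CODE
# Per-frame time budget at common refresh rates: going from 60Hz
# to 144Hz cuts the gap between frames by more than half.
for hz in (60, 120, 144):
    print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms per frame")
# 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms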
TSskylinelover
post Jul 28 2015, 12:12 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(Moogle Stiltzkin @ Jul 27 2015, 04:01 PM)
Ooo, I'll Google that up then.

Then this NVLink? Sounds like not only do I buy the Pascal GPU, but I also need a new motherboard with NVLink as well?

They say it would basically look like this:
[images: Molex's NeoScale, an example of a modern, high-bandwidth mezzanine connector]
Too much to quote, the rest is here
http://www.anandtech.com/show/7900/nvidia-...ecture-for-2016
Anyway, it sounds like an NVLink mobo isn't a prerequisite to use Pascal; you can still use PCIe. But the question is: would using NVLink for a single GPU be worth it, or is it only going to help multi-GPU setups? I'm not a fan of multi-GPU because of driver support issues, so I'm just wondering if upgrading to an NVLink mobo is worth it for a single-GPU setup. I'd rather wait for Cannonlake+ before I upgrade.
*
Cannonlake? Wait till your hair drops; it won't come out so soon. I pulled the trigger from Lynnfield to Haswell last February with no regrets, haha.

As for Pascal, I can still hold out with my Kepler till Q3 next year if everything goes according to plan. Being in my 30s means I can no longer buy a GPU every year, unlike in my student days on sugar daddy money. Haha.

P/S: having 1500 per month used to feel like heaven as a student, but now double that salary isn't enough to cover even myself. I'd probably need quadruple for it to be enough, and don't start talking to me about life after having children.

That is, unless I live the Malay way of 5 children on low-grade milk powder, hahaha. I know my longest-serving Malay supervisor (15 years in, with a salary only slightly higher than what I get after 2 years of working) can support his 5 kids comfortably, maybe with some help from the G, perhaps.

This post has been edited by skylinelover: Jul 28 2015, 12:24 AM
Moogle Stiltzkin
post Jul 28 2015, 01:15 AM

Look at all my stars!!
*******
Senior Member
4,454 posts

Joined: Jan 2003
QUOTE(cstkl1 @ Jul 27 2015, 10:27 PM)
Haswell benefits, if you want more:
faster encoding via the huge DMI bandwidth
native SATA 3 ports
USB 3 ports

Skylake benefits:
NVMe SSD
native USB 3.1 ports (not sure, just assuming at this point)

Skylake: still deciding whether I should go NVMe. Definitely no obvious gain other than watching SSD benchmarks; zero benefit in gaming, to be honest.

USB 3.1: seriously, is there any device out there that takes advantage of its direct access at DMI gen 1 speeds?

NVLink: it's not something you're gonna see benefiting everyday consumer gaming setups. It will, however, open up a possible octa Titan Pascal, and with such tech I'm pretty sure NVIDIA has solved some multi-GPU scaling here. The interesting part about NVLink is the ability of a GPU to access the RAM and VRAM of another GPU directly, if I'm not mistaken. The tech is gonna solve current issues with multi-GPU.
*
What is the performance improvement of Skylake over Ivy Bridge? Do I need to upgrade to Skylake to benefit from Pascal; if not for compute performance, then at least for NVLink? But would that make any difference for single-GPU setups, or would PCIe 3 be enough?
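For the single-GPU question, PCIe 3.0 x16 is generally plenty for one card; NVLink's extra bandwidth mostly pays off when multiple GPUs shuttle data between each other. A purely illustrative comparison below, assuming roughly 16 GB/s for PCIe 3.0 x16 and the 80-200 GB/s range quoted in early NVLink coverage (ballpark assumptions, not measurements):

CODE
# Time to move a 2 GB payload at assumed link speeds. The speeds
# are ballpark assumptions from early NVLink coverage, not measurements.
payload_gb = 2
for name, gb_per_s in [("PCIe 3.0 x16", 16), ("NVLink low", 80), ("NVLink high", 200)]:
    print(f"{name:>13}: {payload_gb / gb_per_s * 1000:.1f} ms")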



QUOTE(shikimori @ Jul 28 2015, 12:05 AM)
Go get yourself a 144Hz IPS monitor. You won't regret it, man. Games you play feel really buttery smooth, without sacrificing colour or viewing angles. I haven't experienced G-Sync or FreeSync, but 144Hz alone is really worth it, provided you have a decent GPU.

Think of it as getting that COD-esque fluid-movement feeling in non-COD games, even in RTS titles or games like Diablo 3.
*
The thing is, there are a few technologies coming not long after. The ones I can think of: quantum dot, which they would add as a film layer on monitors for improved colour; and who knows, maybe the LED backlight will get an upgrade too, from the now-common W-LED to something like GB-r LED or better, for even better colours. But if current IPS W-LED technology is more than sufficient, then yeah, the Acer Predator 144Hz G-Sync IPS monitor seems to be the best gaming monitor out at the moment. The only thing I can criticise is the build, which has a light-reflective plastic frame. But considering everything else passes scrutiny over at TFT Central, the few flaws seem worth glossing over.



QUOTE(skylinelover @ Jul 28 2015, 12:12 AM)
Cannonlake? Wait till your hair drops; it won't come out so soon. I pulled the trigger from Lynnfield to Haswell last February with no regrets, haha.

As for Pascal, I can still hold out with my Kepler till Q3 next year if everything goes according to plan. Being in my 30s means I can no longer buy a GPU every year, unlike in my student days on sugar daddy money. Haha.

P/S: having 1500 per month used to feel like heaven as a student, but now double that salary isn't enough to cover even myself. I'd probably need quadruple for it to be enough, and don't start talking to me about life after having children.

That is, unless I live the Malay way of 5 children on low-grade milk powder, hahaha. I know my longest-serving Malay supervisor (15 years in, with a salary only slightly higher than what I get after 2 years of working) can support his 5 kids comfortably, maybe with some help from the G, perhaps.
*
Well, Kepler was out in 2012 and Pascal is out in 2016, so a 4-year upgrade seems due for me. Or I could possibly delay 1-2 years more for Volta (though I'd rather not). Besides, there just isn't enough info on what Volta offers to even warrant waiting. Not to mention that because Volta was delayed, I suspect Pascal will more or less be the base of what Volta was going to be, just the product they rush out first to cover the delay. So I'm placing my bet that the performance difference between Pascal and Volta won't be too huge. There's always going to be something better anyway, but I think Pascal will be powerful enough to satisfy my PC gaming requirements for a long time.

I think Pascal is the most likely to bring a performance jump big enough to warrant upgrading at this juncture. I'm not the type of tech enthusiast who upgrades every year; not rich enough for that.

But I can tell you now that my Kepler GTX 680 is not enough for current games. I tried ultra settings on Dragon Age: Inquisition and it was totally unplayable; I had to lower the quality settings to medium/high.

Not playing at 144Hz with G-Sync (but instead in my paltry 60Hz triple-buffered vsync mode) is one thing, but not being able to play at ultra settings at 1080p on a 24'' IPS LCD is a bit too much for me to ignore.

This post has been edited by Moogle Stiltzkin: Jul 28 2015, 01:18 AM
arslow
post Jul 28 2015, 09:25 AM

Look at all my stars!!
*******
Senior Member
3,544 posts

Joined: Sep 2008


QUOTE(cstkl1 @ Jul 27 2015, 10:27 PM)
Haswell benefits, if you want more:
faster encoding via the huge DMI bandwidth
native SATA 3 ports
USB 3 ports

Skylake benefits:
NVMe SSD
native USB 3.1 ports (not sure, just assuming at this point)

Skylake: still deciding whether I should go NVMe. Definitely no obvious gain other than watching SSD benchmarks; zero benefit in gaming, to be honest.

USB 3.1: seriously, is there any device out there that takes advantage of its direct access at DMI gen 1 speeds?

NVLink: it's not something you're gonna see benefiting everyday consumer gaming setups. It will, however, open up a possible octa Titan Pascal, and with such tech I'm pretty sure NVIDIA has solved some multi-GPU scaling here. The interesting part about NVLink is the ability of a GPU to access the RAM and VRAM of another GPU directly, if I'm not mistaken. The tech is gonna solve current issues with multi-GPU.
*
Yeah, USB 3.1 is about the only reason I can see to upgrade to Skylake, lol. Basically I'd love to change just my motherboard and keep the CPU, but obviously Intel would never allow that, zzz.

Games are barely CPU-bottlenecked these days. I really feel like the only reason I'd upgrade my CPU is if my mobo or CPU dies...

NVLink... not too crazy about it, as I've never been interested in multi-GPU.

Will definitely be upgrading from my Kepler to Pascal, though.
arslow
post Jul 28 2015, 09:28 AM

Look at all my stars!!
*******
Senior Member
3,544 posts

Joined: Sep 2008


QUOTE(shikimori @ Jul 28 2015, 12:05 AM)
Go get yourself a 144Hz IPS monitor. You won't regret it, man. Games you play feel really buttery smooth, without sacrificing colour or viewing angles. I haven't experienced G-Sync or FreeSync, but 144Hz alone is really worth it, provided you have a decent GPU.

Think of it as getting that COD-esque fluid-movement feeling in non-COD games, even in RTS titles or games like Diablo 3.
*
Would love to do that, but I feel like squeezing as much as possible out of the current monitor before moving on to something better.

And Asus and Acer aren't exactly the best brands in terms of QC. How I wish Dell made a nice 27" IPS 144Hz WQHD screen...
TSskylinelover
post Jul 28 2015, 10:00 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(Moogle Stiltzkin @ Jul 28 2015, 01:15 AM)
Well, Kepler was out in 2012 and Pascal is out in 2016, so a 4-year upgrade seems due for me. Or I could possibly delay 1-2 years more for Volta (though I'd rather not). Besides, there just isn't enough info on what Volta offers to even warrant waiting. Not to mention that because Volta was delayed, I suspect Pascal will more or less be the base of what Volta was going to be, just the product they rush out first to cover the delay. So I'm placing my bet that the performance difference between Pascal and Volta won't be too huge. There's always going to be something better anyway, but I think Pascal will be powerful enough to satisfy my PC gaming requirements for a long time.

I think Pascal is the most likely to bring a performance jump big enough to warrant upgrading at this juncture. I'm not the type of tech enthusiast who upgrades every year; not rich enough for that.

But I can tell you now that my Kepler GTX 680 is not enough for current games. I tried ultra settings on Dragon Age: Inquisition and it was totally unplayable; I had to lower the quality settings to medium/high.

Not playing at 144Hz with G-Sync (but instead in my paltry 60Hz triple-buffered vsync mode) is one thing, but not being able to play at ultra settings at 1080p on a 24'' IPS LCD is a bit too much for me to ignore.
*
Haha, same boat here. I was also tempted by the GTX 680 back then, but decided to skip the Kepler rehash because I'd just started my career, so I can't simply splash out like in student days. Now I target a 3-year upgrade cycle instead of 2. Since I've already hopped into the 1440p zone, it's either high-end or mid-range SLI next year. Haha.


cstkl1
post Jul 28 2015, 10:06 AM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

Moogle Stiltzkin
Well, it depends on the task, but generally, clock for clock:

Haswell (DC) has a ~10% single-thread gain over Ivy Bridge.
Skylake adds around 5% over Haswell.

So roughly 15.5% faster, since the gains compound. Also, comparing Ivy Bridge to Haswell is unfair, as it depends on the task; for things like AVX/AVX2 encoding, Haswell is about 20-30% faster.

Skylake's real benefit is the 20 PCIe lanes, so you can run SLI together with an NVMe drive. Mobile setups can now also have either SLI or a native x16 single GPU alongside an M.2 PCIe SSD.

The con is that DDR4 is too immature. Any platform with both DDR3 and DDR4 DIMM slots, don't expect too much from it; same as P55. DDR4 at the moment is too immature; I'm generally hoping for 4800 CL18-21.

But expect RAM to evolve more slowly this time, since most RAM in the world nowadays is Micron, Samsung, or SK Hynix, and the latter two seem to be concentrating on die stacking.
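The 15.5% comes from compounding the two generational gains rather than adding them; a one-line check:

CODE
# Clock-for-clock gains multiply across generations, they don't add:
# ~10% (Ivy Bridge -> Haswell), then ~5% (Haswell -> Skylake).
gain = 1.10 * 1.05
print(f"Skylake vs Ivy Bridge, clock for clock: ~{(gain - 1) * 100:.1f}%")  # ~15.5%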


This post has been edited by cstkl1: Jul 28 2015, 10:09 AM
Moogle Stiltzkin
post Jul 28 2015, 10:26 AM

Look at all my stars!!
*******
Senior Member
4,454 posts

Joined: Jan 2003
QUOTE(cstkl1 @ Jul 28 2015, 10:06 AM)
Moogle Stiltzkin
Well, it depends on the task, but generally, clock for clock:

Haswell (DC) has a ~10% single-thread gain over Ivy Bridge.
Skylake adds around 5% over Haswell.

So roughly 15.5% faster, since the gains compound. Also, comparing Ivy Bridge to Haswell is unfair, as it depends on the task; for things like AVX/AVX2 encoding, Haswell is about 20-30% faster.

Skylake's real benefit is the 20 PCIe lanes, so you can run SLI together with an NVMe drive. Mobile setups can now also have either SLI or a native x16 single GPU alongside an M.2 PCIe SSD.

The con is that DDR4 is too immature. Any platform with both DDR3 and DDR4 DIMM slots, don't expect too much from it; same as P55. DDR4 at the moment is too immature; I'm generally hoping for 4800 CL18-21.

But expect RAM to evolve more slowly this time, since most RAM in the world nowadays is Micron, Samsung, or SK Hynix, and the latter two seem to be concentrating on die stacking.
*
I'm waiting for HMC, but there's hardly any news on when Intel will add support for it to their chipsets and motherboards. DDR4 seems like something that's gonna die real soon :/ especially now that stuff like HBM is coming out.

Though I doubt HBM will be used on motherboards, because it was designed for graphics use; HMC was designed for PCs. But from what I heard, HMC is not a JEDEC standard? Not sure why.

So, 15.5% more FPS, just from the CPU?

I guess I'll have to test what FPS I get on Pascal first. I'll be using it at 1920x1200 on a 24''; I think that's more than enough to hit my 60 FPS cap when using triple-buffered vsync.


By the way, I preordered StarCraft II: Legacy of the Void. The graphics are not intensive at all... lel... even my Kepler can drive this game at ultra. Fuggin' Blizzard...

Gotta put my hopes in the new Doom game to really push the GPU, though I'm not sure how the game will turn out; I think Carmack left that company :[


As I suspected, AMD is gonna rush out their Pirate Islands successor:
http://wccftech.com/amd-r9-400-series-gpus...arctic-islands/


But there's already speculation about how things will look by then (*just swap the 970 for Pascal):
[image: speculative performance chart]


QUOTE
An update from sweclockers.com has revealed the code name of the next-next-gen AMD Radeon R9 400 Series GPUs. It goes without saying that the naming is pretty much irrelevant at such an early stage. The revealed codename for the series is Arctic Islands and the actual GPU name could be of any island present there. There is currently zero information regarding the Radeon R9 400 Series apart from the fact it will be based on a 20nm or lower node (most probably 16/14nm FinFET).


Small-nm fabrication is good, but I doubt it will be enough. Have they had time to develop a new architecture that's more power-efficient while still having great performance? Because I haven't heard anything about that, unlike with Volta and Pascal, where we at least heard long ago that they were working on a new architecture.

This post has been edited by Moogle Stiltzkin: Jul 28 2015, 10:29 AM
TSskylinelover
post Jul 28 2015, 10:57 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(Moogle Stiltzkin @ Jul 28 2015, 10:26 AM)
I'm waiting for HMC, but there's hardly any news on when Intel will add support for it to their chipsets and motherboards. DDR4 seems like something that's gonna die real soon :/ especially now that stuff like HBM is coming out.

Though I doubt HBM will be used on motherboards, because it was designed for graphics use; HMC was designed for PCs. But from what I heard, HMC is not a JEDEC standard? Not sure why.

So, 15.5% more FPS, just from the CPU?

I guess I'll have to test what FPS I get on Pascal first. I'll be using it at 1920x1200 on a 24''; I think that's more than enough to hit my 60 FPS cap when using triple-buffered vsync.
By the way, I preordered StarCraft II: Legacy of the Void. The graphics are not intensive at all... lel... even my Kepler can drive this game at ultra. Fuggin' Blizzard...

Gotta put my hopes in the new Doom game to really push the GPU, though I'm not sure how the game will turn out; I think Carmack left that company :[
As I suspected, AMD is gonna rush out their Pirate Islands successor:
http://wccftech.com/amd-r9-400-series-gpus...arctic-islands/
But there's already speculation about how things will look by then (*just swap the 970 for Pascal):
[image: speculative performance chart]

Small-nm fabrication is good, but I doubt it will be enough. Have they had time to develop a new architecture that's more power-efficient while still having great performance? Because I haven't heard anything about that, unlike with Volta and Pascal, where we at least heard long ago that they were working on a new architecture.
*
Haha, I'm more stoked about Doom 3 than Doom 4, especially now with my heavy daily workload just to earn double my old student allowance. I'm less enthusiastic about Doom 4, but if the reviews are great I'll surely buy the game. I definitely miss uni days more since jumping into the working world. Argh, dang it.
takeshiru
post Jul 28 2015, 12:46 PM

Getting Started
**
Junior Member
91 posts

Joined: Jun 2010
Hello guys,

Hope to get some opinions here. Currently using a Strix 980 and a 1080p monitor.
Should I sell my 980 off and upgrade to a 980 Ti, or get the Acer Predator?

I'm leaning toward the monitor, as my current one isn't great. If I get the 980 Ti, it could be a waste using it on a Dell non-G-Sync monitor, right?
arslow
post Jul 28 2015, 01:49 PM

Look at all my stars!!
*******
Senior Member
3,544 posts

Joined: Sep 2008


QUOTE(takeshiru @ Jul 28 2015, 12:46 PM)
Hello guys,

Hope to get some opinions here. Currently using a Strix 980 and a 1080p monitor.
Should I sell my 980 off and upgrade to a 980 Ti, or get the Acer Predator?

I'm leaning toward the monitor, as my current one isn't great. If I get the 980 Ti, it could be a waste using it on a Dell non-G-Sync monitor, right?
*
I would upgrade the monitor rather than the GPU if I were you.
Moogle Stiltzkin
post Jul 28 2015, 02:30 PM

Look at all my stars!!
*******
Senior Member
4,454 posts

Joined: Jan 2003
QUOTE(takeshiru @ Jul 28 2015, 12:46 PM)
Hello guys,

Hope to get some opinions here. Currently using a Strix 980 and a 1080p monitor.
Should I sell my 980 off and upgrade to a 980 Ti, or get the Acer Predator?

I'm leaning toward the monitor, as my current one isn't great. If I get the 980 Ti, it could be a waste using it on a Dell non-G-Sync monitor, right?
*
I'd rather spend the money on the Acer Predator.

980 to 980 Ti isn't as good as 980 + Acer Predator.

A 10-20% FPS gain is nothing compared to a 144Hz refresh rate, G-Sync, 1440p resolution, and 27'' IPS!!!

Besides, if you've already got a 980, why get a 980 Ti when they've just reported that Pascal is due next year (2016), with double the transistor count and likely gains of 50-60% over a Titan X?

Anyway, get the Acer Predator, you won't regret it.

yaphong
post Jul 28 2015, 10:06 PM

On my way
****
Senior Member
659 posts

Joined: Apr 2005


QUOTE(Moogle Stiltzkin @ Jul 28 2015, 02:30 PM)
I'd rather spend the money on the Acer Predator.

980 to 980 Ti isn't as good as 980 + Acer Predator.

A 10-20% FPS gain is nothing compared to a 144Hz refresh rate, G-Sync, 1440p resolution, and 27'' IPS!!!

Besides, if you've already got a 980, why get a 980 Ti when they've just reported that Pascal is due next year (2016), with double the transistor count and likely gains of 50-60% over a Titan X?

Anyway, get the Acer Predator, you won't regret it.
*
Yeah, but with a 980 it's hard to achieve 144 FPS at 1440p. Mine is just about right for 60 FPS at 1440p in most games at full settings.
yaphong
post Jul 28 2015, 10:09 PM

On my way
****
Senior Member
659 posts

Joined: Apr 2005


QUOTE(takeshiru @ Jul 28 2015, 12:46 PM)
Hello guys,

Hope to get some opinions here. Currently using a Strix 980 and a 1080p monitor.
Should I sell my 980 off and upgrade to a 980 Ti, or get the Acer Predator?

I'm leaning toward the monitor, as my current one isn't great. If I get the 980 Ti, it could be a waste using it on a Dell non-G-Sync monitor, right?
*
I thought of upgrading my 980 to a 980 Ti too. However, even if I can sell off my current card at RM1900, I still need to top up an additional RM1400 to get the 980 Ti Strix (Strix to Strix for a fair comparison), and that's for only about a 20% to 40% FPS gain. For RM1400 I think it's much better to spend on a PS4, hahaha.
SSJBen
post Jul 28 2015, 11:25 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(Moogle Stiltzkin @ Jul 28 2015, 02:30 PM)
I'd rather spend the money on the Acer Predator.

980 to 980 Ti isn't as good as 980 + Acer Predator.

A 10-20% FPS gain is nothing compared to a 144Hz refresh rate, G-Sync, 1440p resolution, and 27'' IPS!!!

Besides, if you've already got a 980, why get a 980 Ti when they've just reported that Pascal is due next year (2016), with double the transistor count and likely gains of 50-60% over a Titan X?

Anyway, get the Acer Predator, you won't regret it.
*
Problem with Acer? Great panel, nice tech. But the cheapest-looking glossy plastic, and bezels as thick as 2008 monitors. Oh, and don't get me started on the QC...
At over RM2k for a monitor, it's only right to expect something much better; it barely costs half as much to produce.

This is why I want Dell and LG to come up with their own G-Sync monitors. LG already makes their own IPS panels anyway, and their curved screens have been doing pretty decently considering how niche that market is. So what the hell is stopping them?

This post has been edited by SSJBen: Jul 28 2015, 11:26 PM
takeshiru
post Jul 29 2015, 01:39 AM

Getting Started
**
Junior Member
91 posts

Joined: Jun 2010
QUOTE(Moogle Stiltzkin @ Jul 28 2015, 02:30 PM)
I'd rather spend the money on the Acer Predator.

980 to 980 Ti isn't as good as 980 + Acer Predator.

A 10-20% FPS gain is nothing compared to a 144Hz refresh rate, G-Sync, 1440p resolution, and 27'' IPS!!!

Besides, if you've already got a 980, why get a 980 Ti when they've just reported that Pascal is due next year (2016), with double the transistor count and likely gains of 50-60% over a Titan X?

Anyway, get the Acer Predator, you won't regret it.
*
Yeah, strongly agree too, but Acer QC seems questionable... still the best IPS so far.
takeshiru
post Jul 29 2015, 01:41 AM

Getting Started
**
Junior Member
91 posts

Joined: Jun 2010
QUOTE(yaphong @ Jul 28 2015, 10:09 PM)
I thought of upgrading my 980 to a 980 Ti too. However, even if I can sell off my current card at RM1900, I still need to top up an additional RM1400 to get the 980 Ti Strix (Strix to Strix for a fair comparison), and that's for only about a 20% to 40% FPS gain. For RM1400 I think it's much better to spend on a PS4, hahaha.
*
Tested the market with mine recently; you're lucky if you get 1.8k for it, haha. Usually lower.
Moogle Stiltzkin
post Jul 29 2015, 02:00 AM

Look at all my stars!!
*******
Senior Member
4,454 posts

Joined: Jan 2003
QUOTE(SSJBen @ Jul 28 2015, 11:25 PM)
Problem with Acer? Great panel, nice tech. But the cheapest-looking glossy plastic, and bezels as thick as 2008 monitors. Oh, and don't get me started on the QC...
At over RM2k for a monitor, it's only right to expect something much better; it barely costs half as much to produce.

This is why I want Dell and LG to come up with their own G-Sync monitors. LG already makes their own IPS panels anyway, and their curved screens have been doing pretty decently considering how niche that market is. So what the hell is stopping them?
*
I thought the glossy part was only the frame, not the actual screen?

If the screen is also glossy I might reconsider :/ because that would be friggin' annoying... (not a fan of glossy).


*Update

There, I double-checked:

QUOTE
Panel Coating
Light AG coating



QUOTE
Glossy black bezel and stand, some red trim on base



QUOTE
Anti-Glare Coating (AG)

The most common type of protective coating is 'Anti-Glare' (AG). This is often described as a matte coating as it is non-reflective to the user since it diffuses rather than reflects ambient light. It provides a method for manufacturers to avoid glare on the viewing surface from other light sources and has been used in the LCD monitor market for many years since the first TFT displays started to emerge.

This matte coating is included as an outer polarizing layer which has been coarsened by mechanical or chemical processes. This achieves a surface coating which is not smooth and so can diffuse ambient light rather than reflect it. What is particularly important to understand is that this AG coating can be applied to panels with varying thicknesses, which has an impact on the anti-glare properties, but also on the underlying image of the screen. Where the coating is particularly thick and aggressive, the image from the screen can deteriorate as the light being emitted is also affected. This can have some impact on contrast and colour vibrancy and the perceived image can sometimes look dull as a result. Sharpness degradation can also occur in some extreme cases where AG coating is too thick. Users may also sometimes see the graininess of the coating, particularly when viewing white or light backgrounds. This can be particularly distracting for office work and images can look grainy or dirty if the coating is too aggressive. I would point out that not everyone would even notice this at all, and many users are perfectly happy with their screens even where aggressive AG is used. It's just something to be wary of in case you have found problems with image quality in the past or are susceptible to it.

In other cases, AG coating is applied but it is light and far less obtrusive. The polarizer is less rough and has a lower haze value. Sometimes users refer to it as "semi-gloss" to distinguish the difference between these and the heavy AG coatings. This provides anti-glare properties but does not result in the grainy appearance of images. It is not a fully glossy solution though.

AG coating has long been the choice for nearly all professional-grade displays as well, helping to avoid issues with reflections and external light sources which are vital for colour critical work. In addition it should be noted that AG coating is less susceptible to dust, grease and dirt marks which can become an issue on reflective glossy coating alternatives.

http://www.tftcentral.co.uk/articles/panel_coating.htm



So the only issue is the bezel and stand. But honestly, as long as the panel itself is matte, I personally don't think it's that big a deal. Of course, if another brand came up with similar specs and performed just as well, but without any of that glossy nonsense (when will they learn?), then yeah, I would recommend that instead.

But till then, this is the way to go for 2016.


PS:
According to online sources, this particular model also supports internal programmable 14-bit 3D lookup tables (LUTs) for calibration. So you can calibrate your settings directly into the monitor, and even video sources, like when watching through MPC, would benefit from calibrating it.

The only colour con I can think of for this monitor is that it uses the cheaper W-LED rather than GB-r LED. Still, considering everything else, minus the glossy bezel/stand, I think it's way better than a TN panel, and it's got a 14-bit internal LUT, which to me is important as a bare minimum :} (*need a calibrator, though)
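For a sense of what a 3D LUT actually does: the monitor stores a grid of corrected RGB values and maps every incoming colour through it. A minimal Python sketch of the idea, using an identity table and nearest-neighbour lookup for brevity (real hardware stores calibrated values and interpolates between grid points; the 17-point grid here is just a typical illustrative size):

CODE
import numpy as np

size = 17                                  # illustrative 3D LUT grid size
grid = np.linspace(0.0, 1.0, size)
# Identity LUT: lut[r, g, b] == (grid[r], grid[g], grid[b]).
# A calibrated LUT would hold corrected output values instead.
lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)

def apply_lut(rgb):
    # Snap the input colour to the nearest grid point and look it up.
    idx = np.clip(np.rint(np.asarray(rgb) * (size - 1)).astype(int), 0, size - 1)
    return lut[idx[0], idx[1], idx[2]]

print(apply_lut([0.5, 0.25, 1.0]))  # identity table returns (roughly) the input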





This post has been edited by Moogle Stiltzkin: Jul 29 2015, 02:12 AM
Maxieos
post Jul 29 2015, 02:39 AM

Look at all my stars!!
*******
Senior Member
3,754 posts

Joined: May 2008
May I know what's the difference between upgrading the driver by going to the NVIDIA website and downloading it, compared to going to Windows Update and updating the Intel and NVIDIA drivers?
Moogle Stiltzkin
post Jul 29 2015, 09:17 AM

Look at all my stars!!
*******
Senior Member
4,454 posts

Joined: Jan 2003
QUOTE(Maxieos @ Jul 29 2015, 02:39 AM)
May I know what's the difference between upgrading the driver by going to the NVIDIA website and downloading it, compared to going to Windows Update and updating the Intel and NVIDIA drivers?
*
I use this app (Display Driver Uninstaller, DDU) to uninstall drivers:
http://www.guru3d.com/files-details/displa...r-download.html

But before that, I've already downloaded the drivers from:
http://www.geforce.com/drivers

In the DDU process it reboots you into safe mode, then proceeds to uninstall the driver, then reboots back into Windows.

Now you can safely install the drivers downloaded directly from NVIDIA.
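If you want to confirm which driver version actually ended up installed (Windows Update's or the GeForce one), nvidia-smi ships with the driver and can report it. A small Python sketch, assuming nvidia-smi is on the PATH:

CODE
import subprocess

# Ask the driver's own nvidia-smi tool for the installed version.
version = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
    text=True,
).strip()
print("Installed NVIDIA driver:", version)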


