 NVIDIA GeForce Community V15 (new era pascal), ALL HAIL NEW PASCAL KING GTX1080 out now

SSJBen
post Jul 25 2015, 08:33 PM

QUOTE(nettelim @ Jul 25 2015, 08:21 PM)
2x 970 is the best line-up at the moment.
Not considering the 980/980 Ti because it is not sensible price-wise. RM3.2k~3.7k for one card is too much.
By the way, I'm getting a USED card, and it is hard to find a 980 Ti on the used market.

So is it not a good time to get into 2K/4K now?
Should I get one 970, stick with 1080p, and wait for better tech support?
*
In Malaysia, no, it's not a good time. What you are paying now is severely overpriced for non-legit reasons.
In the States, 2x 970 costs literally the same as an aftermarket 980 Ti; over in Malaysia..? Lol, 2x aftermarket 970s barely cover a single reference 980 Ti.

Live in the US? It's a heck of a good time to jump onto a 980 Ti + 1440p + G-Sync setup.
SSJBen
post Jul 26 2015, 03:17 PM

32GB is for their Quadro cards.

8GB is pretty much a given for the next x70/x80 cards, whatever Nvidia is going to call them (1080 would be hilarious).
The question now is whether Hynix has enough stock left for Nvidia, since AMD will get the major bulk of HBM2 in Q1 2016.

Obviously things can change very quickly, it's business after all.
SSJBen
post Jul 26 2015, 09:14 PM

QUOTE(Moogle Stiltzkin @ Jul 26 2015, 09:03 PM)
Noticed that Pascal has mixed precision FP16/32/64,
so I was reading up on what exactly it has to do with gaming :/
Bottom line:
http://wccftech.com/nvidia-gm200-gpu-fp64-performance/
Anyway, Pascal will have improved FP64 as well as mixed precision.

http://wccftech.com/nvidia-pascal-gpu-17-b...rrives-in-2016/
*
One thing, bro: don't quote wccftech too much. Their articles are all opinion (yet they present it as fact).
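
For context on what FP16 vs FP32 precision actually means numerically, here's a tiny illustration in plain NumPy (my own example; it says nothing about how Pascal itself implements mixed precision):

CODE
import numpy as np

# FP16 keeps roughly 3 decimal digits of precision; FP32 keeps roughly 7.
# A small increment that FP32 can represent simply rounds away in FP16.
x32 = np.float32(1.0) + np.float32(1e-4)
x16 = np.float16(1.0) + np.float16(1e-4)

print(x32)  # 1.0001
print(x16)  # 1.0 -- the 1e-4 is lost at half precision

FP64 matters far more for compute workloads than for games, which overwhelmingly run on FP32 (and, where supported, FP16).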
SSJBen
post Jul 27 2015, 02:33 PM

QUOTE(Moogle Stiltzkin @ Jul 27 2015, 06:05 AM)
Will the new Doom game be out by the time Pascal arrives? hmm.gif

Because that's the game I want to be eye-candy pimping on with Pascal drool.gif

Also, there's Star Citizen as well.
*
I believe it would be a late Q2 or mid-Q3 2016 release; that's just my estimated guess based on Bethesda's fiscal release schedule. They have their Q1 covered with Fallout 4 already.

If all goes well and Nvidia stays on track, Pascal should come out by Q3 2016. There are rumors circulating that Nvidia will release big Pascal first, as opposed to what they did with Kepler and Maxwell. Just a rumor though; it depends on the market, as always.
SSJBen
post Jul 27 2015, 03:23 PM

QUOTE(terradrive @ Jul 27 2015, 09:28 AM)
Yes, there was one where they claimed the Fury Nano had been benchmarked, but the article inside shows the numbers were calculated  doh.gif
*
Yup.
And remember all the claims they made about Kepler before launch... lol, many of which were untrue, other than the "state-the-obvious" remarks. doh.gif


QUOTE(Moogle Stiltzkin @ Jul 27 2015, 03:05 PM)
What about this doubling of transistors? I heard most of it is going toward HPC compute rather than gaming performance, so their performance estimate for gaming was somewhere around 50-60% over a Titan X. Any ideas :/ ?
Do you mean their high-end model will come out first? That suits me fine, but I'd rather avoid a Titan X-class model and opt for a 980 Ti equivalent :/ I'd rather save money when possible xd.
*
It makes sense; 50-60% is quite similar to the jump from Fermi to Kepler and from Kepler to Maxwell. HBM, while interesting, probably won't show most of its potential until 2017, when DX12 and Vulkan are much more mature. NVLink apparently will be focused on supercomputers only; I'm not sure whether it will make it to consumer-grade cards or not, there's no confirmation on this.

Yeah, there is a rumor circulating that PK100 will make the scene first, instead of PK104. I honestly... doubt it? laugh.gif
SSJBen
post Jul 28 2015, 11:25 PM

QUOTE(Moogle Stiltzkin @ Jul 28 2015, 02:30 PM)
I'd rather spend money on the Acer Predator.

980 to 980 Ti isn't as good an upgrade as 980 + Acer Predator.

10-20% FPS gains are nothing compared to a 144Hz refresh rate, G-Sync, 1440p resolution and a 27'' IPS panel!!!

Besides, if you've already got a 980, why get a 980 Ti when they just reported that Pascal is coming next year (2016) with double the transistor count and most likely 50-60% gains over a Titan X?

Anyway, get the Acer Predator, you won't regret it  flex.gif
*
The problem with Acer? Great panel, nice tech. But the cheapest-looking glossy plastic, and bezels as thick as 2008-era monitors. Oh, and don't get me started on the QC... sweat.gif
At over RM2k for a monitor, it is only right to expect something much better; it barely costs half that much to produce.

This is why I want Dell and LG to come up with their own G-Sync monitors. LG already makes their own IPS panels anyway, and their curved screens have been doing pretty decently considering how niche that market is. So what the hell is stopping them?

SSJBen
post Jul 29 2015, 05:32 PM

QUOTE(Moogle Stiltzkin @ Jul 29 2015, 02:00 AM)
I thought the glossy part was only the frame, not the actual screen  hmm.gif

If the screen is also glossy I might reconsider :/ because that would be friggin' annoying... (not a fan of glossy).
*update

There, I double-checked:
http://www.tftcentral.co.uk/articles/panel_coating.htm
So the only issue is the bezel and stand. But honestly, as long as the panel itself is matte, I personally don't think it's that big a deal. Of course, if another brand came up with similar specs and performed just as well, but without any of that glossy nonsense (when will they learn  doh.gif ), then yeah, I would recommend that instead.

But till then, this is the way to go for 2016  notworthy.gif

PS:
According to online sources, this particular model also supports internal programmable 14-bit 3D lookup tables (LUTs) for calibration, so you can calibrate your settings directly into the monitor; even video sources, like when watching in MPC, would benefit from calibrating it.

The only color con I can think of for this monitor is that it uses the cheaper W-LED backlight rather than GB-r LED. Still, considering everything else minus the glossy bezel/stand, I think it's way better than a TN panel, and it's got a 14-bit internal LUT, which to me is important as a bare minimum :}  *you need a calibrator though
*
Oh yes, I do mean the bezels and the cheap stand that wobbles pretty badly. It may not seem like an issue, but the glossy bezels do cause reflections and, personally, I find that extremely annoying. I know, because I owned several Samsung and Viewsonic monitors back from 2009, when glossy was like icing on the cake and everyone had to make something glossy. doh.gif

I agree the panel is indeed very good; there really aren't many qualms about it. AUO has been doing a great job over the last few years. They've really stepped up their game.

But like I said, the issue lies with Acer's QC. You can read up on other forums like Reddit, NeoGAF, even OCN: there are numerous reports of terrible backlight bleed that develops over just a few weeks of usage. There are many stories about dead pixels too, which is pretty much unacceptable for a monitor at such a high price.

SSJBen
post Jul 29 2015, 11:16 PM

One other tip.

If you don't use Nvidia HD audio, untick it.
If you don't use GFE and Shadowplay, untick it.
If you don't use 3D, untick it.

Seriously, the only things that are absolutely needed are the display driver and PhysX; everything else is optional.
People keep running into errors because they just press Next blindly.
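
If you'd rather trim the extracted driver package itself before running setup, here is a rough sketch of the same idea in Python. The component folder names are assumptions based on how extracted NVIDIA packages are commonly laid out, so check your own package before deleting anything:

CODE
import shutil
from pathlib import Path

# Assumed folder names for the optional components; verify them against
# your own extracted driver package first.
OPTIONAL_COMPONENTS = [
    "GFExperience",      # GeForce Experience / ShadowPlay
    "HDAudio",           # Nvidia HD audio driver
    "NV3DVision",        # 3D Vision
    "Display.Optimus",   # laptop-only Optimus support
]

def slim_driver_package(extracted_dir: str) -> None:
    """Remove optional component folders, keeping the display driver and PhysX."""
    root = Path(extracted_dir)
    for name in OPTIONAL_COMPONENTS:
        target = root / name
        if target.is_dir():
            print(f"removing optional component: {target}")
            shutil.rmtree(target)

# Hypothetical path -- point it at wherever you extracted the package:
# slim_driver_package(r"C:\NVIDIA\355.60_extracted")

Either way, a custom install with only the display driver and PhysX ticked achieves the same thing.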
SSJBen
post Jul 30 2015, 02:11 AM

QUOTE(goldfries @ Jul 30 2015, 12:50 AM)
Physx is optional too. biggrin.gif
*
Well, theoretically yes... but it's better to have it than not for compatibility's sake, unlike the other stuff.
SSJBen
post Aug 5 2015, 03:55 PM

QUOTE(eatsleepnDIE @ Aug 5 2015, 03:23 PM)
Lol, I don't care about my CPU (well, actually I do, but less than my GPU) because I can buy an AIO for it and forget about it for at least a year...

But for my GPU... it is hard to keep it cool (it is cool and hot at the same time, if you catch my drift  tongue.gif ) because a full custom water-cooled setup requires maintenance, which I'm too lazy to do... so I guess the easiest path is an AIO with a GPU bracket like the Kraken G10, or an EVGA Hybrid solution.

All in all, I hope Malaysia gets snow soon so I can OC my GPU to crazy levels with the stock cooler lol.
*
The day Malaysia gets snow is when the world will perish... logically and scientifically speaking.

But on a more on-topic note, I don't know where the assumption that snow = super-low PC temps comes from. Okay, sure, it's definitely cooler than in hot countries, but the perception that living in a cold country means low PC temps is wrong. I've stayed in the US during winter several times. Guess what? We had to turn on the heaters.

The house is actually drier and hotter with the heaters on than in Malaysia, where you'd just have the fan on instead. Ironic? Yeah.

Don't turn on the heaters? Well, if you enjoy wearing 3 layers of clothes, then be my guest. biggrin.gif


Also, one of the main reasons for going water is looks. I don't know why people find it pleasing to have 2 AIOs in their system; it's so tacky and n00bish looking. I mean, I'm not trying to sound like a snob here, but it is what it is. Might as well just optimize for air cooling instead, because good air cooling is nearly as effective as AIOs (in some cases, even better), costs less and, guess what, it won't break (whereas AIOs will; just read the numerous stories about AIOs dying prematurely).
SSJBen
post Aug 5 2015, 06:03 PM

QUOTE(Moogle Stiltzkin @ Aug 5 2015, 04:09 PM)
Weird how the Nvidia GPU performs differently on Linux and Windows:
http://www.phoronix.com/scan.php?page=arti...n10-linux&num=2
*short story

Play games only on Windows tongue.gif lel
I'd spend my buck on the PC, GPU, HDDs, etc. etc.... water cooling really is the last resort, for when you run out of things to upgrade. That and the mkb  laugh.gif

I understand the temptation of the bling, but in all honesty the primary concern should be the temps. Just run a few temperature programs to track your CPU and GPU temps; if they hit 70-80+ you should really go water cooling for sure :]

Or even the noise issue. For me this was important :}
*
There are three main reasons for going water: temps (obviously), looks, and noise.

I personally don't like AIOs at all. They're inconsistent: they can be noisy, look terrible, or, just like I said... fail prematurely without you knowing.
That's why, if an AIO is treated as the last resort for lower temps, I don't agree at all. A case very well optimized for airflow is just as good as an AIO, if not better (the latter especially once you bring cost into the discussion).


QUOTE(eatsleepnDIE @ Aug 5 2015, 04:09 PM)
Sir, I am a NOOB and my chassis is an mITX type, so I don't care about the internal looks  tongue.gif

Errr... I thought it is normal to turn on the heater during winter? Because outside is cold and you want warmth inside the house. Well, yeah, wishing for snow to cool the rig was plain stupid, so I apologise for that lol.

I don't know about AIOs dying prematurely; it has never happened to me, thank god, but I still favour an AIO because of the simplicity and trouble-free setup (again, this may vary).

Again, this is my current opinion; maybe it will change later  tongue.gif
*
Keep those fingers crossed. I'm not trying to hate on AIOs, but when I open one up and see how low-quality the parts inside are, it makes me cringe when people go around saying AIOs are awesome. They don't even know what they paid for. An AIO isn't even leak-proof to begin with, yet it is marketed as if it were...?

Good that your AIO has not died. But keep an eye on it once you start hearing the pump squeal. I have seen several systems die because a failed AIO left them running at very high temps for a long period of time.
SSJBen
post Aug 6 2015, 02:10 PM

QUOTE(Moogle Stiltzkin @ Aug 5 2015, 07:07 PM)
Anyway, regarding waterblocks: apparently the EKWB block meant for the Titan X also works on the 980 Ti, so some waterblocks do work with other GPUs... but I wouldn't hold my breath that this will be the case with Pascal. The moment Pascal comes out, my 680 waterblock is as good as an expensive paperweight  doh.gif
*
Zero chance of any existing block working for Pascal. With the PCBs definitely getting smaller and the GPU + HBM package being bigger overall, there's just no way for any existing block to be compatible with Pascal.

Yeah, one problem with GPU waterblocks is that they aren't compatible forever. More so in Malaysia, where the WC community is quite small, so selling a block off is also quite difficult. The US and EU have a much larger market, where even blocks from the Fermi days can still be sold relatively easily.
SSJBen
post Aug 6 2015, 09:37 PM

QUOTE(skylinelover @ Aug 6 2015, 08:15 PM)
Haha. Devils canyon FTW. Skylake did not impress me. laugh.gif doh.gif
*
Actually... there's more to the story than just avg fps and 10% higher numbers.
In a nutshell (since this isn't an Intel thread): Skylake > Devil's Canyon/Haswell by a noticeable margin when it comes to min. fps and, most importantly, frame latency.

It's even more significant when comparing Skylake to SB/IVB. So... yeah, don't just look at the avg. fps and draw conclusions.


*PS
Where's the Intel thread anyway?


*EDIT
For reference only: [frame-time comparison charts, i5 6600K vs i7 4790K]
See how frame hitching on the i5 6600K is as good as on an i7 4790K at lower clock speeds? OC that thing mildly and wham, it's a clear, noticeable difference!

SSJBen
post Aug 7 2015, 06:05 PM

I believe Pascal to Volta will be like Kepler to Maxwell: more improvement in performance-per-watt rather than just a pure increase in rendering power.

But understand that in three years' time the landscape of game development will be different from how it is now, just as it was different three years ago (and the three years before that).

On a side note, Skylake is a sensible upgrade for SB users. I'm not saying SB isn't fast enough anymore, because it still is. But I do believe games over the next year or so will start to be CPU-bound for SB users; it's already starting, actually. It won't be a large number of games, but don't be surprised, is all I'm saying.

Once again I'd like to remind everyone that you HAVE TO look at frame latency when weighing the performance difference between GPUs and CPUs. Avg. FPS simply DOES NOT tell the whole story. An average of 60fps is not really 60fps if you get erratic drops to the low 40s for half a second every 3 or 4 minutes of game time. People seem to forget this point.
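
To put a number on that, here's a tiny illustration (the frame times are made up, not from any real benchmark run) of how the average hides hitching:

CODE
# Average FPS vs. the frame times the eye actually notices.
# 98.5% of frames at a smooth 16.7 ms, 1.5% hitching at 45 ms.
frame_times_ms = [16.7] * 985 + [45.0] * 15

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

# 99th-percentile frame time: the "1% worst" frames.
worst_1pct_ms = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]

print(f"average FPS        : {avg_fps:.1f}")          # ~58.4 -- looks like "60fps"
print(f"99th pct frame time: {worst_1pct_ms:.1f} ms")  # 45.0 ms -- visible stutter

Tools like FRAPS or PresentMon can log the per-frame times for you; the point is simply that the average alone looks fine while the hitches do not.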
SSJBen
post Aug 9 2015, 02:03 PM

QUOTE(skylinelover @ Aug 8 2015, 05:40 PM)
Lolz, since when did this become Skylake talk? laugh.gif doh.gif Anyway, my next upgrade cycle will be for 4K gaming. That means 2017 and beyond. laugh.gif rclxms.gif Monitor, GPU and CPU all in one loot. Hahahaha.
*
It'll be MYR5.00 to USD1 by that time... whistling.gif
SSJBen
post Aug 9 2015, 10:30 PM

QUOTE(Moogle Stiltzkin @ Aug 9 2015, 02:56 PM)
TweakGuides has updated their Nvidia graphics settings guide here:
http://www.tweakguides.com/NVFORCE_1.html
Very useful tips. smile.gif
By the way, just wondering: for AF, do you all set it globally? Because then it seems I'd have to manually disable AF in any game that has the option  hmm.gif  So isn't it simpler to leave it application-controlled globally, and only set it manually for games that don't have the option?
*
Not all games use AF, and that doesn't just apply to older games stuck with trilinear or bilinear filtering.

Recent games like The Witcher 3 did not use AF on release day; it was only quietly added to the config text file after patch 1.04. It wasn't even working until 1.05, and there is no way to change it manually (changing the values does nothing).

So yeah, I think it is best to set AF to its default Quality setting globally and then individually override it with High Quality on games that do have AF support.
SSJBen
post Aug 12 2015, 05:47 PM

QUOTE(skylinelover @ Aug 12 2015, 12:28 PM)
Woohoo, a perfect-CGPA 4 ringgit to 1 USD. At this rate, I think I'd rather jump to Volta in 2017. It will take some sort of miracle to ever get back to the days of 2.8 to a dollar. laugh.gif doh.gif
*
Lol, 2017? We should plan to seek refuge somewhere else by then if nothing changes over the next year.
SSJBen
post Aug 13 2015, 03:14 PM

The Witcher 3 had very, very few FMVs. The game is only 30GB+ including the latest patches, yet it is one of the biggest games in recent history. It also has some of the most terrible animation rigging ever seen in a AAA game. doh.gif

FMVs would have helped Witcher 3 immensely to be honest.

CDPR does not have Fox Engine, just sayin'.
SSJBen
post Aug 18 2015, 10:47 PM

QUOTE(Moogle Stiltzkin @ Aug 18 2015, 12:17 PM)
The first DX12 game, as far as I know:
Ashes of the Singularity
http://www.extremetech.com/gaming/212314-d...go-head-to-head
:/ Is the game any fun?
*
No, it's a boring as fudge RTS. It's also terribly unbalanced at the moment.
SSJBen
post Aug 21 2015, 12:16 AM

QUOTE(Desprado @ Aug 20 2015, 11:20 PM)
Damn, I bought an MSI GTX 980 again.

It has 80.4% ASIC quality and Elpida VRAM.

I am surprised that I am running this card at 1545MHz core and a 3900MHz memory clock (7800MHz effective).

I could push the VRAM even further, but it is Elpida, so I am scared to.
*
Anything over 7500 on Elpida is awesome; 7800 is icing on the cake already. Great that you got it that high.
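
For perspective, a quick back-of-the-envelope calculation of what that memory OC is worth in bandwidth, assuming the GTX 980's 256-bit bus:

CODE
# Memory bandwidth = effective transfer rate * bus width / 8 bits per byte.
bus_width_bits = 256  # GTX 980

stock_gbs = 7000e6 * bus_width_bits / 8 / 1e9   # 7000 MT/s stock -> 224.0 GB/s
oc_gbs    = 7800e6 * bus_width_bits / 8 / 1e9   # 7800 MT/s OC    -> 249.6 GB/s

print(f"stock {stock_gbs:.1f} GB/s, OC {oc_gbs:.1f} GB/s "
      f"(+{100 * (oc_gbs / stock_gbs - 1):.1f}%)")  # roughly +11%

So that overclock is worth roughly an extra 11% of memory bandwidth over stock.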
