
 NVIDIA GeForce Community V15 (new era pascal), ALL HAIL NEW PASCAL KING GTX1080 out now

Moogle Stiltzkin
post Jul 11 2015, 11:57 AM

QUOTE(queenc @ Jul 10 2015, 07:37 AM)
you guys normally set auto update driver using geforce experience or manually update it?
I use DDU from Guru3D to do a clean install every time there's a driver update.


QUOTE(shikimori @ Jul 9 2015, 06:53 AM)
why not go for lower card 970 or 980 its not that bad if gaming for 1080p

980ti kinda overkill lol at least for now . Unless you are talking about 1440p (just nice) or 4k gaming (need to lower some settings)
Nah, I don't agree that a 980 Ti is overkill for 1080p.


Review of the 980 Ti:
http://www.trustedreviews.com/nvidia-gefor...-results-page-2


Most of the review benchmarks show that at 1440p on ultra settings it can, for the most part, hit the sweet spot of 60+ fps on average.

But if you're planning to use a 144Hz G-Sync monitor, you can go above the 60fps cap that vsync imposes, because G-Sync lets you take advantage of the higher frame rates.

Only on an older monitor without G-Sync would you care solely about the GPU averaging 60fps, because that's where you're locked if you're using triple-buffered vsync.

So I'd say a 980 Ti would be great even at 1080p with G-Sync, ideally on the Acer Predator 144Hz G-Sync IPS monitor :}
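Quick back-of-the-envelope sketch of the frame-time math behind that (just my own arithmetic; the 30-144Hz variable-refresh window is an assumption for that panel):

CODE
# Rough frame-time arithmetic behind the vsync-vs-G-Sync point above.
# The 30-144 Hz variable-refresh window is an assumption for this monitor.

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent per frame at a given frame rate."""
    return 1000.0 / fps

for fps in (60, 100, 144):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")

# With triple-buffered vsync on a 60Hz panel, anything rendered above 60fps
# is never displayed; with a 144Hz G-Sync panel the refresh follows the GPU
# anywhere inside the VRR window.
VRR_WINDOW = (30, 144)  # assumed min/max refresh
gpu_fps = 95
shown = min(max(gpu_fps, VRR_WINDOW[0]), VRR_WINDOW[1])
print(f"GPU at {gpu_fps} fps -> shown at {shown} Hz with G-Sync, capped at 60 with vsync")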

These days a 27'' 1440p monitor can quite comfortably be driven by a single GPU, especially the 980 Ti.

At 4K it seems you need to lower your graphics settings, especially AA, which is less necessary at that resolution anyway.

The HardOCP review pointed out what kind of settings tweaks were needed to get playable performance at the highest quality possible:
http://www.hardocp.com/article/2015/06/15/...rd_gpu_review/7




Anyway, there is actually a thread discussing exactly this issue, a 1080p 60Hz (vsync) monitor vs 144Hz (G-Sync):
https://www.reddit.com/r/buildapc/comments/...last_for_1080p/


PS: another reason I think a 980 Ti is worthwhile even for 1080p: back when I bought the GTX 680, it barely lasted 2-3 years before I could no longer run some games at ultra settings cry.gif So I'd rather reduce the chances of that happening again by future-proofing where possible, since I only upgrade my GPU every 5 years or so.


This post has been edited by Moogle Stiltzkin: Jul 11 2015, 12:27 PM
Moogle Stiltzkin
post Jul 11 2015, 06:31 PM

QUOTE(shikimori @ Jul 11 2015, 04:08 PM)
well if you are talking about 1080p at 144hz monitor I suppose its fine but who wants to be stuck at that resolution ?
once you got a taste of  WQHD or UHD its hard going back

Also , I have to disagree on future proofing graphic card .  With the likes of NVIDIA 780 being outperform or same by 970 , the gap for new card would be higher .  Not to mention the introduction of HBM ram I think I probably have to change cards every year  sad.gif

Funny thing is for processor I'm still stuck with sandybridge cant see any reason to upgrade apart from  power efficiency  , ddr4
Too many people to quote, lel.

The 970 might be fine for 1080p at less than half the price, which is substantial for most people, especially regular gamers (as opposed to true high-end enthusiasts).

But I think the 970s effectively have only 3.5GB of fast vram hmm.gif

Some games at ultra use almost 6GB of vram, like Mordor. For 1440p the recommended minimum is 4GB, with 6GB being ideal, and 4K needs a minimum of 6GB. 2GB of vram is hardly enough these days; I know that for a fact because I have a GTX 680 cry.gif Every now and then I'd get an error saying it ran out of vram (no joke). It hasn't happened lately, maybe because of newer GPU drivers hmm.gif ?
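Rough rule-of-thumb check using those numbers (the thresholds are just the figures being thrown around in this thread, not hard requirements):

CODE
# Rule-of-thumb vram check; thresholds are the forum numbers quoted above.
recommended_vram_gb = {"1440p (min)": 4, "1440p (ideal)": 6, "4K (min)": 6}
cards_gb = {"GTX 680": 2, "GTX 970 (fast segment)": 3.5, "GTX 980 Ti": 6}

for card, vram in cards_gb.items():
    fits = [res for res, need in recommended_vram_gb.items() if vram >= need]
    print(f"{card} ({vram} GB): meets {', '.join(fits) if fits else 'none of these'}")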


Anyway, here is a direct comparison of the 980 Ti vs the 970:
http://gpuboss.com/gpus/GeForce-GTX-980-Ti...GeForce-GTX-970



Here is a Dragon Age: Inquisition benchmark at 1080p and 1440p using different settings, from low through medium, high and ultra:
http://www.guru3d.com/articles_pages/drago...k_review,7.html


Notice that at 1080p ultra the 970 gets about 50fps.


At 1440p ultra it was only 33fps:
http://www.guru3d.com/articles-pages/drago...k-review,8.html





Bottom line


970

Pros
---------
- less than half the price of a 980 Ti
- sufficient for 1080p even at ultra in a heavy title like DAI



Cons
-------
- 3.5GB of full-speed vram instead of a true 4GB. How this may affect gaming:
http://www.extremetech.com/extreme/198223-...emory-problem/2
- Ultra settings on intensive games like DAI only reach 40-50fps, whereas a 980 Ti would hit 60fps and probably higher.
- If you've invested in a 144Hz G-Sync monitor, you'd definitely want a higher-end card like the 980 Ti so G-Sync can actually show off the higher fps :}




Moogle Stiltzkin
post Jul 11 2015, 06:51 PM

QUOTE(shikimori @ Jul 11 2015, 06:36 PM)
sad.gif gsync ips 1440p monitor finally in malaysia

27" XB270HU-IPS RM2599 at Idealtech .... Sigh, if only IPMART would refund my money as soon as possible would grab this monitor without any hesitation
RM2599 is not bad for a 27'' IPS with G-Sync... and if that's not enough, it's 144Hz!!! Oh, and it has stellar reviews, especially for its low latency, low input lag, and overdrive done correctly without issues (on the Normal setting):
http://www.tftcentral.co.uk/reviews/acer_xb270hu.htm






I paid almost RM3k for my U2413, a 24'' AH-IPS GB-LED 60Hz LCD (no G-Sync).


Sure, the Acer uses a white LED backlight, but honestly I'd trade my GB-r LED backlight for the Acer because of G-Sync and its otherwise stellar gaming performance. My minimum requirement is that it has to be IPS, because I can't stand TN panels.

An ASUS ROG TN panel can achieve a 1ms response time; the Acer IPS was measured at 5.5ms G2G, which is a stellar result for an IPS panel.

This post has been edited by Moogle Stiltzkin: Jul 11 2015, 06:53 PM
Moogle Stiltzkin
post Jul 13 2015, 10:48 AM

QUOTE
But I'm going to be frank with everyone here, the 3.5GB vram thing? It's not an issue. Honestly, it's been blown out of proportion. Stop looking at just numbers and charts, they cannot tell the whole story.


Well, that's why I read the reviews, and they point out that games like Mordor at max ultra settings will indeed use up a lot of vram, as high as nearly 6GB.

Yes, most games aren't that aggressive, but it does happen, most noticeably at higher resolutions or in newer games like that on ultra settings.

Sure, if people don't mind lowering settings below ultra then it shouldn't be an issue, but I don't fall into the category of users who are happy playing at anything other than ultra doh.gif

But dismissing vram capacity as a non-issue altogether, when it clearly affects some titles, is the wrong way to go about it :/ Some of us would rather get a card with 6GB of vram or more.


As an example, try convincing people to get a 4GB Fiji over, say, a 980 Ti with 6GB of vram, especially at the same price point. That's a tough sell rolleyes.gif

Not only does the 980 Ti get slightly better fps in most titles, it also has more vram, and according to reviewers the capacity does matter. It's not me saying this, it's professional reviewers; go look :}


QUOTE
Concluding

Our product reviews in the past few months and its conclusion are not any different opposed to everything that has happened in the past few days, the product still performans similar to what we have shown you as hey .. it is in fact the same product. The clusterfuck that Nvidia dropped here is simple, they have not informed the media or their customers about the memory partitioning and the challenges they face. Overall you will have a hard time pushing any card over 3.5 GB of graphics memory usage with any game unless you do some freaky stuff. The ones that do pass 3.5 GB mostly are poor console ports or situations where you game in Ultra HD or DSR Ultra HD rendering. In that situation I cannot guarantee that your overall experience will be trouble free, however we have a hard time detecting and replicating the stuttering issues some people have mentioned.


The Bottom line

Utilizing graphics memory after 3.5 GB can result into performance issues as the card needs to manage some really weird stuff in memory, it's nearly load-balancing. But fact remains it seems to be handling that well, it’s hard to detect and replicate oddities. If you unequivocally refuse to accept the situation at hand, you really should return your card and pick a Radeon R9 290X or GeForce GTX 980. However, if you decide to upgrade to a GTX 980, you will be spending more money and thus rewarding Nvidia for it. Until further notice our recommendation on the GeForce GTX 970 stands as it was, for the money it is an excellent performer. But it should have been called a 3.5 GB card with a 512MB L3 GDDR5 cache buffer.



Sources:
http://www.guru3d.com/news-story/middle-ea...tress-test.html

http://wccftech.com/shadow-of-mordor-ultra...xture-6gb-vram/

http://www.overclock3d.net/articles/gpu_di...s_6gb_of_vram/1


PS: that said, I seriously doubt Blizzard games will ever push the GPU much rolleyes.gif I'm not holding my breath for Legacy of the Void; I suspect it's just going to be as GPU-intensive as the old SC2. It's only games like Mordor and Crysis that push the limits.


This post has been edited by Moogle Stiltzkin: Jul 13 2015, 11:26 AM
Moogle Stiltzkin
post Jul 14 2015, 05:40 AM

ooo...

http://www.tweaktown.com/news/46420/amd-pr...idia/index.html


AMD's gonna squeeze Nvidia's memory supply yawn.gif Hope the prices don't rise...
Moogle Stiltzkin
post Jul 14 2015, 09:21 AM

QUOTE(TheHitman47 @ Jul 14 2015, 05:48 AM)
"priority access", looks like nvidia getting a taste of your own medicine.  tongue.gif
Well, that's what happens when you place the wrong bet on HMC :/

We thought the Fiji stock shortages were bad. Now there are rumors that HBM2 will also be in short supply, with AMD getting priority. That's gonna suck for Nvidia doh.gif

I'm wondering though, should I get a GP100 or a GP104? Not really sure what the difference is, other than that the former will supposedly come first. Will they follow the same strategy of the higher-end card coming out first? Because I'd rather get that card if that's the case doh.gif

Is my Intel i7-3770 Ivy Bridge going to be enough :/ ? I don't really feel like upgrading that until Cannonlake. Also wondering how AMD's Zen will turn out, and whether it will rock the boat in the CPU market kek.
Moogle Stiltzkin
post Jul 15 2015, 09:12 PM

QUOTE(SSJBen @ Jul 15 2015, 05:45 PM)
Looks like MSI 980Ti Gaming is in the local market now. RM3339 RSP. lol.

On the other hand, the EVGA 980Ti Classy was sold out within 2 hours on Newegg. doh.gif
Didn't even have a chance to add into cart.
Does that come with a water cooling block? hmm.gif

nm found it

QUOTE
The EVGA GeForce GTX 980 Ti Hydro Copper is designed for watercooling enthusiasts. The Hydro Copper waterblock is a full-cover block that spans the entire length of the graphics card, complete with a swappable inlet/outlet for compatibility with custom watercooling solutions.

It is also factory overclocked, with a 1140MHz base speed and 1228MHz boost speed.

http://www.evga.com/Products/Product.aspx?pn=06G-P4-4999-KR

799.99 USD = 3,043.20 MYR


Even with GST that's only RM3,225.79, so why is the MSI more expensive hmm.gif Shipping costs?
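Here's the quick conversion math (the exchange rate is just the ~3.80 USD-to-MYR rate implied by that figure, and 6% is Malaysia's GST; both are my assumptions, not retailer numbers):

CODE
# Back-of-envelope price check. Rate and GST are assumptions: the ~3.80
# USD->MYR rate implied by the 3,043.20 figure above, and Malaysia's 6% GST.
USD_TO_MYR = 3043.20 / 799.99   # ~3.804
GST = 0.06

usd_price = 799.99
myr_price = usd_price * USD_TO_MYR
myr_with_gst = myr_price * (1 + GST)

print(f"RM {myr_price:,.2f} before GST")      # ~RM 3,043.20
print(f"RM {myr_with_gst:,.2f} with 6% GST")  # ~RM 3,225.79
print(f"Gap vs MSI RM3,339 RSP: RM {3339 - myr_with_gst:,.2f}")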

This post has been edited by Moogle Stiltzkin: Jul 15 2015, 09:35 PM
Moogle Stiltzkin
post Jul 15 2015, 10:37 PM

QUOTE(SSJBen @ Jul 15 2015, 10:30 PM)
Retailers don't need to earn meh? tongue.gif
No, cause I want free stuff *snicker snicker* rolleyes.gif



QUOTE(llk @ Jul 15 2015, 10:10 PM)
Basically EVGA Hydro is a reference card with EK waterblock whereby MSI Gaming is fully custom non reference card
Ah I see, ty for the clarification notworthy.gif

This post has been edited by Moogle Stiltzkin: Jul 15 2015, 10:39 PM
Moogle Stiltzkin
post Jul 26 2015, 01:48 PM

QUOTE
If these alleged rumors hold true, Pascal is looking like it's going to be a massive leap in performance compared to Maxwell. If the full-fat GP100 chip is going to have 17 billion transistors, then that is over double the amount compared to GM200 and over 3 times the amount compared to GM204. Also, 32 GB VRAM!? I can't wait to see what kind of setup you're going to need to even come close to using up that much frame buffer. Also, Pascal will be utilizing TSMC's all-new 16nm FinFET+ technology, allowing for 65% higher speed, 2 times the density and 70% less power than 28HPM, which is Maxwell's production process. So expect to see even lower power usage, ultra low temperatures and cards finally breaking the 2GHz barrier on air. Personally I was on the fence about upgrading to two 980 Tis from my 980s, but I think I'm just going to sit and wait a bit longer!
What do you guys think?


http://www.fudzilla.com/news/graphics/3830...ion-transistors

http://wccftech.com/nvidia-pascal-gpu-17-b...rrives-in-2016/
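Quick sanity check on the "over double / over 3 times" transistor claim (the 17B figure is the rumor above; the ~8.0B for GM200 and ~5.2B for GM204 are the commonly cited counts, so treat the ratios as approximate):

CODE
# Ratio check for the rumored Pascal transistor count vs Maxwell chips.
# GM200/GM204 counts are the commonly cited figures, used here as assumptions.
pascal_rumored = 17.0e9
maxwell = {"GM200 (Titan X / 980 Ti)": 8.0e9, "GM204 (GTX 980/970)": 5.2e9}

for chip, count in maxwell.items():
    print(f"{chip}: {pascal_rumored / count:.1f}x the transistors")
# -> ~2.1x GM200 and ~3.3x GM204, matching the "over double / over 3x" claim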



QUOTE
With Pascal GPU, NVIDIA will return to the HPC market with new Tesla products. Maxwell, although great in all regards was deprived of necessary FP64 hardware and focused only on FP32 performance. This meant that the chip was going to stay away from HPC markets while NVIDIA offered their year old Kepler based cards as the only Tesla based options. Pascal will not only improve FP64 performance but also feature mixed precision that allows NVIDIA cards to compute at 16-bit at double the accuracy of FP32. This means that the cards will enable three tiers of compute at FP16, FP32 and FP64. NVIDIA’s far future Volta GPU will further leverage the compute architecture as it is already planned to be part of the SUMMIT and Sierra super computers that feature over 150 PetaFlops of compute performance and launch in 2017 which indicates the launch of Volta just a year after Pascal for the HPC market.

http://wccftech.com/nvidia-volta-gpus-ibm-...supercomputers/



$_$; pascal is the one to wait for.

Though I'm wondering what exactly the launch date for Volta is. Last time they said a year after Pascal, then it was bumped to two years, and now they're saying 2017 again... lel...



So basically... Fiji mostly played catch-up; it didn't outright defeat the 980 Ti. They did bring in HBM (a necessary step at some point to keep up with the performance curve), but that was about it. They were also capped at 4GB of vram due to HBM1 technology constraints, less than the 6GB on a 980 Ti. Even worse, they matched the price/performance exactly to the 980 Ti, which I felt was a poor decision, because it made it more likely people would just buy a 980 Ti instead.

If the rumors are true, Pascal is not only adding HBM, it's also going to bring a huge jump in performance. And if that weren't enough, we thought it would be 8GB of vram, but now it could possibly be up to 16-32GB...

Even I think 16GB+ seems a bit unnecessary at this point; I'm just fine with 8GB for future-proofing. Or do 4K ultra textures really need that much? It will be interesting to see when those reviews come out smile.gif


Now the question is price... hmm.gif



QUOTE(goldfries @ Jul 25 2015, 12:21 AM)
I tested a FreeSync monitor, and I couldn't produce screen tearing. tongue.gif
Did you test for any issues when the fps dips below the minimum of the VRR range? How does that affect gaming, and how noticeable is it hmm.gif I'm interested to know.


This post has been edited by Moogle Stiltzkin: Jul 26 2015, 01:56 PM
Moogle Stiltzkin
post Jul 26 2015, 02:58 PM

QUOTE(Minecrafter @ Jul 26 2015, 02:35 PM)
Hori crap. shocking.gif I'm sure skylinelover will like this. tongue.gif  biggrin.gif

Well, the card might run out of processing power even before reaching its max vRAM capacity, but since they "add a huge jump in performance", you'll never know. hmm.gif
I'm very skeptical it will be 32GB... maybe on a Titan X-tier product, perhaps. If I had to guess it would be between 8-16GB, and even 16GB is crazy. 8GB is pretty sweet doh.gif

Double the transistor count (compared to a Titan X or a Fiji)... drool.gif

Hopefully the price isn't above the launch price of a 980 Ti :/ Can only hope.
Moogle Stiltzkin
post Jul 26 2015, 03:20 PM

QUOTE(SSJBen @ Jul 26 2015, 03:17 PM)
32GB is for their Quadro cards.
Okay, that makes more sense.



QUOTE
8GB is pretty much a given for the next x70/x80 cards, whatever Nvidia is going to call them (1080 would be hilarious).
Question now is, if Hynix has enough stock left for Nvidia or not since AMD will get the major bulk of HBM2 in Q1 2016.
It will be interesting to see how things pan out here. Worst-case scenario: limited stock = higher prices = I'm screwed rclxub.gif


QUOTE
Obviously things can change very quickly, it's business after all.
Actually there was some chatter on that very subject, like Nvidia having to come out with something good rather than resting on its laurels, because AMD could follow up Fiji straight after. I think there was mention they were aiming for this, though I'm not fully sure.

QUOTE
Except that HBM2 is a drop-in replacement for HBM1 on AMD's Fury parts, so even before Greenland arrives, AMD could get a Fury-X revision to market with 8, or more gigs of HBM memory. The interposer work is done for AMD and its the bottom chip in the HBM stack that hosts the control logic for the HBM die stacks above, and the interposer memory traces for HBM2 are not increasing. A Fury-XA revision may be available just as the HBM2 memory stacks arrive hot off that final assembly line. That and some Tweaks could put Fury over the Ti's performance metrics, and maybe with process refinements and some more overclocking on any Fury-X revisions. AMD has exclusivity on that lines process, and its HBM, Nvidia has got to take the available to all standards and make a line of its own, outside of AMD's line that has AMD's name on it!
http://www.pcper.com/news/Graphics-Cards/R...B-HBM2#comments



I think the problem with Fiji was that HBM by itself wasn't going to be a game changer where it mattered most; the benefits weren't really visible to the user (e.g. no huge leap in fps).

If the Pascal rumors are true, then they will have not only introduced HBM but also added other things that increase performance significantly enough to have the wow factor. Just pray it's true smile.gif


So is the card size still on track to be similar to Fiji's?


This post has been edited by Moogle Stiltzkin: Jul 26 2015, 03:30 PM
Moogle Stiltzkin
post Jul 26 2015, 08:10 PM

QUOTE(skylinelover @ Jul 26 2015, 07:46 PM)
that is the end of the days of using long ass big ass card laugh.gif

guess i selling off my huge casing 4 the new drawer casing then doh.gif
There will definitely be more room for the cabling :}

Since I use a fancy water cooling radiator, I have to stick with my full-ATX aluminium case :}

But it will be interesting to see other people's rigs and how small a case can now fit the new card smile.gif
Moogle Stiltzkin
post Jul 26 2015, 09:03 PM

Noticed that Pascal has mixed precision FP16/32/64.


So I was reading up on what exactly that has to do with gaming :/

QUOTE
You can deduce the difference between double precision floating point (FP64) and single precision floating point (FP32) from the name. FP64 results are significantly more precise than FP32. This added precision in the results is crucial for scientific research, professional applications and servers. And less so in video games. Even though FP64 is used in games in a very limited subset of functions, the bulk of video game and graphics code relies on FP32. As such this added precision in turn requires more capable hardware which would net higher costs by increasing the size of the chip while simultaneously increasing power consumption.


Bottom line:
QUOTE
So, since the GTX Titan Black has a peak of 5.1 TFLOPS single precision floating point performance, a 3:1 ratio means that double precision compute goes down to 1.7 TFLOPs. And with AMD’s Hawaii XT which has a peak of 5.6 TFLOPs of FP32 compute performance, a 2:1 ratio means that it will go down to a more respectable 2.8 TFLOPs of FP64 compute performance. This advantage in FP64 compute is why AMD succeeded in capturing the top spot in the Green500 list of the world’s most power efficient supercomputers with it’s Hawaii XT powered FirePro S9150 server graphics cards.

The FP32 to FP64 ratio in Nvidia’s GM204 and GM206 Maxwell GPUs, powering the GTX 980, 970 and 960 is 32:1. Which means the GPU will be 32 times slower when dealing with FP64 intensive operations compared to FP32. As we’ve discussed above this is mostly OK for video games but downright unacceptable for professional applications.

If Nvidia’s GM200 does end up with a similarly weak double precision compute capablity the card will have very limited uses in the professional market. However in theory the reduction of FP64 hardware resources on the chip should make it more power efficient in games and FP32 compute work. Even though I’m not entirely convinced that it’s a worthwhile trade off. Especially for a card that is poised to go into the next generation Qaudro flagship compute cards.


http://wccftech.com/nvidia-gm200-gpu-fp64-performance/
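The ratio arithmetic in that quote is easy to reproduce; here's a minimal sketch (the Titan Black and Hawaii XT FP32 peaks are the ones quoted above; the GTX 980 figure is my own rough assumption):

CODE
# Reproduces the FP32 -> FP64 throughput arithmetic from the quote above.
def fp64_tflops(fp32_tflops: float, fp32_to_fp64_ratio: int) -> float:
    """FP64 throughput given peak FP32 and the FP32:FP64 ratio."""
    return fp32_tflops / fp32_to_fp64_ratio

cards = {
    "GTX Titan Black (3:1)":  (5.1, 3),    # quoted above -> ~1.7 TFLOPS FP64
    "Hawaii XT (2:1)":        (5.6, 2),    # quoted above -> ~2.8 TFLOPS FP64
    "GM204 / GTX 980 (32:1)": (5.0, 32),   # FP32 peak here is an assumption
}

for name, (fp32, ratio) in cards.items():
    print(f"{name}: {fp64_tflops(fp32, ratio):.2f} TFLOPS FP64")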



Anyway, Pascal will have improved FP64 as well as mixed precision:
QUOTE
With Pascal GPU, NVIDIA will return to the HPC market with new Tesla products. Maxwell, although great in all regards was deprived of necessary FP64 hardware and focused only on FP32 performance. This meant that the chip was going to stay away from HPC markets while NVIDIA offered their year old Kepler based cards as the only Tesla based options. Pascal will not only improve FP64 performance but also feature mixed precision that allows NVIDIA cards to compute at 16-bit at double the accuracy of FP32. This means that the cards will enable three tiers of compute at FP16, FP32 and FP64. NVIDIA’s far future Volta GPU will further leverage the compute architecture as it is already planned to be part of the SUMMIT and Sierra super computers that feature over 150 PetaFlops of compute performance and launch in 2017 which indicates the launch of Volta just a year after Pascal for the HPC market.

http://wccftech.com/nvidia-pascal-gpu-17-b...rrives-in-2016/

This post has been edited by Moogle Stiltzkin: Jul 26 2015, 09:10 PM
Moogle Stiltzkin
post Jul 27 2015, 06:05 AM

QUOTE(SSJBen @ Jul 26 2015, 09:14 PM)
One thing bro, don't quote wccftech too much. Their articles are all opinions (yet they think it's facts).
Will the new Doom game be out by the time Pascal arrives hmm.gif ?

Cause that's the game I want to be pimping out the eye candy on with Pascal drool.gif

There's also Star Citizen.
Moogle Stiltzkin
post Jul 27 2015, 03:05 PM

QUOTE(SSJBen @ Jul 27 2015, 02:33 PM)
I believe it  would be a late Q2 or mid-Q3 2016 release, just my estimated guess following Bethesda's fiscal release. They have their Q1 covered with Fallout 4 already.

If all goes well and Nvidia stays on track, Pascal should come out by Q3 2016. There are rumors circulating that Nvidia will release big Pascal first, as opposed to what they did with Kepler and Maxwell. Just a rumor though, depends on the market as always.
What about this doubling of transistors? I heard that most of it is going toward HPC compute rather than gaming performance, so the performance estimate for gaming was somewhere between 50-60% over a Titan X. Any ideas :/ ?



QUOTE(SSJBen @ Jul 27 2015, 02:33 PM)
If all goes well and Nvidia stays on track, Pascal should come out by Q3 2016. There are rumors circulating that Nvidia will release big Pascal first, as opposed to what they did with Kepler and Maxwell. Just a rumor though, depends on the market as always.
Do you mean their high-end model will come out first? That suits me fine, but I'd rather avoid a Titan X-class model and opt for a 980 Ti equivalent :/ I'd rather save money when possible xd.

This post has been edited by Moogle Stiltzkin: Jul 27 2015, 03:09 PM
Moogle Stiltzkin
post Jul 27 2015, 04:01 PM

QUOTE(SSJBen @ Jul 27 2015, 03:23 PM)
Yup.
And remember all the claims they made about Kepler before launch... lol, many of which were untrue other than the "state-the-obvious" remarks. doh.gif
It makes sense, 50-60% is quite similar to the jump from Fermi to Kepler and from Kepler to Maxwell. HBM, while interesting, I don't think we will see most of its potential until 2017, when DX12 and Vulkan are much more mature. NVLink apparently will be focused on supercomputers only, not sure if it will make it to the consumer grade cards or not? There's no confirmation on this.

Yeah, there is a rumor circulating around that PK100 will make the scenes first, instead of PK104. I honestly... doubt it? laugh.gif
Ooo, I'll google that then.

Then there's this NVLink? Sounds like not only do I buy the Pascal GPU, but I'd also need a new motherboard with NVLink as well?

They say it would basically look like the diagrams in the AnandTech article linked below.

QUOTE
Coming to the final pillar then, we have a brand new feature being introduced for Pascal: NVLink. NVLink, in a nutshell, is NVIDIA’s effort to supplant PCI-Express with a faster interconnect bus. From the perspective of NVIDIA, who is looking at what it would take to allow compute workloads to better scale across multiple GPUs, the 16GB/sec made available by PCI-Express 3.0 is hardly adequate. Especially when compared to the 250GB/sec+ of memory bandwidth available within a single card. PCIe 4.0 in turn will eventually bring higher bandwidth yet, but this still is not enough. As such NVIDIA is pursuing their own bus to achieve the kind of bandwidth they desire.

The end result is a bus that looks a whole heck of a lot like PCIe, and is even programmed like PCIe, but operates with tighter requirements and a true point-to-point design. NVLink uses differential signaling (like PCIe), with the smallest unit of connectivity being a “block.” A block contains 8 lanes, each rated for 20Gbps, for a combined bandwidth of 20GB/sec. In terms of transfers per second this puts NVLink at roughly 20 gigatransfers/second, as compared to an already staggering 8GT/sec for PCIe 3.0, indicating at just how high a frequency this bus is planned to run at.
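Quick sanity check on the lane arithmetic in that quote, with PCIe 3.0 x16 alongside for comparison (the ~8Gbps effective per PCIe lane is the usual approximation, so these are rough numbers):

CODE
# NVLink "block" arithmetic from the quote above, vs PCIe 3.0 x16.
# The ~8 Gbps effective per PCIe 3.0 lane is an approximation.
def bandwidth_GBps(lanes: int, gbps_per_lane: float) -> float:
    """Aggregate one-direction bandwidth in GB/s for a group of lanes."""
    return lanes * gbps_per_lane / 8  # 8 bits per byte

nvlink_block = bandwidth_GBps(lanes=8, gbps_per_lane=20)   # -> 20 GB/s
pcie3_x16    = bandwidth_GBps(lanes=16, gbps_per_lane=8)   # -> ~16 GB/s

print(f"NVLink block (8 x 20 Gbps):  {nvlink_block:.0f} GB/s")
print(f"PCIe 3.0 x16 (16 x ~8 Gbps): {pcie3_x16:.0f} GB/s")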



QUOTE
Multiple blocks in turn can be teamed together to provide additional bandwidth between two devices, or those blocks can be used to connect to additional devices, with the number of bricks depending on the SKU. The actual bus is purely point-to-point – no root complex has been discussed – so we’d be looking at processors directly wired to each other instead of going through a discrete PCIe switch or the root complex built into a CPU. This makes NVLink very similar to AMD’s Hypertransport, or Intel’s Quick Path Interconnect (QPI). This includes the NUMA aspects of not necessarily having every processor connected to every other processor.

But the rabbit hole goes deeper. To pull off the kind of transfer rates NVIDIA wants to accomplish, the traditional PCI/PCIe style edge connector is no good; if nothing else the lengths that can be supported by such a fast bus are too short. So NVLink will be ditching the slot in favor of what NVIDIA is labeling a mezzanine connector, the type of connector typically used to sandwich multiple PCBs together (think GTX 295). We haven’t seen the connector yet, but it goes without saying that this requires a major change in motherboard designs for the boards that will support NVLink. The upside of this however is that with this change and the use of a true point-to-point bus, what NVIDIA is proposing is for all practical purposes a socketed GPU, just with the memory and power delivery circuitry on the GPU instead of on the motherboard.



[Image: Molex's NeoScale, an example of a modern, high-bandwidth mezzanine connector]


QUOTE
NVIDIA is also touting that the new connector and bus will improve both energy efficiency and energy delivery. When it comes to energy efficiency NVIDIA is telling us that per byte, NVLink will be more efficient than PCIe – this being a legitimate concern when scaling up to many GPUs. At the same time the connector will be designed to provide far more than the 75W PCIe is spec’d for today, allowing the GPU to be directly powered via the connector, as opposed to requiring external PCIe power cables that clutter up designs.

With all of that said, while NVIDIA has grand plans for NVLink, it’s also clear that PCIe isn’t going to be completely replaced anytime soon on a large scale. NVIDIA will still support PCIe – in fact the blocks can talk PCIe or NVLink – and even in NVLink setups there are certain command and control communiques that must be sent through PCIe rather than NVLink. In other words, PCIe will still be supported across NVIDIA's product lines, with NVLink existing as a high performance alternative for the appropriate product lines. The best case scenario for NVLink right now is that it takes hold in servers, while workstations and consumers would continue to use PCIe as they do today.


Too much to quote; the rest is here:
http://www.anandtech.com/show/7900/nvidia-...ecture-for-2016



Anyway, it sounds like an NVLink mobo isn't a prerequisite to use Pascal; you can still use PCIe. But the question is, would using NVLink for a single GPU be worth it, or is it only going to help multi-GPU setups? I'm not a fan of multi-GPU because of driver support issues doh.gif so I'm just wondering whether upgrading to an NVLink mobo is worth it for a single-GPU setup. I'd rather wait for Cannonlake+ before I upgrade sweat.gif


This post has been edited by Moogle Stiltzkin: Jul 27 2015, 04:03 PM
Moogle Stiltzkin
post Jul 28 2015, 01:15 AM

QUOTE(cstkl1 @ Jul 27 2015, 10:27 PM)
haswell benefit

if u want more
faster encoding via the huge dmi bandwidth.
sata 3 native ports
usb 3 ports

skylake benefits
NVME ssd
native usb 3.1 ports ( not sure just assuming at this point)

skylake: still deciding should i go nvme. definitely no obvious gain other than watching ssd benchmarks.
zero benefit on gaming to be honest.

usb 3.1 seriously is there any device out there thats taking advantage of its direct access to dmi gen 1 speeds??

nvlink: it's not something you're gonna see being beneficial on daily commercial gaming setups.
it will however open up the possibility of an octa Titan Pascal. also with such tech i am pretty sure nvidia has solved some multi gpu scaling here.
also the interesting part about nvlink is.. the ability of a gpu to access the ram and vram of the other gpu directly, if i am not mistaken.
the tech is gonna solve current issues with multi gpu.
What is the performance improvement of Skylake over Ivy Bridge? hmm.gif Do I need to upgrade to Skylake to benefit from Pascal? If not for compute performance, then at least for NVLink; but would that make any difference for a single-GPU setup, or would PCIe 3.0 be enough?



QUOTE(shikimori @ Jul 28 2015, 12:05 AM)
go get yourself a 144hz IPS monitor . You wont regret it man , game that you play feels really buttery smooth without sacrificing colors and viewing angle . I have not experience g-sync or freesync but 144hz is really worth it  provided you have a decent gpu

Think of it like having the COD esque movement feeling  when playing non COD games even on RTS or games like Diablo 3 .
*
The thing is, there are a few technologies coming not long after. Things I can think of include quantum dot, which is added as a film layer on monitors for improved color, and who knows, maybe the LED backlight will also get an upgrade from the now-common W-LED to something like GB-r LED or better hmm.gif for even better colors. But if current IPS W-LED technology is more than sufficient, then yeah, the Acer Predator 144Hz G-Sync IPS monitor seems to be the best gaming monitor out atm. The only thing I can criticize is the build, which has a light-reflective plastic frame, but considering everything else passes scrutiny over at TFT Central, the few flaws seem worth glossing over smile.gif



QUOTE(skylinelover @ Jul 28 2015, 12:12 AM)
Cannonlake wait till hair drop wont come out so soon laugh.gif doh.gif pulled the trigger from lynfield 2 haswell last february with no regret haha

As 4 pascal, i still can hold out my kepler till Q3 next year if everything went according 2 plan. Being 30s means i no longer able 2 buy gpu every year unlike student days using sugar daddy money. Haha.

P/s : having 1500 per month used 2 be heaven like being student but now salary double that not enough 2 cover myself doh.gif probably need quad only enough and dont start talking about after having children with me shakehead.gif

That is unless i live malay way of 5 children with low grade milk powder hahahaha i know my 15 yrs longest serving malay supervisor ever with slightly higher salary than me already working in 2 years can support his 5 kids comfortably and maybe with some help from the G perhaps aiks
*
Well, Kepler came out in 2012 and Pascal is due in 2016, so a 4-year upgrade seems due for me. I could possibly delay another 1-2 years for Volta (though I'd rather not); besides, there just isn't enough info on what Volta brings to even warrant waiting. Not to mention that because Volta was delayed, I suspect Pascal will more or less be the base of what Volta was going to be, just rushed out first to cover the delay, so I'm betting the performance difference between Pascal and Volta won't be too huge. There's always going to be something better anyway, but I think Pascal will be powerful enough to satisfy my PC gaming requirements for a long time.

I think Pascal is likely to bring a big enough performance upgrade to warrant upgrading at this juncture. I'm not the type of tech enthusiast who upgrades every year; not rich enough for that rolleyes.gif

But I can tell you now that the Kepler GTX 680 is not enough for current games for me. I tried ultra settings in Dragon Age: Inquisition and it was totally unplayable; I had to lower the quality settings to medium/high sad.gif

Not playing at 144Hz with G-Sync (but instead at my paltry 60Hz with triple-buffered vsync) is one thing, but not being able to play at ultra settings at 1080p on a 24'' IPS LCD is a bit too much for me to ignore doh.gif

This post has been edited by Moogle Stiltzkin: Jul 28 2015, 01:18 AM
Moogle Stiltzkin
post Jul 28 2015, 10:26 AM

QUOTE(cstkl1 @ Jul 28 2015, 10:06 AM)
Moogle Stiltzkin
Well depending task but generally clock to clock

haswell DC has 10% single thread gain
Skylake has around 5 percent to haswell.

so 15.5% faster. Also comparing ivybridge to haswell is unfair as it depends on the task. Things like avx/avx2 encoding .. haswell is about 20-30 percent faster.

Skylake real benefit is the 20 pcie lane. So u can run sli with nvme. Also now mobile gpus can have either sli or native 16x single gpu with m.2 pcie ssd.

The cons is ddr4 is too immature. Any platform that has ddr3/ddr4 dimm slots.. don't expect too much from it. Same like p55.
DDR4 atm is too immature. Generally I'm hoping for 4800 CL18-21,

but expect rams to evolve slower this time since most ram in the world nowadays is either micron, samsung or sk hynix.
The latter two seem to be concentrating on die stacking.
I'm waiting for HMC, but there is hardly any news on when Intel will add support for it to their chipsets and motherboards. DDR4 seems like something that's gonna die real soon :/ especially now that stuff like HBM is coming out.

Though I doubt HBM will be used on motherboards, since it was designed for graphics use; HMC was designed for general computing. But from what I heard, HMC is not a JEDEC standard? Not sure why hmm.gif

So 15.5% more fps, just from the CPU? hmm.gif
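Just to show where that 15.5% comes from, here's the compounding of the per-generation uplifts quoted above (clock-for-clock CPU numbers only; actual fps gains are usually smaller since games are rarely fully CPU-bound):

CODE
# Compounds the quoted single-thread uplifts: Ivy Bridge -> Haswell ~10%,
# Haswell -> Skylake ~5%. Rough clock-for-clock arithmetic only.
gains = {"Ivy Bridge -> Haswell": 0.10, "Haswell -> Skylake": 0.05}

total = 1.0
for step, g in gains.items():
    total *= 1 + g
    print(f"{step}: +{g:.0%} (cumulative x{total:.3f})")

print(f"Ivy Bridge -> Skylake: ~{total - 1:.1%} faster clock-for-clock")  # ~15.5%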

I guess I'll have to test what fps I get on Pascal first. I'll be using it on a 24'' 1920x1200 monitor (slightly taller than 1080p); I think that's more than enough to hit my 60fps cap when using triple-buffered vsync hmm.gif


By the way, I preordered StarCraft II: Legacy of the Void. The graphics are not intensive at all... lel... even my Kepler can drive this game at ultra doh.gif Fuggin Blizzard...

Gotta put my hopes in the new Doom game to really push the GPU laugh.gif Though I'm not sure what the game will be like; I think Carmack left that company :[


As I suspected, AMD is gonna rush out their next generation of GPUs:
http://wccftech.com/amd-r9-400-series-gpus...arctic-islands/


But there's already a meme going around about how things will play out then (just swap the 970 for Pascal) laugh.gif


QUOTE
An update from sweclockers.com has revealed the code name of the next-next-gen AMD Radeon R9 400 Series GPUs. It goes without saying that the naming is pretty much irrelevant at such an early stage. The revealed codename for the series is Arctic Islands and the actual GPU name could be of any island present there. There is currently zero information regarding the Radeon R9 400 Series apart from the fact it will be based on a 20nm or lower node (most probably 16/14nm FinFET).


A smaller fabrication node is good, but I doubt it will be enough. Have they had time to develop a new architecture that is more power efficient and still has great performance? Because I haven't heard anything about that, unlike Volta and Pascal, where we at least heard long ago that they were working on a new architecture hmm.gif

This post has been edited by Moogle Stiltzkin: Jul 28 2015, 10:29 AM
Moogle Stiltzkin
post Jul 28 2015, 02:30 PM

QUOTE(takeshiru @ Jul 28 2015, 12:46 PM)
Hello guys..

Hope to get some opinion here..currently using strix980.. And a 1080 monitor..
Selling my 980 off and upgrading to 980ti? Or get the acer predator?

I'm leaning toward the monitor as my current one isn't great... If I get the 980 Ti, it could be a waste using it on a Dell non-G-Sync monitor, right?
I'd rather spend the money on the Acer Predator.

Going from a 980 to a 980 Ti isn't as good as a 980 + Acer Predator.

A 10-20% fps gain is nothing compared to a 144Hz refresh rate, G-Sync, 1440p resolution and a 27'' IPS panel!!!

Besides, if you've already got a 980, why get a 980 Ti when they just reported that Pascal is coming next year (2016) with double the transistor count and likely gains of 50-60% over a Titan X?

Anyway, get the Acer Predator, you won't regret it flex.gif

Moogle Stiltzkin
post Jul 29 2015, 02:00 AM

QUOTE(SSJBen @ Jul 28 2015, 11:25 PM)
Problem with Acer? Great panel, nice tech. But the cheapest ass looking glossy plastic and bezels as thick as 2008 monitors. Oh and don't get me started on the QC....  sweat.gif
Over RM2k+ for a monitor, it is only right to expect something much better. It barely costs half as much to produce the monitor.

This is why I want Dell and LG to come up with their Gsync monitors. LG already makes their own IPS panels anyway, their curved screens has been doing pretty decent considering how niche that market is. So what the hell is stopping them?
*
I thought the glossy part was only the frame, not the actual screen hmm.gif

If the screen is also glossy I might reconsider :/ cause that would be friggin annoying... (not a fan of glossy).


*Update

There, I double-checked:

QUOTE
Panel Coating
Light AG coating



QUOTE
Glossy black bezel and stand, some red trim on base



QUOTE
Anti-Glare Coating (AG)

The most common type of protective coating is ‘Anti-Glare’ (AG). This is often described as a matte coating as it is non-reflective to the user since it diffuses rather than reflects ambient light. It provides a method for manufacturers to avoid glare on the viewing surface from other light sources and has been used in the LCD monitor market for many years since the first TFT displays started to emerge.

This matte coating is included as an outer polarizing later which has been coarsened by mechanical or chemical processes. This achieves a surface coating which is not smooth and so can diffuse ambient light rather than reflect it. What is particularly important to understand is that this AG coating can be applied to panels with varying thicknesses, which has an impact on the anti-glare properties, but also on the underlying image of the screen. Where the coating is particularly thick and aggressive, the image from the screen can deteriorate as the light being emitted is also affected. This can have some impact on contrast and colour vibrancy and the perceived image can sometimes look dull as a result. Sharpness degradation can also occur in some extreme cases where AG coating is too thick. Users may also sometimes see the graininess of the coating, particularly when viewing white or light backgrounds. This can be particularly distracting for office work and images can look grainy or dirty if the coating is too aggressive. I would point out that not everyone would even notice this at all, and many users are perfectly happy with their screens even where aggressive AG is used. It’s just something to be wary of in case you have found problems with image quality in the past or are susceptible to it.

In other cases, AG coating is applied but it is light and far less obtrusive. The polarizer is less rough and has a lower haze value. Sometimes users refer to it as “semi-gloss” to distinguish the difference between these and the heavy AG coatings. This provides anti-glare properties but does not result in the grainy appearance of images. It is not a fully glossy solution though.

AG coating has long been the choice for nearly all professional-grade displays as well, helping to avoid issues with reflections and external light sources which are vital for colour critical work. In addition it should be noted that AG coating is less susceptible to dust, grease and dirt marks which can become an issue on reflective glossy coating alternatives.

http://www.tftcentral.co.uk/articles/panel_coating.htm



So the only issue is the bezel and stand. But honestly, as long as the panel itself is matte, I personally don't think it's that big a deal. Of course, if another brand came out with similar specs and performed just as well, but without any of that glossy nonsense (when will they learn doh.gif ), then yeah, I would recommend that instead.

But till then, this is the way to go for 2016 notworthy.gif


PS:
According to online sources this particular model also supports internal programmable 14-bit 3D lookup tables (LUTs) for calibration, so you can store calibration corrections directly in the monitor, and even video sources (e.g. watching movies in MPC) would benefit from calibrating it.

The only color con I can think of for this monitor is that it uses the cheaper W-LED backlight rather than GB-r LED. Still, considering everything else (minus the glossy bezel/stand), I think it's way better than a TN panel, and it's got a 14-bit internal LUT, which to me is the bare minimum to have :} *you need a calibrator though
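For anyone wondering what a 3D LUT actually is: the calibrator measures the panel and writes a table that maps every input RGB value to a corrected RGB value, which the monitor then applies in hardware. A toy sketch of the idea (a tiny identity table with nearest-neighbour lookup; a real 14-bit monitor LUT is far larger and interpolates):

CODE
# Toy 3D LUT: a cube of corrected RGB values indexed by the incoming RGB
# triplet. Real monitor LUTs are far larger and interpolate between nodes.
N = 5  # grid points per channel (tiny, for illustration only)

# Identity LUT: every grid point maps to itself. A calibrator would instead
# store measured corrections here.
lut = [[[(r / (N - 1), g / (N - 1), b / (N - 1))
         for b in range(N)] for g in range(N)] for r in range(N)]

def apply_lut(rgb, lut, n=N):
    """Map an input RGB triplet (0..1 floats) through the LUT (nearest node)."""
    idx = [min(n - 1, round(c * (n - 1))) for c in rgb]
    return lut[idx[0]][idx[1]][idx[2]]

print(apply_lut((0.9, 0.2, 0.5), lut))  # -> (1.0, 0.25, 0.5) on this 5^3 grid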





This post has been edited by Moogle Stiltzkin: Jul 29 2015, 02:12 AM
