 NVIDIA GeForce Community V15 (new era pascal), ALL HAIL NEW PASCAL KING GTX1080 out now

cstkl1
post Aug 28 2015, 12:54 PM


QUOTE(Moogle Stiltzkin @ Aug 28 2015, 12:35 PM)
From what I heard, DX12 makes multi-GPU setups much more compatible.

DirectX 12 Will Allegedly Allow Multi-GPU use Between Nvidia and AMD Cards

http://www.maximumpc.com/directx-12-will-a...-and-amd-2015/#!
If I'm not mistaken, this would also mean two GPUs with 4GB of VRAM each would effectively give 4+4 = 8GB of VRAM, rather than the 4GB you get before DX12.
*
I don't think that will work with Nvidia. Nvidia tends to do a lot in firmware and architecture, which is why their driver used to have lower CPU overhead.

That's why you see unified memory/virtual memory and the like on their roadmap.

The DX12 feature set was finalized very quickly, and I'm pretty sure it threw some wrenches into Nvidia's roadmap plans, since Nvidia tends to do R&D two generations ahead.

This post has been edited by cstkl1: Aug 28 2015, 12:56 PM
Moogle Stiltzkin
post Aug 28 2015, 01:02 PM

QUOTE(cstkl1 @ Aug 28 2015, 12:54 PM)
I don't think that will work with Nvidia. Nvidia tends to do a lot in firmware and architecture, which is why their driver used to have lower CPU overhead.

That's why you see unified memory/virtual memory and the like on their roadmap.

The DX12 feature set was finalized very quickly, and I'm pretty sure it threw some wrenches into Nvidia's roadmap plans, since Nvidia tends to do R&D two generations ahead.
*
Oh, that sucks. :{

So... if it won't work with AMD GPUs, then what about other Nvidia GPUs?

When I attend this event, I'll see if they mention that.
cstkl1
post Aug 28 2015, 01:04 PM


QUOTE(Moogle Stiltzkin @ Aug 28 2015, 01:02 PM)
Oh, that sucks. :{

So... if it won't work with AMD GPUs, then what about other Nvidia GPUs?

When I attend this event, I'll see if they mention that.
*
That event will still be about G-Sync, DX12, etc., same as the 980 Ti event.


Moogle Stiltzkin
post Aug 28 2015, 01:05 PM

QUOTE(cstkl1 @ Aug 28 2015, 01:04 PM)
That event will still be about G-Sync, DX12, etc., same as the 980 Ti event.
*
Are you referring to the one in Singapore, or was there already one in Malaysia? :/ I must have missed that one then.
cstkl1
post Aug 28 2015, 01:06 PM


QUOTE(Moogle Stiltzkin @ Aug 28 2015, 01:05 PM)
Are you referring to the one in Singapore, or was there already one in Malaysia? :/ I must have missed that one then.
*
Both.
Nvidia is doing more campaigning since they are flush with cash. Not killing off AMD, but more like ensuring AMD won't bounce back.
Minecrafter
post Aug 28 2015, 02:46 PM



QUOTE(Moogle Stiltzkin @ Aug 28 2015, 12:53 PM)
Yeah, multi-GPU is more bang for your buck, but I still wouldn't go multi-GPU. :} Expensive. Not to mention I don't think this will change multi-GPU setups being more dependent on driver updates than single-GPU solutions.
*
Yeah, the optimization problems will still be there. But still, I don't think it's worth it to SLI or CFX mid-range or low-end cards.
Rei7
post Aug 28 2015, 03:34 PM




QUOTE(Moogle Stiltzkin @ Aug 28 2015, 12:35 PM)
If I'm not mistaken, this would also mean two GPUs with 4GB of VRAM each would effectively give 4+4 = 8GB of VRAM, rather than the 4GB you get before DX12.
*
Yeah, the full 8GB with DX12, so I've heard. No more losing the second card's VRAM to mirroring and such.
Exciting times, but it still depends on developers to fully utilize DX12.


SSJBen
post Aug 28 2015, 03:35 PM



Dell just announced their first G-Sync monitor: the S2716DG.
1440p, 144Hz, G-Sync and ULMB.

At first I was like "awhhh yissss!!" Then I found out it's a TN panel. Fudge.

/waitingcontinues


*EDIT*
Yes, I know the TN panels on these sets will be among the highest-end panels. And yes, I know IPS has its own set of issues too; backlight bleed and IPS glow are notorious. But damn, having used IPS for several years now, it is difficult to go back down to a narrower color gamut. It's just weird not being able to tell gradients apart, something even the best TN panels cannot manage.

This post has been edited by SSJBen: Aug 28 2015, 03:38 PM
skylinelover
post Aug 29 2015, 02:42 AM
QUOTE(Minecrafter @ Aug 28 2015, 02:46 PM)
Yeah, the optimization problems will still be there. But still, I don't think it's worth it to SLI or CFX mid-range or low-end cards.
*
Haha, well said.

QUOTE(Rei7 @ Aug 28 2015, 03:34 PM)
Yeah, the full 8GB with DX12, so I've heard. No more losing the second card's VRAM to mirroring and such.
Exciting times, but it still depends on developers to fully utilize DX12.
*
DX12 is still in its baby stage, so I'm not jumping in too soon. Hoping Crysis 4 will be the first-ever DX12 game, following the sequence of Far Cry 1 as the first DX9 game, Crysis 1 the first DX10 game, and finally Crysis 2 the first DX11 game.
skylinelover
post Aug 29 2015, 02:45 AM
QUOTE(cstkl1 @ Aug 28 2015, 01:06 PM)
Both.
Nvidia is doing more campaigning since they are flush with cash. Not killing off AMD, but more like ensuring AMD won't bounce back.
*
Haha, this I like.

QUOTE(SSJBen @ Aug 28 2015, 03:35 PM)
Dell just announced their first G-Sync monitor: the S2716DG.
1440p, 144Hz, G-Sync and ULMB.

At first I was like "awhhh yissss!!" Then I found out it's a TN panel. Fudge.

/waitingcontinues
*EDIT*
Yes, I know the TN panels on these sets will be among the highest-end panels. And yes, I know IPS has its own set of issues too; backlight bleed and IPS glow are notorious. But damn, having used IPS for several years now, it is difficult to go back down to a narrower color gamut. It's just weird not being able to tell gradients apart, something even the best TN panels cannot manage.
*
Same lo. Never looked back at TN after owning IPS for more than a year now. Haha.
ngkhanmein
post Sep 1 2015, 12:00 AM
http://www.guru3d.com/news-story/nvidia-wa...n-settings.html

"Currently it seems that Nvidia's Maxwell architecture (the 900-series cards) does not really support asynchronous compute in DX12 at a proper hardware level. Meanwhile, AMD is obviously jumping onto this as being HUGE, and they quickly prepared a PDF slide presentation with their take on the importance of all this. Normally I'd add the slides into a news item, but this is 41 pages of slides, hence I made it available as a separate download.

In short, here's the thing: everybody expected Nvidia's Maxwell architecture to have full DX12 support. As it now turns out, that is not the case. AMD offers support on their Fury and Hawaii/Grenada/Tonga (GCN 1.2) architecture for DX12 asynchronous compute shaders. The rather startling news is that Nvidia's Maxwell architecture, and yes, that would be the entire 900 range, does not support it. I can think of numerous scenarios where asynchronous shaders would help."
Unseen83
post Sep 1 2015, 04:35 AM



QUOTE(ngkhanmein @ Sep 1 2015, 12:00 AM)
"In short, here's the thing: everybody expected Nvidia's Maxwell architecture to have full DX12 support. As it now turns out, that is not the case. ... Nvidia's Maxwell architecture, and yes, that would be the entire 900 range, does not support it."
*
But, but, Nvidia says...

http://blogs.nvidia.com/blog/2015/01/21/wi...10-nvidia-dx12/

"We're more than ready. GPUs built on our Maxwell GPU architecture – such as our recently released GeForce GTX 970 and GeForce GTX 980 – fully support DX12."

This post has been edited by Unseen83: Sep 1 2015, 05:06 AM
JohnLai
post Sep 1 2015, 11:02 AM

Hmm... well... Oxide had better make sure its claim is true. Otherwise, if Nvidia 'magically' fixes the async issue (even if only up to 31/32 queues), Oxide's reputation will be questionable.
ngkhanmein
post Sep 1 2015, 11:35 AM
QUOTE(Unseen83 @ Sep 1 2015, 04:35 AM)
But, but, Nvidia says...

http://blogs.nvidia.com/blog/2015/01/21/wi...10-nvidia-dx12/

"We're more than ready. GPUs built on our Maxwell GPU architecture – such as our recently released GeForce GTX 970 and GeForce GTX 980 – fully support DX12."
*
"Support" doesn't mean it will include everything.
Moogle Stiltzkin
post Sep 1 2015, 12:33 PM

Well, it was speculated that Nvidia does have async compute, but most likely it's emulated. Considering this recent issue raised by Oxide, that could explain the current results.

But that is still speculation, though a rather logical one judging by the results so far.
Ronzph
post Sep 1 2015, 05:24 PM




This --> MSI 980 Ti Lightning

Anyone getting this???
skylinelover
post Sep 1 2015, 05:27 PM
QUOTE(Moogle Stiltzkin @ Sep 1 2015, 12:33 PM)
Well, it was speculated that Nvidia does have async compute, but most likely it's emulated. Considering this recent issue raised by Oxide, that could explain the current results.

But that is still speculation, though a rather logical one judging by the results so far.
*
Haha, I guess that's very true. We'd better wait for Pascal.
ngkhanmein
post Sep 1 2015, 05:55 PM
QUOTE(Moogle Stiltzkin @ Sep 1 2015, 12:33 PM)
Well, it was speculated that Nvidia does have async compute, but most likely it's emulated. Considering this recent issue raised by Oxide, that could explain the current results.

But that is still speculation, though a rather logical one judging by the results so far.
*
Even though I'm an NV fanboy, I can't deny DX12 is going to favor the red side. Previously the NV driver was superior, but now it's a different story. DX11 we owned, but DX12? Another story next time...


Unseen83
post Sep 1 2015, 06:20 PM



QUOTE(ngkhanmein @ Sep 1 2015, 11:35 AM)
"Support" doesn't mean it will include everything.
*
From the quote — focus on "fully support":

"We're more than ready. GPUs built on our Maxwell GPU architecture – such as our recently released GeForce GTX 970 and GeForce GTX 980 – fully support DX12."

http://blogs.nvidia.com/blog/2015/01/21/wi...10-nvidia-dx12/

This post has been edited by Unseen83: Sep 1 2015, 06:20 PM
Demonic Wrath
post Sep 1 2015, 08:41 PM


Funny how NVIDIA actually loses performance when going to the DX12 API in the Ashes of the Singularity benchmark. Maybe they optimized their DX11 drivers too well?

Async support or not, what's important is the actual FPS of the game. If anyone is quoting the Ashes of the Singularity benchmark to say AMD has the better implementation, check again: the R9 390X is also performing close to the R9 Fury. (source: http://www.pcgameshardware.de/Ashes-of-the...tX-11-1167997/) It's just that AMD's DX11 implementation is so bad that it makes DX12 look very good.

One thing we know for sure: currently NVIDIA has the market share (82%!). Who knows what will happen with future DX12 games, especially the GameWorks titles.
