 NVIDIA GeForce Community V15 (new era pascal), ALL HAIL NEW PASCAL KING GTX1080 out now

Demonic Wrath
post Jul 19 2015, 08:32 PM

My name so cool
******
Senior Member
1,667 posts

Joined: Jan 2003
From: The Cool Name Place

QUOTE(reconvision @ Jul 16 2015, 09:55 PM)
Anyone here using a PSU that stays quiet during heavy gaming? My 980 Ti is making my Seasonic M12II so loud that it is getting very unpleasant.
*
Yes, the M12II gets loud under load. I used the 750W unit with my GTX970 before; even at around 350W load it was already considered loud. I finally changed to an Enermax Revolution87 750W, and it is fine now.
Demonic Wrath
post Sep 1 2015, 08:41 PM


Funny how NVIDIA actually loses performance when moving to the DX12 API in the Ashes of the Singularity benchmark.. maybe they optimized their DX11 drivers too well?

Whether it supports async or not, what matters is the actual in-game FPS.. if anyone is quoting the Ashes of the Singularity benchmark to say AMD has the better implementation, check again.. the R9 390X also performs close to the R9 Fury. (source: http://www.pcgameshardware.de/Ashes-of-the...tX-11-1167997/) It is just that AMD's DX11 implementation is so bad that it makes DX12 look very good.

One thing we know for sure: NVIDIA currently has the market share (82%!). Who knows what will happen with future DX12 games, especially the GameWorks titles.
Demonic Wrath
post Sep 1 2015, 11:21 PM


QUOTE(Unseen83 @ Sep 1 2015, 10:18 PM)
yeah, funny indeed...  but NVIDIA is telling/pressuring Oxide to disable the async compute/shader feature in the bench.

" The rather startling news is that Nvidia's Maxwell architecture, and yeah that would be the entire 900 range does not support it in the way AMD does. I can think of numerous scenarios as to where asynchronous shaders would help."

http://www.guru3d.com/news-story/nvidia-wa...n-settings.html

add-on: oh bro, a correction for your signature: "MSI NVIDIA GeForce GTX970 Gaming 4GB GDDR5" should be 3.5GB GDDR5
*
Huh? You mean the GTX970 physically has only 3.5GB GDDR5? Ok...thanks for the info
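For context, the GTX970 does physically carry 4GB, but 3.5GB sits behind seven fast memory channels while the last 512MB hangs off a single slow one. A rough back-of-envelope sketch, using the per-segment peak bandwidths widely reported in press coverage of the 970 memory saga (196 GB/s and 28 GB/s are assumed figures from those reports, not measured here):

```python
# Illustrative arithmetic for the GTX 970's segmented VRAM layout.
# Bandwidth figures are assumptions taken from press coverage, not vendor data.
FAST_SEGMENT_GB = 3.5   # served by 7 of the 8 32-bit memory channels
SLOW_SEGMENT_GB = 0.5   # served by the single remaining channel
FAST_BW_GBPS = 196.0    # 7 channels x 28 GB/s each
SLOW_BW_GBPS = 28.0     # 1 channel

def worst_case_fill_time(gb_used):
    """Seconds to stream through `gb_used` GB of VRAM once,
    assuming the fast segment is filled first."""
    fast = min(gb_used, FAST_SEGMENT_GB)
    slow = max(0.0, gb_used - FAST_SEGMENT_GB)
    return fast / FAST_BW_GBPS + slow / SLOW_BW_GBPS

print(round(worst_case_fill_time(3.5), 4))  # fast segment only
print(round(worst_case_fill_time(4.0), 4))  # spills into the slow segment
```

Note how touching the last 512MB costs about as much time as streaming the entire fast 3.5GB segment, which is why games on this card try to stay under 3.5GB.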

This post has been edited by Demonic Wrath: Sep 1 2015, 11:22 PM
Demonic Wrath
post Sep 2 2015, 08:50 AM


Since no graphics hardware vendor has a card that can fully support DX12, just wait until 2016 or 2017 for a card with full DX12 feature support.. but then again, maybe by that time there will be DX12.1 features?

Edit: For those who really need full DX12 support, just wait until DX12 games are actually released, and only then look at the benchmarks..

This post has been edited by Demonic Wrath: Sep 2 2015, 09:35 AM
Demonic Wrath
post Sep 2 2015, 12:34 PM


From the AoTS benchmark, it seems that NVIDIA's DX11 without async support performs faster than or similarly to AMD's DX12 with async compute. (source) Edit: see the frame rate by batch type.

Secondly, Anandtech (source) previously showed that NVIDIA benefits from a DX12 implementation too. Star Swarm was also developed by Oxide.

This post has been edited by Demonic Wrath: Sep 2 2015, 12:35 PM
Demonic Wrath
post Sep 4 2015, 08:19 PM


Apparently, from the latest update, the supposed "Asyncompute.exe" test program doesn't send its compute workload to the compute queue on NVIDIA cards.. whereas PhysX games do use async compute? source
Demonic Wrath
post Sep 5 2015, 03:50 PM


So, the latest update on the async situation:-

QUOTE
We actually just chatted with Nvidia about Async Compute, indeed the driver hasn’t fully implemented it yet, but it appeared like it was. We are working closely with them as they fully implement Async Compute. We’ll keep everyone posted as we learn more. - Kollock (Oxide Games)
source
Demonic Wrath
post Sep 11 2015, 09:18 AM


QUOTE(Rim22 @ Sep 11 2015, 09:12 AM)
Please recommend me the lowest-priced NVIDIA card that is still enough to play all games/apps today?
*
Just state your budget and I'm sure someone can help you select the best performance/price card.. "all games/apps" is very vague (do you want the highest settings at 4K? The lowest settings at 720p? PS4-like performance?)
Demonic Wrath
post Sep 12 2015, 01:44 PM


QUOTE(cstkl1 @ Sep 11 2015, 11:38 PM)


Btw, running ShadowPlay recording, so FPS kinda drops by 5-7.

Real-world gameplay never dropped below 42.
Seriously heavy CPU usage.
*
I completed Batman AK on PC a long time ago. Very smooth gameplay..

I installed it on my SSD to play (the pagefile also needs to be on the SSD, or else it will stutter too). Heavy writes on the SSD due to the pagefile >.<

This post has been edited by Demonic Wrath: Sep 12 2015, 01:44 PM
Demonic Wrath
post Oct 20 2015, 10:02 AM


QUOTE(SSJBen @ Oct 19 2015, 03:22 PM)
Yeah. It's just another extra time-wasting step now to first extract each new driver from GFE.
*
Extract the new driver from GFE? Currently GFE downloads the same driver exe as what you download from the GeForce.com website.

You can see all the drivers downloaded by GFE here: C:\ProgramData\NVIDIA Corporation\NetService

Like Najmod said, you can probably get the driver from another reliable third-party source such as TechPowerUp or Guru3D, since they have been hosting the drivers for quite a while now..
Demonic Wrath
post Nov 22 2015, 06:20 PM


QUOTE(SSJBen @ Nov 18 2015, 08:18 PM)
No man, AA doesn't kill performance in Fallout 4. It isn't using MSAA at all, just temporal AA, which is post-processed and doesn't affect performance significantly.

What kills performance is godrays and shadow distance.
*
Actually, what kills the performance is Shadow Distance only... Setting it to Ultra makes my FPS drop to the 30-40s even when GPU usage is only ~60%..
Demonic Wrath
post Nov 27 2015, 08:25 PM


QUOTE(cstkl1 @ Nov 27 2015, 07:35 PM)


Should have tried the tutorial first
*
Why not use HBAO+?

Also, no SLI?
Demonic Wrath
post Nov 28 2015, 09:11 AM


QUOTE(cstkl1 @ Nov 27 2015, 09:24 PM)
Oops on the first
Second no profile
*
Actually, there is an SLI profile for Rainbow Six Siege.. It's not working because the new R6 Siege exe (rainbowsix.exe) shares the same profile as the old Rainbow Six game.

You need to fix it using NVIDIA Inspector.
Demonic Wrath
post Dec 10 2015, 04:04 PM


QUOTE(tan_pang @ Dec 10 2015, 09:04 AM)
Hi, I want to ask: does anyone have more info regarding NVLink?

Since it is said to remove the bottleneck caused by PCIe, does that mean it needs a brand-new motherboard design to work?
*
The whitepaper only details the GPU-to-GPU (p2p) connection. The connection to the CPU is still through PCI-E.

NVLink for CPU-GPU is being implemented by IBM for supercomputers only.

IMHO, if NVIDIA wants to implement NVLink successfully, it can be through:-
i) something akin to the SLI bridge
ii) dual-GPU cards

It seems unlikely that NVIDIA will introduce 2 variants of a GPU, i.e. one for the PCI-e interface and one for an "NVLink" interface..

This post has been edited by Demonic Wrath: Dec 10 2015, 04:08 PM
Demonic Wrath
post Dec 12 2015, 11:41 PM


QUOTE(heavy rain @ Dec 12 2015, 09:04 PM)
Can I know where I can send my card for RMA? MSI GTX 670 P.E OC

I have two 670 P.E in SLI, and when I enable SLI the driver keeps crashing and recovering. I'm using the latest driver. I tested both cards individually and one of them makes the driver crash and recover; the other card seems fine. I already sent the card for RMA previously, and this is the second time it has a problem
*
Last time my MSI GTX460 also needed an RMA. I contacted NVIDIA directly and they forwarded my request to MSI for the RMA. A lot faster..
Demonic Wrath
post Dec 17 2015, 10:14 PM


QUOTE(heavy rain @ Dec 15 2015, 08:24 PM)
Was your GTX460 still under warranty at that time?
Can you still send it for RMA if the card is out of warranty? I don't think my GTX 670 is still under warranty
*
Oh...my 460 was still under warranty at that time..
Demonic Wrath
post Dec 19 2015, 08:28 PM


QUOTE(skylinelover @ Dec 19 2015, 08:26 AM)
A GTX970 with 3.5GB RAM should do
*
Yeah, buy a GTX970 and physically take out 512MB.
Demonic Wrath
post Dec 30 2015, 06:58 PM


Lol....GPU prices are really crazy now... last year I bought the MSI GTX970 Gaming for RM 1,400+.. now it is RM 1,500 second-hand and RM 1,800 new

who says computer parts are not an investment ;p
Demonic Wrath
post Mar 3 2016, 01:27 PM


QUOTE(targon @ Feb 29 2016, 09:50 PM)
Things that are surely certain:

-Product cycles are getting longer these days, hence you can expect new launches between generations to be spaced over a longer timeframe.
-Technology companies are not stupid; they will milk the maximum cash flow out of each product generation before launching a new one. Like Mr Huang of NVIDIA. (he has the business brain in him)
*
On the technical side,
1) Chip designs nowadays are getting more and more complicated (transistor counts in the billions), and there are limits to the fabrication process as well as heat issues.
2) Debugging these chips takes longer. If they fast-track it and release a buggy chip, they stand to lose more from RMAs and a bad reputation.
3) It makes sense to salvage defective chips and reuse them as lower-end parts: less wastage, and it recovers R&D and fabrication costs.
4) Architecture performance improvements and/or power savings normally come from fine-tuning the architecture, better power gating, or new ways of performing a task. Over the years, technology companies have already fine-tuned many aspects of their architectures, leaving less and less to tune. Intel's CPU core improvement is generally only ~10% from one architecture to the next (short of adding more cores, it is really hard to increase per-core performance within the same thermal envelope).
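The salvage point is easy to see with the textbook Poisson yield model: the fraction of fully clean dies falls off exponentially with die area, so being able to sell dies with a tolerable defect as cut-down SKUs recovers a large share of each wafer. A minimal sketch with made-up numbers for illustration only (the defect density and die area below are hypothetical, not fab data):

```python
import math

def poisson_yield(die_area_cm2, defect_density_per_cm2):
    """Classic Poisson yield model: probability that a die has zero defects."""
    return math.exp(-die_area_cm2 * defect_density_per_cm2)

# Hypothetical numbers for illustration only:
d0 = 0.2         # defects per cm^2
big_die = 6.0    # cm^2, a large GPU die
print(round(poisson_yield(big_die, d0), 3))  # under a third of dies are fully clean

# If dies with exactly one tolerable defect can be sold as a cut-down part,
# the sellable fraction is P(0 or 1 defect) for a Poisson mean of A * D0:
lam = big_die * d0
sellable = math.exp(-lam) * (1 + lam)
print(round(sellable, 3))  # roughly double the fully-clean fraction
```

Under these assumed numbers, salvaging roughly doubles the usable output per wafer, which is why cut-down SKUs are standard practice.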

From the above points, you can see the design process is getting longer and longer, which also translates to higher and higher R&D costs. This is why some companies choose to rebrand their products. IMO, NVIDIA is already doing a lot of enhancement on both the hardware architecture and the software side (CUDA/GameWorks) to squeeze out the best possible performance. (Anyone may argue that open source is better, but let's face it: almost no investor or hardware vendor is willing to invest much time/effort in open source. Open source = broad compatibility = harder and slower to optimize.)

TL;DR: a long product cycle for GPUs is good, since it forces developers to use smarter techniques and optimize to achieve better results instead of just brute-forcing it. This is why console graphics are generally on par with, or better than, PC graphics, even at (much) lower performance capability.
Demonic Wrath
post Apr 30 2016, 10:16 AM


QUOTE(skylinelover @ Apr 30 2016, 07:52 AM)
So that made me ask the question: will the 980 / 980 Ti be gimped in less than a year's time?
*
Hmm...in the video it mentioned that it is not being gimped. Why would you think otherwise?
