 NVIDIA GeForce Community V14

S4PH
post Feb 25 2015, 11:13 PM

adam_s4ph
******
Senior Member
1,167 posts

Joined: Jan 2007
From: ..Tsukuba..


QUOTE(skylinelover @ Feb 25 2015, 04:43 PM)
What an epic downgrade laugh.gif doh.gif I'm waiting for Pascal this year or Volta next year hahahaha
*
When you do, sell your epic 770 4GB to me cheap thumbup.gif
Unseen83
post Feb 26 2015, 02:47 PM

TooFAT4U 2Handle!
*******
Senior Member
2,337 posts

Joined: Dec 2008
From: KING CANNEL JB


smile.gif So there was a time when AMD fans and Nvidia fans, as end users, just went for the best bang for the buck... soon there may be a new fan category added: cross-SLI fans...

http://wccftech.com/dx12-nvidia-amd-asynchronous-multigpu/
S4PH
post Feb 26 2015, 04:35 PM

adam_s4ph
******
Senior Member
1,167 posts

Joined: Jan 2007
From: ..Tsukuba..


QUOTE(Unseen83 @ Feb 26 2015, 02:47 PM)
smile.gif So there was a time when AMD fans and Nvidia fans, as end users, just went for the best bang for the buck... soon there may be a new fan category added: cross-SLI fans...

http://wccftech.com/dx12-nvidia-amd-asynchronous-multigpu/
*
It won't perform well, for sure.
Unseen83
post Feb 26 2015, 04:54 PM

TooFAT4U 2Handle!
*******
Senior Member
2,337 posts

Joined: Dec 2008
From: KING CANNEL JB


QUOTE(S4PH @ Feb 26 2015, 04:35 PM)
It won't perform well, for sure.
*
hmm.gif It's not out yet.. smile.gif Windows 10 first, then DX12; let's not judge anything until it's all out... Mantle seems to work and raises the minimum FPS when enabled, helping with CPU/GPU bottlenecks, and I can imagine having a GCN core and CUDA together wub.gif

QUOTE(ngkhanmein @ Feb 26 2015, 04:33 PM)
I don't wanna mix with AMD. AMD sucks..
*
Hm, I own two GPUs from AMD.. yes, they have their weaknesses and issues, but I overcame them (always use DDU before installing AMD drivers..), and now CrossFire runs steadily in my rig, stable gaming at 4K (medium or high settings); with a short profile edit, boom, I can make a game without official CrossFire support play with less lag at 4K.

On the Nvidia side, with my desktop Asus GTX 760 DCUII I never had a single problem, not even a BSOD or driver issue, and never needed DDU smile.gif awesome... but for the GTX 780M in my MSI GT70 20D I have to use DDU to make sure the driver installation is clean, and from time to time there are driver-related crashes or BSODs.. but likewise I overcame them smile.gif The same goes for my Asus G75VX notebook with its GTX 675M, some driver issues/crashes... but with the help of DDU and CCleaner I overcame the registry issues..

Conclusion... I have not experienced anything that "sux" from either of them... blink.gif To me no GPU maker is perfect.. As end users we need both of them, and it would be even better if there were 3 or 4 more GPU makers in the world.. smile.gif
S4PH
post Feb 26 2015, 05:35 PM

adam_s4ph
******
Senior Member
1,167 posts

Joined: Jan 2007
From: ..Tsukuba..


QUOTE(Unseen83 @ Feb 26 2015, 04:54 PM)
hmm.gif It's not out yet.. smile.gif Windows 10 first, then DX12; let's not judge anything until it's all out... Mantle seems to work and raises the minimum FPS when enabled, helping with CPU/GPU bottlenecks, and I can imagine having a GCN core and CUDA together wub.gif
Hm, I own two GPUs from AMD.. yes, they have their weaknesses and issues, but I overcame them (always use DDU before installing AMD drivers..), and now CrossFire runs steadily in my rig, stable gaming at 4K (medium or high settings); with a short profile edit, boom, I can make a game without official CrossFire support play with less lag at 4K.

On the Nvidia side, with my desktop Asus GTX 760 DCUII I never had a single problem, not even a BSOD or driver issue, and never needed DDU smile.gif awesome... but for the GTX 780M in my MSI GT70 20D I have to use DDU to make sure the driver installation is clean, and from time to time there are driver-related crashes or BSODs.. but likewise I overcame them smile.gif The same goes for my Asus G75VX notebook with its GTX 675M, some driver issues/crashes... but with the help of DDU and CCleaner I overcame the registry issues..

Conclusion... I have not experienced anything that "sux" from either of them... blink.gif To me no GPU maker is perfect.. As end users we need both of them, and it would be even better if there were 3 or 4 more GPU makers in the world.. smile.gif
*
Let's wait and see; it will be a boost for older hardware that supports DX12.
defaultname365
post Feb 26 2015, 10:30 PM

Windows® 8.1 | Xbox 360™ | PlayStation® 4
******
Senior Member
1,098 posts

Joined: Nov 2006
user posted image

hmm.gif

SUSgogo2
post Feb 27 2015, 08:27 AM

gogo2
********
All Stars
18,672 posts

Joined: Jan 2003
From: Penang


QUOTE(defaultname365 @ Feb 26 2015, 10:30 PM)

hmm.gif
*
confirmed Titan VR
TSskylinelover
post Feb 27 2015, 10:15 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
Let's go hahahaha
defaultname365
post Feb 27 2015, 10:48 AM

Windows® 8.1 | Xbox 360™ | PlayStation® 4
******
Senior Member
1,098 posts

Joined: Nov 2006
The future of gaming will be gamers wearing 'headgear', bobbing their heads around. tongue.gif Never tried VR, so...

Anyway, looking forward to the announcement; it should be accompanied by an awesome showcase of some game title.

On a second note, yikes on the lawsuit and stuff... icon_question.gif My take on it is that customers are always right, and they paid for what they believed was 'proper 4.0', not '3.5 + 0.5'. The big boss saying there was a miscommunication and that it is actually a feature just added more fuel to the fire.

Sadly, it is not going to be an easy way out for Nvidia on this... unless some kind of recall/refund/compensation/offer on the next GPU purchase is provided. Easier said than done, for sure.

And even sadder is that the 980 Ti has been pushed to 2016.

http://www.ecumenicalnews.com/article/next...d-to-2016-28355

I am just hoping for at least a 6GB variant of Maxwell... laugh.gif Pointless at 1080p, but I plan to jump to 4K. Pretty sure in the future, even at 1080p, VRAM usage above 4GB will be common.
SSJBen
post Feb 27 2015, 01:06 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


What Jen-Hsun is trying to tell everybody is that "gimping" is now an official feature from Nvidia, in addition to PhysX (which is also gimped, btw), ShadowPlay, GFE, and whatever else they have.
Minecrafter
post Feb 27 2015, 03:02 PM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


QUOTE(defaultname365 @ Feb 26 2015, 10:30 PM)
user posted image

hmm.gif
*
It's a car where you will be able to game on the go.

» Click to show Spoiler - click again to hide... «

SSJBen
post Feb 27 2015, 04:56 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(Minecrafter @ Feb 27 2015, 03:02 PM)
It's a car where you will be able to game on the go.

» Click to show Spoiler - click again to hide... «

*
» Click to show Spoiler - click again to hide... «

TSskylinelover
post Feb 27 2015, 05:28 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(defaultname365 @ Feb 27 2015, 10:48 AM)
The future of gaming will be gamers wearing 'headgear', bobbing their heads around. tongue.gif Never tried VR, so...

Anyway, looking forward to the announcement; it should be accompanied by an awesome showcase of some game title.

On a second note, yikes on the lawsuit and stuff... icon_question.gif My take on it is that customers are always right, and they paid for what they believed was 'proper 4.0', not '3.5 + 0.5'. The big boss saying there was a miscommunication and that it is actually a feature just added more fuel to the fire.

Sadly, it is not going to be an easy way out for Nvidia on this... unless some kind of recall/refund/compensation/offer on the next GPU purchase is provided. Easier said than done, for sure.

And even sadder is that the 980 Ti has been pushed to 2016.

http://www.ecumenicalnews.com/article/next...d-to-2016-28355

I am just hoping for at least a 6GB variant of Maxwell... laugh.gif Pointless at 1080p, but I plan to jump to 4K. Pretty sure in the future, even at 1080p, VRAM usage above 4GB will be common.
*
Haha. That means no Pascal or Volta till 2017. Argh, dang it.
DeepMemory
post Feb 27 2015, 06:08 PM

Regular
******
Senior Member
1,805 posts

Joined: Oct 2010
Hey guys, just got a new Zotac GTX 960 non-AMP edition from Storekini. The temps maxed out at 81 degrees, while my previous Asus AMD Radeon HD 7850 maxed out at 63 degrees. Is that normal? What should I check?
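If you want to rule out a fan or airflow problem, one option is to log the card's temperature and fan speed while you game. Here is a minimal sketch using NVIDIA's NVML monitoring library; the device index 0, the one-minute loop, and the build line are illustrative assumptions, not anything specific to the Zotac card:

```c
/* Minimal sketch: poll GPU temperature and fan speed once a second via NVML.
 * Build (assumed): gcc log_temp.c -o log_temp -lnvidia-ml */
#include <stdio.h>
#include <unistd.h>
#include <nvml.h>

int main(void) {
    if (nvmlInit() != NVML_SUCCESS) {
        fprintf(stderr, "NVML init failed\n");
        return 1;
    }
    nvmlDevice_t dev;
    nvmlDeviceGetHandleByIndex(0, &dev);        /* first GPU in the system */
    for (int i = 0; i < 60; ++i) {              /* log for one minute */
        unsigned int temp = 0, fan = 0;
        nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &temp);
        nvmlDeviceGetFanSpeed(dev, &fan);       /* fan duty cycle in percent */
        printf("%3d s: %u C, fan %u%%\n", i, temp, fan);
        sleep(1);
    }
    nvmlShutdown();
    return 0;
}
```

If the fan never ramps up past its idle speed while the temperature keeps climbing, the fan profile or the case airflow is the first thing to look at.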
cstkl1
post Feb 27 2015, 06:28 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(defaultname365 @ Feb 27 2015, 10:48 AM)
The future of gaming will be gamers wearing 'headgear', bobbing their heads around. tongue.gif Never tried VR, so...

Anyway, looking forward to the announcement; it should be accompanied by an awesome showcase of some game title.

On a second note, yikes on the lawsuit and stuff... icon_question.gif My take on it is that customers are always right, and they paid for what they believed was 'proper 4.0', not '3.5 + 0.5'. The big boss saying there was a miscommunication and that it is actually a feature just added more fuel to the fire.

Sadly, it is not going to be an easy way out for Nvidia on this... unless some kind of recall/refund/compensation/offer on the next GPU purchase is provided. Easier said than done, for sure.

And even sadder is that the 980 Ti has been pushed to 2016.

http://www.ecumenicalnews.com/article/next...d-to-2016-28355

I am just hoping for at least a 6GB variant of Maxwell... laugh.gif Pointless at 1080p, but I plan to jump to 4K. Pretty sure in the future, even at 1080p, VRAM usage above 4GB will be common.
*
Read more about why it's a feature, dude, and why it's a breakthrough, before posting.

defaultname365
post Feb 27 2015, 07:06 PM

Windows® 8.1 | Xbox 360™ | PlayStation® 4
******
Senior Member
1,098 posts

Joined: Nov 2006
QUOTE(cstkl1 @ Feb 27 2015, 06:28 PM)
Read more about why it's a feature, dude, and why it's a breakthrough, before posting.
*
Lol, I know... I stand by what I said. smile.gif

Let's say a car is advertised as making 400 HP. When it goes beyond 350 HP (e.g. 360 HP), it starts to slow down tremendously. So how is it that it is advertised as a 400 HP car? It is supposed to run just fine above 350 HP. laugh.gif

So whatever "feature" it is, breakthrough or not, it is not making users happy. The so-called "feature" at the end of the day is not mentioned to the user upfront and they feel cheated for what they believed they paid for.




cstkl1
post Feb 27 2015, 07:28 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(defaultname365 @ Feb 27 2015, 07:06 PM)
Lol, I know...  I stand by what I said.  smile.gif

Let's say a car is advertised as making 400 HP. When it goes beyond 350 HP (e.g. 360 HP), it starts to slow down tremendously. So how is it that it is advertised as a 400 HP car? It is supposed to run just fine above 350 HP. laugh.gif

So whatever "feature" it is, breakthrough or not, it is not making users happy. At the end of the day, the so-called "feature" was not mentioned to users upfront, and they feel cheated out of what they believed they paid for.
*
No, not that.

http://www.pcper.com/reviews/Graphics-Card...tations-GTX-970

Read it, dude.

Understand why this is actually a first. Before this it was never possible: by the design used in previous GPUs, including those from AMD, the GTX 970 would have had to be a 3GB card.

So it's a feature.. because it's a breakthrough. When you disable an L2 slice you take the ROPs and memory controller off with it, and AFAIK it normally has to be an even number.



This post has been edited by cstkl1: Feb 27 2015, 07:33 PM
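For anyone who wants to see the segmentation cstkl1 describes for themselves, here is a rough sketch of the kind of chunk-by-chunk VRAM test the community used to expose it. It is plain CUDA C, assumes the CUDA toolkit is installed, and is only an illustration of the idea, not the original benchmark:

```c
/* Rough sketch, illustration only: allocate VRAM in 256 MB chunks and time a
 * device-side fill of each one. On a GTX 970 the chunks that land in the slower
 * 0.5 GB segment would be expected to report noticeably lower bandwidth than
 * the first 3.5 GB. Build (assumed): nvcc vram_chunks.cu -o vram_chunks */
#include <stdio.h>
#include <cuda_runtime.h>

int main(void) {
    const size_t chunk = 256ull << 20;              /* 256 MB per allocation */
    char *buf[16];                                  /* at most 16 * 256 MB = 4 GB */
    int n = 0;
    while (n < 16 && cudaMalloc((void **)&buf[n], chunk) == cudaSuccess)
        ++n;                                        /* stop when VRAM runs out */

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);
    for (int i = 0; i < n; ++i) {
        cudaEventRecord(start, 0);
        cudaMemset(buf[i], 0, chunk);               /* device-side fill of this chunk */
        cudaEventRecord(stop, 0);
        cudaEventSynchronize(stop);
        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        printf("chunk %2d (%4zu MB in): %6.1f GB/s\n",
               i, (size_t)i * (chunk >> 20), (chunk / 1e9) / (ms / 1e3));
    }
    for (int i = 0; i < n; ++i)
        cudaFree(buf[i]);
    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    return 0;
}
```

On a card that is also driving the display, some of the last chunks may simply fail to allocate because the driver already holds part of the VRAM, so treat the numbers as indicative only.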
Minecrafter
post Feb 27 2015, 07:33 PM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


QUOTE(SSJBen @ Feb 27 2015, 04:56 PM)
» Click to show Spoiler - click again to hide... «

*
» Click to show Spoiler - click again to hide... «

defaultname365
post Feb 27 2015, 07:50 PM

Windows® 8.1 | Xbox 360™ | PlayStation® 4
******
Senior Member
1,098 posts

Joined: Nov 2006
QUOTE(cstkl1 @ Feb 27 2015, 07:28 PM)
No, not that.

http://www.pcper.com/reviews/Graphics-Card...tations-GTX-970

Read it, dude.

Understand why this is actually a first. Before this it was never possible: by the design used in previous GPUs, including those from AMD, the GTX 970 would have had to be a 3GB card.

So it's a feature.. because it's a breakthrough. When you disable an L2 slice you take the ROPs and memory controller off with it, and AFAIK it normally has to be an even number.
*
It's a feature, sure. A breakthrough, sure. A first, sure.

But why did Nvidia keep quiet for months, only for one user to point out the oddity and trigger a cascade of in-depth looks into how the 970 was built? hmm.gif

Point is, they should have been upfront. This feature/breakthrough could no doubt benefit future GPUs: where the conventional design would provide just, say, 7GB of VRAM, this method can bump the GPU to "8GB of VRAM" with the last 0.5GB being slower. But DO mention it upfront. Not all gamers pore over Nvidia architecture slides to learn how their GPUs are built.

We are dealing with misinformation given to customers, who bought what they 'believed' they were getting, and there is no way to defend Nvidia on that.

Good job on the feature, bad job on relaying the information.
cstkl1
post Feb 27 2015, 07:53 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(defaultname365 @ Feb 27 2015, 07:50 PM)
It's a feature, sure. A breakthrough, sure. A first, sure.

But why did Nvidia keep quiet for months, only for one user to point out the oddity and trigger a cascade of in-depth looks into how the 970 was built? hmm.gif

Point is, they should have been upfront. This feature/breakthrough could no doubt benefit future GPUs: where the conventional design would provide just, say, 7GB of VRAM, this method can bump the GPU to "8GB of VRAM" with the last 0.5GB being slower. But DO mention it upfront. Not all gamers pore over Nvidia architecture slides to learn how their GPUs are built.

We are dealing with misinformation given to customers, who bought what they 'believed' they were getting, and there is no way to defend Nvidia on that.

Good job on the feature, bad job on relaying the information.
*
Dude, you're just digressing now into the "misled" saga.

I was addressing your remark about the feature. At least now you know.

Simple question: does the 295X2 have 8GB, and the Titan Z 12GB, and is that written on the box? Let's not even go further back through all the single-card dual-GPUs, back to the 7950GX2 and 3870X2. Is there any mention in their specs that only half is usable?

In retrospect the 970 does have a usable 4GB. So why was nobody harping about it before?

The 970, for what the feature actually is, is in line with its price. A lot of people were initially happy, thinking Nvidia was being stupid by selling a card 40% cheaper with the same number of ROPs, the same L2, and the full 4GB of a 980.

The ones I pity are those who SLI'd for 4K. Those guys were truly misled and are not getting the performance they expected.

The rest, single-card users at 1080p, won't see any difference.

This post has been edited by cstkl1: Feb 27 2015, 08:20 PM
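The dual-GPU point is also easy to check in code: on the NVIDIA side, each GPU of a Titan Z or GTX 690-style board enumerates as a separate device with its own memory, which is the sense in which the advertised total is never one usable pool. A small sketch, CUDA C, illustration only:

```c
/* Illustration only: list every CUDA device and the VRAM it owns by itself. */
#include <stdio.h>
#include <cuda_runtime.h>

int main(void) {
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int i = 0; i < count; ++i) {
        struct cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("device %d: %s, %zu MB of its own VRAM\n",
               i, prop.name, prop.totalGlobalMem >> 20);
    }
    return 0;
}
```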
