NVIDIA GeForce Community V14

#1 | Feb 8 2015, 02:43 PM

QUOTE(SSJBen @ Feb 8 2015, 02:22 PM)
Hey, with all due respect, can you stop with all your fanboyish remarks? Shows how immature you are. Chill brah.

Let's use /k logic: high resale value = god. Vios = godcar, iPhone = godphone, Nvidia = godcard.
#3 | Feb 12 2015, 02:58 AM
QUOTE(Moogle Stiltzkin @ Feb 12 2015, 01:17 AM)
I'm surprised they even intro'd that card. Nvidia don't want to retire the 750 (Ti) yet? The GTX 960 seems more relevant than that if you want a budget card. At the least it has the latest Nvidia GPU tech in it, albeit with the lowest fps compared to the 970 and 980. So why on earth would someone get the 750? I don't get it.

Yet they were so fast in retiring the 650 Ti Boost... Been waiting to see what they can offer for a 950...
#4 | Mar 18 2015, 06:02 AM
QUOTE(terradrive @ Mar 18 2015, 05:52 AM)
Great overclocker but you'll go deaf with it at the same time. Almost the same amount of noise as the 290X reference cooler if you overclock the Titan X =_= They're pretty much expecting "if you are rich enough to buy a Titan X, you're probably rich enough to have a custom loop for it..."

The Titan X only comes with a blower-type cooler?
#5 | Mar 20 2015, 06:46 AM
QUOTE(Moogle Stiltzkin @ Mar 20 2015, 04:30 AM)
From what I see in the benchmarks it's a 4K GPU, albeit at lower quality settings. Seems that gaming at high settings the fps comes in under 30. Me personally, I feel 45-60 is what I'm willing to have for gaming. I haven't really seen what the red camp is coming out with. Presumably, I heard they will be the first to launch an HBM GPU (because they made the right decision to go with HBM before Nvidia did, after realizing HMC was taking too long). By the way, FreeSync monitors are shipping out now. According to this illustration, FreeSync looks better than no FreeSync at all:

[comparison images: no FreeSync vs. FreeSync enabled]
http://www.extremetech.com/gaming/201568-a...inally-shipping

Sounds like FreeSync prevents the stuttering that comes from using regular old V-Sync, but I wonder what the FPS impact will be? G-Sync promised better V-Sync without any FPS performance impact. Is that the case for FreeSync as well? Another question also pops up: can you use FreeSync with an Nvidia GPU? Will Nvidia force users to get G-Sync monitors to benefit, or will they allow the Nvidia camp to use FreeSync as well? FreeSync monitors are $50 cheaper than G-Sync monitors.

FreeSync is Nvidia G-Sync's counterpart... It's like we don't expect Nvidia's cards to be CrossFire-able or to run Eyefinity...

This post has been edited by Human10: Mar 20 2015, 06:47 AM
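On the stuttering point above, here is a rough back-of-the-envelope sketch in Python of why a fixed 60 Hz V-Sync judders when the GPU only manages 40 fps while an adaptive-sync display does not. The 25 ms render time and 60 Hz panel are made-up example numbers, not anything from AMD's or Nvidia's drivers.

```python
import math

# Illustrative sketch only: classic V-Sync snaps every presented frame to the
# next fixed refresh, so at 40 fps on a 60 Hz panel the on-screen intervals
# alternate 16.7/33.3 ms (judder). Adaptive sync refreshes when the frame is
# ready, so the interval stays a steady 25 ms.

REFRESH_HZ = 60
SCAN_MS = 1000.0 / REFRESH_HZ   # ~16.7 ms between fixed refreshes
RENDER_MS = 25.0                # GPU needs 25 ms per frame, i.e. 40 fps

def vsync_presents(n):
    """Classic V-Sync: a finished frame waits for the next fixed refresh."""
    out, ready = [], 0.0
    for _ in range(n):
        ready += RENDER_MS
        out.append(math.ceil(ready / SCAN_MS) * SCAN_MS)
    return out

def adaptive_presents(n):
    """Adaptive sync: the panel refreshes the moment the frame is ready."""
    return [RENDER_MS * (i + 1) for i in range(n)]

def intervals(times):
    return [round(b - a, 1) for a, b in zip([0.0] + times, times)]

print("V-Sync intervals (ms):  ", intervals(vsync_presents(6)))    # [33.3, 16.7, 33.3, ...]
print("Adaptive intervals (ms):", intervals(adaptive_presents(6)))  # [25.0, 25.0, ...]
```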
#6 | Mar 20 2015, 06:10 PM
QUOTE(Moogle Stiltzkin @ Mar 20 2015, 04:41 PM)
I think it's relevant. Do Nvidia users want to spend more on G-Sync rather than FreeSync, which is much cheaper? So hence the discussion :/ PS: I've used Nvidia GPUs for a long while now. Now I wonder if I should consider ATI for my next GPU... depends on what Nvidia is going to do about FreeSync support. Mantle wasn't that big a deal to me because DX12 was over the horizon, but FreeSync is a big deal. I don't play Evolve but I did watch it on YouTube. None of the FPS genre has interested me in a long while, got bored of that genre lelz. Blizzard's Overwatch looks interesting though. Not sure I fully understand FreeSync. Seems like if the frame rate goes above the monitor's refresh rate, it will automatically use V-Sync if you had that enabled? But what about when it's below the minimum refresh rate? Because the reviews say it doesn't work too well under the refresh range, so is there an option to disable it when it dips below? So two different settings: above the range V-Sync is enabled, but below it would be disabled. I wonder if that is the case. I'm no fanboy, but looking at the facts... there are indeed legitimate questions here. By the way, my monitor, a Dell U2413, says it has DisplayPort 1.2a. Does that mean it can support FreeSync?

AFAIK from all the info I have gotten these few days, no. Similar to G-Sync, you need an entirely new monitor with full compatibility, just that it's sans the costly hardware module (but as suggested by cstkl1, that won't entirely stop manufacturers from charging a premium for FreeSync-compatible monitors).

This post has been edited by Human10: Mar 20 2015, 06:11 PM
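On the above-range/below-range question, a minimal sketch of how a FreeSync-style variable refresh window is commonly described as behaving. The 40-144 Hz window and the fallback policies are assumptions for illustration, not the spec of any particular monitor or driver.

```python
# Illustrative only: behaviour of a hypothetical variable-refresh window.
VRR_MIN_HZ = 40               # assumed lower bound of the FreeSync range
VRR_MAX_HZ = 144              # assumed upper bound
VSYNC_ON_ABOVE_RANGE = True   # user setting: cap at max refresh instead of tearing

def display_behaviour(fps: float) -> str:
    """Describe what the panel does for a given rendered frame rate."""
    if VRR_MIN_HZ <= fps <= VRR_MAX_HZ:
        # Inside the window the refresh tracks the frame rate one-to-one.
        return f"refresh follows frame rate at {fps:.0f} Hz (no tearing, no judder)"
    if fps > VRR_MAX_HZ:
        # Above the window: either V-Sync caps at max refresh, or frames tear.
        if VSYNC_ON_ABOVE_RANGE:
            return f"V-Sync cap at {VRR_MAX_HZ} Hz"
        return "tearing above the window (V-Sync off)"
    # Below the window the panel falls back to fixed-refresh behaviour,
    # which is where early FreeSync reviews reported judder/tearing again.
    return f"below {VRR_MIN_HZ} Hz: falls back to fixed refresh (judder/tearing)"

for fps in (30, 45, 90, 160):
    print(fps, "fps ->", display_behaviour(fps))
```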
#8 | Mar 23 2015, 08:46 PM
QUOTE(ruffstuff @ Mar 23 2015, 08:33 PM)
I don't think there will be a 980 Ti. The Titan X is essentially a 980 Ti with Titan branding and pricing. It is nothing like the previous Titan. The missing double precision says it is a 980 Ti.

When AMD's R9 390X launches, Ngreedia will have to magically come out with a 980 Ti, or maybe a Titan X Ti.

This post has been edited by Human10: Mar 23 2015, 08:48 PM
#11 | Apr 10 2015, 11:46 PM
QUOTE(PsychoHDxMachine @ Apr 10 2015, 11:37 PM)
I did a survey; the processor maybe doesn't play a big role in gaming. Some people recommend an i5 over an i7 for normal usage. I set everything to ultra, averaging 80-130 fps, around that figure. Haven't tried Unity yet; checked on YouTube, with all high settings maybe less than 100 fps. Like ngkhanmein said, your eyes can see 150 fps? I thought a GTX 980 can max out everything, so fps should be 100 and above. Need to borrow Cyclops' tools from the X-Men, then only can verify. Just curious why some can shoot so high. As long as it's more than 60 it should be okay, as long as everything is set to ultra/high. Sorry, newbie on this stuff.

In some games, processors do matter for performance. It depends on whether the performance is bottlenecked by the GPU or the CPU.
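To make the bottleneck point concrete, a toy model with invented per-frame timings: the delivered frame rate is roughly capped by whichever of the CPU or GPU takes longer per frame.

```python
# Toy bottleneck model with invented numbers, just to illustrate the point:
# a faster GPU alone doesn't help once the CPU is the slower stage per frame.

def delivered_fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Frame rate is bounded by whichever stage takes longer per frame."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# CPU-heavy game (lots of draw calls/AI): the i5 vs i7 choice matters here.
print(delivered_fps(cpu_ms_per_frame=12.0, gpu_ms_per_frame=7.0))   # ~83 fps, CPU-bound
# GPU-heavy game at ultra settings: the GTX 980 is the limiting factor instead.
print(delivered_fps(cpu_ms_per_frame=6.0, gpu_ms_per_frame=11.0))   # ~91 fps, GPU-bound
```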
#12 | Apr 10 2015, 11:48 PM
QUOTE(PsychoHDxMachine @ Apr 10 2015, 11:43 PM)
I see, means this 980 can max out all settings easily without any problem (except Shadow of Mordor). As long as it's more than 60 fps, consider it okay? Newbie on this; I thought of buying a 970 instead of a 980 because of the 3.5GB VRAM, but I chose the 980. The important thing was GST, had to rush, otherwise ++.

On 1080p, certainly. On higher resolutions, it will require more than one card.

For the 60 fps question, we can't guarantee what frame rate will feel smooth. It all depends on what your eyes perceive... Another thing to consider to further "smoothen" the game is a G-Sync monitor, have a look at it. Well, your money, your choice.
#13 | Apr 10 2015, 11:49 PM
QUOTE(PsychoHDxMachine @ Apr 10 2015, 11:47 PM)
Oh, but never mind, already bought the i5. If I want to change, it's a loss and I have to suffer extra money for GST.

Well, luckily your mobo is an H97, which means it can support Broadwell. Maybe when Skylake comes out you can consider opting for a second-hand Broadwell i7.
#14 | Apr 13 2015, 11:45 AM
QUOTE(NewbieTech @ Apr 13 2015, 10:52 AM)

Gigabyte's giveaway (as per the lowyat.net homepage) is a 4GB 960, so I guess it is either available already or coming soon? Another thing: I highly doubt 4GB of VRAM will be of much use on a GTX 960. Given the choice (and of course the higher price), I will always choose the "3.5GB" GTX 970 over a 4GB GTX 960. Nvidia's reference 960 is paired with 2GB because, at its given performance level, it will probably only utilize under 2GB most of the time. Only if you plan to SLI does 4GB make some sense, but again a single 970 pwns it easily, and I am somewhat curious how the 128-bit memory bus will hold up in SLI.

This post has been edited by Human10: Apr 13 2015, 11:50 AM
#15 | Apr 13 2015, 03:34 PM
QUOTE(Minecrafter @ Apr 13 2015, 03:20 PM)
Well, Nvidia may repeat the history of the Ti being more powerful than the Titan...

The 780 Ti is more powerful than the Titan in terms of graphics processing power. What's more, there's a high chance AMD will use the Titan X as the performance benchmark, so there's no point in Nvidia launching another 980 just to lose to the 390X, no?
#16 | Apr 13 2015, 03:46 PM
QUOTE(genjo @ Apr 13 2015, 01:08 PM)

Bro, you do know the 970 and 980 Ti are going to be in different segments, right? The 980 Ti's price will be at least double the 970's... Even if the 980 Ti launches, don't expect the 970 to drop in price by a huge margin because of it; at most RM100-200, and that's if it really drops by then. If you factor in AMD, it's a totally different story. But again, the 380X is predicted to be just another rebrand of the 290X and the targeted price isn't going to be far off the 970's. If I were Nvidia, I certainly wouldn't drop my price just for that. If you are talking about the 970's Pascal successor, going by Nvidia's trend they won't launch it before a mid-range Pascal is launched. Another thing is, we don't know if Nvidia will ever price their second top-tier card that cheap, nor with performance that close to their top card.

This post has been edited by Human10: Apr 13 2015, 03:51 PM
#17 | May 14 2015, 04:57 PM
QUOTE(adamcyr @ May 14 2015, 04:46 PM)
I'm planning to purchase the Gigabyte GTX 970 G1 because of the cooling solution. Based on the reviews on the internet, it was like 50/50: some of them give this card a good rating and some of them the other way round. So I wish to get more info from all the experts here.

Don't see anything wrong with the card. Performance-wise it delivers as much as any 970 will offer, and Gigabyte's cards usually do above average on temperatures too. The only complaint I have heard around is its enormous size and length.
#18 | May 21 2015, 12:23 AM
QUOTE(ChocChristy @ May 20 2015, 11:30 PM)
I need some opinions from GTX 970 users. I have heard from terence and ngkhanmein; I'd like to collect more data, however. I am curious about which PSU and what wattage you use for your GTX 970 rig and how long you have been using it. I am currently torn between a Capstone 650W and a Superflower 600W. From all my reading, it seems like the GTX 970 can work quite well with 500W, so it seems to me 600W is more than enough. I doubt degradation over time would be such a huge issue, would it?

[yaomingb1tchplease.jpg]

Nvidia's official spec for the GTX 970 requires only a 500W PSU for the whole system (145W for the GPU), and mind you, Nvidia always puts a large headroom in that value (500 - 145 = 355W for the rest of the system, while most Intel systems nowadays hardly ever breach 200W without the GPU...). Here's what an i7 4790K rig (without GPU) consumes on load:
http://www.legitreviews.com/intel-core-i7-...eview_143880/13

But still, I have to admit things aren't always that certain and simple. Overclocking can increase the power draw pretty significantly, so let's take a look at the power drawn by pretty much the most overclocked 970 out there:

[chart: power draw of a heavily overclocked GTX 970]

Still safely under 300W. Combined with the rest of your system, which is generally under 200W, you still have spare capacity on a 500W PSU. As for degradation, we rarely hear of PSUs failing to deliver their rated capacity because of it, but for more details you can read PSU review sites like jonnyGURU, Hardware Secrets and so on. What I can conclude for you here is that a proper 600W unit is plenty for a GTX 970 rig.

This post has been edited by Human10: May 21 2015, 12:40 AM
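Putting the same arithmetic in one place, a quick back-of-the-envelope check using the rough wattages quoted above; these are estimates, not measurements of any particular rig.

```python
# Back-of-the-envelope PSU budget check. The component wattages are the rough
# figures from the post (145W stock 970, ~300W heavily OCed 970, ~200W for a
# typical Intel system without the GPU); they are estimates only.

def psu_headroom(psu_watts: int, gpu_watts: int, rest_of_system_watts: int) -> int:
    """Spare capacity left after the estimated full-system load."""
    return psu_watts - (gpu_watts + rest_of_system_watts)

# Stock GTX 970 on the Nvidia-recommended 500W unit:
print(psu_headroom(500, 145, 200))   # 155W spare
# Heavily overclocked 970 on the 600W unit being considered:
print(psu_headroom(600, 300, 200))   # 100W spare, still comfortable
```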