NVIDIA GeForce Community V14
#21 | Jan 28 2015, 03:45 PM | Senior Member | 4,254 posts | Joined: Nov 2011
frankly, the TL;DR version of RAM-gate:
> users are upset because they cannot do their 4K with only 3.5GB of fast VRAM

i do agree with the parts where the GTX 970 is still a whopping powerful card, just not at VRAM-intensive games.
#22 | Jan 28 2015, 09:48 PM | Senior Member | 4,254 posts | Joined: Nov 2011
QUOTE(yaphong @ Jan 28 2015, 09:38 PM)
What I don't understand is this; as per my previous post: prior to this RAM-gate, everyone was saying the 970 is the best card and performs better than the 780. After discovering that the 0.5GB is not the same fast RAM, suddenly it's "my 970 sucks and I badly need a refund, NVIDIA is a liar and a scammer." My question is: has the performance of the GTX 970 changed after learning of the issue? If not, why is everyone crying for a refund? Or did people just want to take advantage of this to get a free upgrade to a GTX 980 instead?

quoting my own earlier post:

QUOTE
frankly, the TL;DR version of RAM-gate: users are upset because they cannot do their 4K with only 3.5GB of fast VRAM. i do agree with the parts where the GTX 970 is still a whopping powerful card, just not at VRAM-intensive games.
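(side note for anyone who would rather verify than argue: the slow-segment claim is testable. below is a minimal sketch in the spirit of the community CUDA microbenchmarks from RAM-gate, not the exact tool that circulated; it assumes PyCUDA is installed and the card is otherwise idle. it fills VRAM in 128 MiB chunks and times on-device copies into each chunk; on a GTX 970, chunks landing in the last ~0.5GB should report far lower bandwidth.)

```python
# Minimal VRAM bandwidth probe, a sketch in the spirit of the community
# RAM-gate microbenchmarks (assumes PyCUDA; not the exact tool that
# circulated). Fill the card in 128 MiB chunks, then time on-device
# copies into each chunk; on a GTX 970 the last ~0.5 GB should be slow.
import pycuda.autoinit            # creates a CUDA context on GPU 0
import pycuda.driver as drv

CHUNK = 128 * 1024 * 1024         # 128 MiB per chunk
REPS = 20                         # repeat copies for a stable timing

chunks = []
while True:                       # grab VRAM until the driver refuses
    try:
        chunks.append(drv.mem_alloc(CHUNK))
    except drv.MemoryError:
        break

scratch = chunks[0]               # copy source, lives in the fast segment
start, end = drv.Event(), drv.Event()

for i, buf in enumerate(chunks[1:], start=1):
    start.record()
    for _ in range(REPS):
        drv.memcpy_dtod(buf, scratch, CHUNK)
    end.record()
    end.synchronize()
    ms = start.time_till(end)                          # milliseconds
    gbps = (2.0 * REPS * CHUNK) / (ms / 1000.0) / 1e9  # read + write
    print("chunk %2d (~%4d MiB in): %6.1f GB/s" % (i, i * 128, gbps))
```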
#25 | Jan 31 2015, 03:52 AM | Senior Member | 4,254 posts | Joined: Nov 2011
QUOTE(pspslim007 @ Jan 31 2015, 02:47 AM)
hey guys, just curious: since the 970 is having that VRAM problem, is it still a good purchase, or should I go with other cards? Thanks!

QUOTE(wongtheboy92 @ Jan 31 2015, 03:37 AM)

just think about this:
1. are you playing at >1080p resolution?
2. can you live with not maxing any setting that eats VRAM?

if you answer no and yes respectively to the above, the GTX 970 is still a darn good card, and still a bang-for-buck card too (see the sketch below). otherwise, head to the red camp and check out their 290X or 290s.
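(the two questions above spelled out as an explicit rule; a toy sketch, the function and its names are mine, and the recommendation reflects the Jan 2015 pricing picture:)

```python
# The buying checklist above as an explicit decision rule (toy sketch;
# names are mine, recommendation assumes early-2015 prices).
def pick_card(res_above_1080p: bool, will_dial_back_vram_settings: bool) -> str:
    if not res_above_1080p and will_dial_back_vram_settings:
        return "GTX 970: still a darn good bang-for-buck card"
    return "check the red camp's R9 290 / 290X instead"

print(pick_card(res_above_1080p=False, will_dial_back_vram_settings=True))
```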
#27 | Jan 31 2015, 02:03 PM | Senior Member | 4,254 posts | Joined: Nov 2011
QUOTE(Najmods @ Jan 31 2015, 01:53 PM)
Reference cooler? There are better 3rd-party coolers that do a better job for roughly the same money, so that is a moot point to bias to one side. Speaking of reference coolers, not the entire lineup uses the same great cooler; the GTX 760, for example, uses a crappy stock cooler that never touches its boost clock the longer you play.

i prefer the reference cooler's method of dumping heat outside the case rather than letting it recirculate inside. 3rd-party coolers are indeed superior in most ways, but not at exhausting heat out of the casing.

my old GTX 660's stock reference cooler did a pretty decent job, even allowing a minor OC while staying well within 76-82C. but that's a 140W card, so it might not count for much. even so, my current 780's cooling performance is rather impressive for a reference cooler: 76-78C at room temperature in Shadow of Mordor at 1440p, everything maxed except textures, and that's with a +100/+200MHz OC (reaching a boost clock of 1084MHz). with the AC at 24C ambient, it drops to 73C max, 69C average.

unless AMD can give customers a reference cooler as capable as Nvidia's TITAN-style cooler, i have no choice but to stick with NGreedia.
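(if anyone wants to check whether a cooler actually holds boost clock instead of guessing from fan noise: NVIDIA exposes temperature and clock readouts through NVML. a small logging sketch, assuming the pynvml bindings are installed and a single GPU at index 0:)

```python
# Log GPU temperature and core clock once per second while gaming, to
# see whether the cooler lets the card hold its boost clock.
# Assumes the pynvml bindings (nvidia-ml-py) and one GPU at index 0.
import time
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetTemperature, nvmlDeviceGetClockInfo,
                    NVML_TEMPERATURE_GPU, NVML_CLOCK_GRAPHICS)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)
    while True:
        temp = nvmlDeviceGetTemperature(gpu, NVML_TEMPERATURE_GPU)  # Celsius
        core = nvmlDeviceGetClockInfo(gpu, NVML_CLOCK_GRAPHICS)     # MHz
        print("%s  %3d C  %4d MHz" % (time.strftime("%H:%M:%S"), temp, core))
        time.sleep(1)
except KeyboardInterrupt:
    pass        # Ctrl+C ends the log
finally:
    nvmlShutdown()
```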
#30 | Feb 8 2015, 03:34 AM | Senior Member | 4,254 posts | Joined: Nov 2011
QUOTE(Moogle Stiltzkin @ Feb 8 2015, 03:03 AM)
exactly. besides, if we look at the performance charts, every next gen (the usual 2-year product update) brings a performance change of roughly 15%. only some of their products had a huge leap over the immediately previous generation, especially on a new architecture. but it's not only the fps % increments to look at; there are other feature sets to take note of:
- DirectX 12 (though not sure how soon newer games will start using it)
- G-Sync support (still too expensive to own a monitor with a G-Sync module; the only alternative is FreeSync on ATI cards, but i don't think their tech is as good as G-Sync)
- HBM (ATI will be getting this roughly a year ahead of Nvidia, because Nvidia's Volta HMC plan didn't pan out and they switched to HBM as well). this tech vastly increases memory bandwidth, which has long been a bottleneck on GPUs. it's presumed Volta will use 2nd-gen HBM with even higher bandwidth, possibly double, if the rumors are to be believed.
- unified memory (sounds good, could be a game changer)
- graphics technologies that improve visuals (PhysX and GameWorks on Nvidia)
another interesting development: they're now saying Pascal and future GPUs could pool VRAM across an SLI configuration. before, VRAM wasn't a single pool, but from Pascal onwards it will be, with some rules; games need to be coded to support it or something.

tbh both are similar; they're each a form of adaptive v-sync. it's just that FreeSync leaves the work to the GPU, whereas Nvidia's G-Sync relies on an external module in the monitor. probably explains why Nvidia charges extra for the module.
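(for scale on the HBM point: memory bandwidth is just effective per-pin rate times bus width. a back-of-envelope comparison with illustrative numbers of my own, a 7 Gbps 256-bit GDDR5 card versus first-gen 4096-bit 1 Gbps HBM; these figures are assumptions, not from the quote:)

```python
# Back-of-envelope memory bandwidth: effective per-pin rate (Gbps)
# times bus width in bits, divided by 8 bits per byte.
def bandwidth_gbs(effective_gbps: float, bus_bits: int) -> float:
    return effective_gbps * bus_bits / 8

# Illustrative 2015-era numbers (my assumptions, not from the thread):
print(bandwidth_gbs(7.0, 256))    # 256-bit 7 Gbps GDDR5 -> 224.0 GB/s
print(bandwidth_gbs(1.0, 4096))   # 4096-bit 1 Gbps HBM1 -> 512.0 GB/s
```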
#31 | Feb 9 2015, 01:19 AM | Senior Member | 4,254 posts | Joined: Nov 2011
my ATi Radeon X300 died after 8 years of service [2004-2012]
#33 | Feb 9 2015, 04:02 PM | Senior Member | 4,254 posts | Joined: Nov 2011
QUOTE(Moogle Stiltzkin @ Feb 9 2015, 03:43 PM)
yeah but the cost.... even i don't have a gsync monitor unfortunately

well, remember what they said: high-quality IPS panels don't exactly come cheap. i use a Dell U2413 24'' GB-r LED AH-IPS widescreen (http://www.tftcentral.co.uk/reviews/dell_u2413.htm), mostly because i watch anime and play games; a TN panel would suck for me. a pro gamer would be more biased toward a TN panel with a higher 120-144Hz refresh rate coupled with a G-Sync module. can't say i blame them; we each have our own priorities.

so it's either an AH-IPS or a TN G-Sync high-refresh monitor, both at roughly the same price (both expensive, and you're still sacrificing one thing for another). not to mention IPS response times are unacceptable for those gamers.

frankly, G-Sync to me is good, but for a very specific purpose. i won't be getting it at all unless the monitor + module itself becomes affordable.
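(the arithmetic behind the response-time complaint; the ~8 ms grey-to-grey figure for IPS of that era is a commonly quoted ballpark I'm assuming, not a number from the review:)

```python
# Frame interval vs panel response: at 120-144 Hz the time budget per
# frame lands in the same ballpark as typical IPS grey-to-grey response.
IPS_GTG_MS = 8.0            # commonly quoted ballpark for 2015-era IPS

for hz in (60, 120, 144):
    frame_ms = 1000.0 / hz
    print("%3d Hz -> %5.2f ms/frame (IPS GtG ~%.0f ms)" %
          (hz, frame_ms, IPS_GTG_MS))
# 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms, 144 Hz -> 6.94 ms: at 144 Hz a
# slow panel is still transitioning when the next frame arrives.
```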
#36 | Feb 12 2015, 03:41 AM | Senior Member | 4,254 posts | Joined: Nov 2011
all this talk about madVR, ReClock, MPC-HC... i'm using MPC-HC via KCP, plus SVP.
it stresses my card pretty hard too, 58C on normal video playback, and the lack of guides makes it hard for me to tweak the madVR settings.
#37 | Feb 12 2015, 02:48 PM | Senior Member | 4,254 posts | Joined: Nov 2011
QUOTE(Moogle Stiltzkin @ Feb 12 2015, 09:15 AM)
i still don't understand why people still swear by VLC when KCP has obviously become the best easy option. it's totally hassle-free, just install and play.

but frankly, MPC-HC is better in terms of quality.
#38 | Mar 2 2015, 03:28 AM | Senior Member | 4,254 posts | Joined: Nov 2011
QUOTE(DeepMemory @ Mar 2 2015, 03:05 AM)
I was asking about the GPU coolers. It is actually the normal temp due to the cooler design; I have also asked Cyntrix about this and they say it's normal. But I just think, how can such an efficient GPU reach temps of 83 degrees? I just want to try switching the coolers, since the mounting holes on my Asus HD7850 and my Zotac GTX 960 looked similar.

they may look similar, but they aren't. small differences, like holes 1-2mm off, and the cooler cannot fit the PCB.

also, the non-Titan reference cooler is, well, bad. it gets the job done, but doesn't run that cool. do check your casing airflow as well: i used to have a reference GTX 660, and max temps were only 82C (acceptable).
#39 | Mar 2 2015, 03:53 PM | Senior Member | 4,254 posts | Joined: Nov 2011
QUOTE(DeepMemory @ Mar 2 2015, 11:11 AM)
Well, the thing is, it's not a reference cooler. Here is the link My previous Asus HD7850 maxed at 63 degrees, so I don't think my casing airflow is the problem.

well, Zotac coolers were never known as particularly good ones anyway. but my point still stands: the PCB may not fit. ASUS custom-designs their PCBs a lot (or maybe mostly on the high-end stuff only).
#40 | Mar 2 2015, 10:36 PM | Senior Member | 4,254 posts | Joined: Nov 2011
QUOTE(DeepMemory @ Mar 2 2015, 08:23 PM)
Nope, mine is the normal version. I agree with you that the GPU should not reach 80 degrees, since this is a very power-efficient card.

that is still the DCU II cooler; TOP usually just means a higher factory overclock.
Topic Closed