

 NVIDIA GeForce Community V14

marfccy
post Jan 28 2015, 03:45 PM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


frankly, the TLDR version of RAM-gate:

> users upset cos they can't run muh 4K with 3.5GB VRAM


i do agree the GTX970 is still a whopping powerful card, just not at VRAM-intensive games
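for context, the segmentation numbers behind RAM-gate can be sketched like this (the 3.5GB/0.5GB split is from public reporting on the GTX 970, not measured here; the function name is mine):

```python
# GTX 970 memory segmentation, per public reporting on "RAM-gate":
# 3.5GB sits on the fast path, the last 0.5GB on a much slower one.
FAST_SEGMENT_GB = 3.5
SLOW_SEGMENT_GB = 0.5

def spills_into_slow_segment(vram_needed_gb: float) -> bool:
    """True when a game's VRAM footprint touches the slow 0.5GB segment."""
    return vram_needed_gb > FAST_SEGMENT_GB

print(spills_into_slow_segment(3.0))  # False: typical 1080p load stays on the fast path
print(spills_into_slow_segment(3.8))  # True: 4K-ish load, stutter risk
```

which is the whole complaint in one line: the card is fine until a game actually asks for more than 3.5GB.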
marfccy
post Jan 28 2015, 09:48 PM

QUOTE(yaphong @ Jan 28 2015, 09:38 PM)
What I don't understand is this:

Prior to this RAM-gate, everyone was like, ooh, the 970 is the best card and performs better than the 780.

After discovering that the 0.5GB of RAM is not the same fast RAM: uhh, now my 970 sucks and I badly need a refund. NVIDIA cheater, scammer.

My question is: has the performance of the GTX 970 changed after knowing about the issue? If not, then why is everyone crying for a refund? Or do people just want to take advantage of this to get a free upgrade to a GTX 980 instead?
as per my previous post

QUOTE
frankly, the TLDR version of RAM-gate:

> users upset cos they can't run muh 4K with 3.5GB VRAM


i do agree the GTX970 is still a whopping powerful card, just not at VRAM-intensive games

marfccy
post Jan 29 2015, 06:16 AM

QUOTE(gogo2 @ Jan 29 2015, 06:02 AM)
I wonder if the GTX970 will fully support DirectX 12, even though the spec says it does.

it is said even the GTX600 series will support DX12

heck, even Fermi is rumored to support DX12 sweat.gif

as for whether it supports the full feature set, i think it'll be tiered like DX11 vs DX10 vs DX9 all over again?
marfccy
post Jan 29 2015, 03:15 PM

QUOTE(ngkhanmein @ Jan 29 2015, 02:54 PM)
gsync is also a gimmick, right? hmm.gif and is AMD's 4GB really 4GB?

you haven't seen it IRL, have you?

it's not a gimmick, it works

problem is, the specific application for it is solely gaming (at least that's how i see it)
marfccy
post Jan 31 2015, 03:52 AM

QUOTE(pspslim007 @ Jan 31 2015, 02:47 AM)
hey guys, just curious: since the 970 is having that VRAM problem, is it still a good purchase if I want to buy a 970, or should I go with other cards? Thanks!

QUOTE(wongtheboy92 @ Jan 31 2015, 03:37 AM)
Me too, having the same dilemma.

just think about this

1. are you playing at >1080p resolution?
2. can you live with not maxing out settings that eat VRAM?

if you answer no and yes respectively, the GTX970 is still a darn good card, and still a bang-for-buck card too

otherwise, head over to the red camp and check out their 290X or 290s
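the two questions above boil down to a one-liner (just a sketch; the function and argument names are made up):

```python
# marfccy's two-question rule of thumb as code:
def gtx970_still_worth_it(above_1080p: bool, can_skip_vram_heavy_settings: bool) -> bool:
    # "no" to question 1 and "yes" to question 2 -> still a great buy
    return (not above_1080p) and can_skip_vram_heavy_settings

print(gtx970_still_worth_it(False, True))  # True: grab the 970
print(gtx970_still_worth_it(True, False))  # False: look at the 290/290X instead
```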

marfccy
post Jan 31 2015, 01:23 PM

QUOTE(Najmods @ Jan 31 2015, 12:51 PM)
I've been switching back and forth between Nvidia and AMD, and I don't experience any problems with either. Don't get too attached to one company for no reason.

for my case, it's simple:

unless AMD releases a card that has a good reference cooler, i'm sticking with Nvidia laugh.gif
marfccy
post Jan 31 2015, 02:03 PM

QUOTE(Najmods @ Jan 31 2015, 01:53 PM)
Reference cooler? There are better 3rd-party coolers that do a better job for roughly the same money, so that's a moot point to be biased over. And speaking of reference coolers, not the entire lineup uses the same great cooler; the GTX 760, for example, uses a crappy stock cooler that never touches its boost clock the longer you play.

i prefer the ref cooler's method of dumping heat outside the case rather than letting it recirculate inside

3rd-party coolers are indeed superior in every other way, just not in the aspect of exhausting heat out of the casing

my old GTX660's stock reference cooler did a pretty decent job, even allowing a minor OC while staying well within 76-82C. but that's a 140W card, so it might not count for much.

even so, my current 780's cooling performance is rather impressive so far for a reference cooler: 76-78C at room temp in Shadow of Mordor at 1440p, everything maxed except textures, and that's with an OC of +100/+200MHz (reaching a boost clock of 1084MHz). in an air-conditioned room at 24C ambient, it drops to 73C max, 69C average

unless AMD is capable of giving customers a reference cooler on par with Nvidia's TITAN-style cooler, i have no choice but to stick with NGreedia
marfccy
post Jan 31 2015, 03:50 PM

QUOTE(terradrive @ Jan 31 2015, 02:43 PM)
You have a Carbide Air 540, there should be no problem using open-cooler cards shocking.gif

not with the stock fans, there are still hot spots in the case even with the 2 stock AF140s

i changed to 3x 120mm Silverstone APs

works like magic, no hot spots noticed so far
marfccy
post Feb 4 2015, 09:32 PM

QUOTE(terradrive @ Feb 4 2015, 06:41 PM)
The GTX 780 launched at $650 MSRP while the GTX 980 was $550, but back in the 780's day the exchange rate was 3.2+ ringgit per USD.

i think if they do release a 980 Ti, it'll take over the price point the 780 Ti had when it first launched
marfccy
post Feb 8 2015, 03:34 AM

QUOTE(Moogle Stiltzkin @ Feb 8 2015, 03:03 AM)
exactly. besides, if we look at the performance charts, every next gen (the usual 2-year product update) only brings a performance change of roughly 15%

only some of their products had a huge leap in performance over the immediately previous generation, especially when it was a new architecture.
but it's not only the fps % increments to look at; there are other feature sets to take note of:

- DirectX 12 (though not sure how soon newer games will start supporting this)
- G-Sync support (still too expensive to own a monitor with a G-Sync module. The only other alternative is FreeSync on ATI cards, but I don't think their tech is as good as G-Sync)
- HBM (ATI's products will be getting this roughly a year ahead of Nvidia, because Nvidia's HMC plan for Volta didn't pan out, so they switched to HBM as well; that's why they're later than ATI this time around). this tech vastly increases memory bandwidth, which has been a long-time bottleneck on GPUs. it's presumed Volta will use 2nd-gen HBM with even higher bandwidth, possibly double if the rumors are to be believed.
- unified memory (sounds good, could be a game changer)
- graphics technologies to improve visuals (PhysX, GameWorks on Nvidia)

another interesting development: they're now saying Pascal and future GPUs could pool VRAM across an SLI configuration. before, VRAM wasn't a single pool, but from Pascal onwards it will be, with some rules attached; games need to be coded to support it.

tbh both are similar; they're both a form of adaptive v-sync. it's just that FreeSync leaves the work to the GPU and display controller, while Nvidia's G-Sync uses a dedicated module inside the monitor

prolly explains why Nvidia charges extra for the module
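the fixed-refresh vs adaptive-refresh difference can be shown with a little arithmetic (a rough illustration of my own, not anything from Nvidia's or AMD's docs):

```python
# Fixed 60Hz v-sync vs adaptive sync (G-Sync/FreeSync): with a fixed
# refresh, a frame that misses a refresh tick waits for the next one;
# with adaptive sync the panel refreshes the moment the frame is ready.
import math

def fixed_vsync_display_ms(render_ms: float, refresh_hz: float = 60.0) -> float:
    """Time until the frame actually shows: the next refresh tick."""
    tick_ms = 1000.0 / refresh_hz
    return math.ceil(render_ms / tick_ms) * tick_ms

def adaptive_sync_display_ms(render_ms: float) -> float:
    """Panel refreshes as soon as the frame is done."""
    return render_ms

# A 25ms frame (40 fps) on a 60Hz panel:
print(fixed_vsync_display_ms(25))    # ~33.3ms -> an effective 30 fps cadence
print(adaptive_sync_display_ms(25))  # 25ms -> smooth 40 fps
```

same frame, but under fixed v-sync it gets held back to the 30 fps cadence, which is the judder adaptive sync removes.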
marfccy
post Feb 9 2015, 01:19 AM

my ATi Radeon X300 died after 8 years of service [2004-2012]
marfccy
post Feb 9 2015, 03:38 PM

QUOTE(Moogle Stiltzkin @ Feb 9 2015, 11:32 AM)
[spoilered quote, content hidden]

yep, pretty much sums up similar to what ive read so far

GSync is superior due to the module itself
marfccy
post Feb 9 2015, 04:02 PM

QUOTE(Moogle Stiltzkin @ Feb 9 2015, 03:43 PM)
yeah, but the cost.... even I don't have a G-Sync monitor, unfortunately cry.gif

i use a Dell U2413 24'' GB-R LED AH-IPS widescreen
http://www.tftcentral.co.uk/reviews/dell_u2413.htm
mostly because I watch anime and play games, so a TN panel would suck for me. A pro gamer would be more biased towards a TN panel with a higher 120-144Hz refresh rate coupled with a G-Sync module. Can't say I blame them, but we each have our own priorities nod.gif

so it's either an AH-IPS or a TN G-Sync high-refresh-rate monitor, both for roughly the same price (both expensive, and you're still sacrificing one thing for another) rclxub.gif

well, remember what they said: high quality IPS panels don't exactly come cheap

not to mention the response times are unacceptable for competitive gamers

frankly, G-Sync to me is good, but for a very specific purpose

i won't be getting it at all unless the monitor + module is affordable
marfccy
post Feb 11 2015, 04:38 AM

QUOTE(goldfries @ Feb 11 2015, 01:51 AM)
Looks like it is less resource-hogging compared to FRAPS.

What's the recorded video size? The ones from FRAPS are huge, which is a problem when one games on an SSD.

i record 1080p at 60FPS at the highest bitrate; a 5min video takes about 1.5-2GB
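that file size checks out with simple bitrate math (assuming roughly 50 Mbps, which is my own guess at Shadowplay's "highest" 1080p60 bitrate, not something stated above):

```python
# Recording size from bitrate: megabits/s * seconds -> gigabytes.
def recording_size_gb(bitrate_mbps: float, seconds: float) -> float:
    # 8 bits per byte, 1000 MB per GB (decimal units, like most bitrates)
    return bitrate_mbps * seconds / 8 / 1000

# 5 minutes at an assumed 50 Mbps:
print(recording_size_gb(50, 5 * 60))  # 1.875 GB, right in the quoted 1.5-2GB range
```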
marfccy
post Feb 11 2015, 09:56 PM

QUOTE(amxpayne67 @ Feb 11 2015, 04:37 PM)
I noticed that my Shadowplay recordings don't include the overlay, even though it's enabled in the Shadowplay settings. It should be able to show the FPS counter and webcam overlay too.

you can set them

i have the FPS counter on mine; i disabled the webcam overlay as i don't have a webcam anyway
marfccy
post Feb 12 2015, 03:41 AM

all this talk about madVR, ReClock, MPC-HC... i'm using MPC-HC with KCP + SVP

stresses my card pretty darn hard too, 58C on normal video playback sweat.gif

and the lack of guides makes it hard for me to tweak the madVR settings
marfccy
post Feb 12 2015, 02:48 PM

QUOTE(Moogle Stiltzkin @ Feb 12 2015, 09:15 AM)
thumbup.gif
i still don't understand why people still swear by VLC when KCP has obviously become the best easy option sweat.gif you can still tweak it further, sure, but even the default presets are already quite good.

i used to use VLC, so i can see the appeal

it's totally hassle-free, just install and play

but frankly, MPC-HC is better in terms of quality
marfccy
post Mar 2 2015, 03:28 AM

QUOTE(DeepMemory @ Mar 2 2015, 03:05 AM)
I was asking about the GPU coolers.
It is actually a normal temp due to the cooler design. I have also asked Cyntrix about this and they say it's normal. But I just wonder how such an efficient GPU can reach temps of 83 degrees. I want to try switching the coolers, since the mounting holes on my Asus HD7850 and my Zotac GTX 960 look similar.

the holes may look similar, but that doesn't mean the cooler will fit. being off by small things like 1-2mm means the cooler can't mount on the PCB

also, the non-Titan reference cooler is, well, bad. it gets the job done, but doesn't run that cool

but do check your casing airflow as well. i used to have a reference GTX660, and max temps were only 82C (acceptable)
marfccy
post Mar 2 2015, 03:53 PM

QUOTE(DeepMemory @ Mar 2 2015, 11:11 AM)
Well, thing is, it's not a reference cooler. Here is the link. My previous Asus HD7850 maxed out at 63 degrees though, so I don't think my casing airflow is the problem.

hmm.gif

well, Zotac coolers were never known as darn good ones anyway

but my point still stands: the PCB may not fit. ASUS custom-designs their PCBs a lot (or maybe mostly on the high-end stuff only)
marfccy
post Mar 2 2015, 10:36 PM

QUOTE(DeepMemory @ Mar 2 2015, 08:23 PM)
Nope, mine is the normal version. I agree with you that the GPU should not reach 80 degrees, since this is a very power-efficient card. hmm.gif

that is still the DCU II cooler; TOP usually just means a higher factory overclock smile.gif
