 NVIDIA GeForce Community V14

marfccy
post Jan 7 2015, 09:01 PM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(Minecrafter @ Jan 7 2015, 12:56 PM)
LOL! My bad, I always get the GM200 name mixed up. doh.gif
There's no good competition from NVIDIA around the RM500-600 price range. And no, only Z97 will help with dual cards, due to x8 speed on 2 PCI-E lanes.
*
only SLI requires a minimum of x8 per GPU to work

Crossfire will work even with just x4
marfccy
post Jan 9 2015, 06:38 AM



just bought 3DMark on Steam at a discount, man, Fire Strike is intense
[screenshot: Fire Strike result]

CPU score drags the entire thing down laugh.gif
marfccy
post Jan 9 2015, 03:30 PM



QUOTE(rav3n82 @ Jan 9 2015, 09:33 AM)
Just curious, is your GTX 780 running at stock or overclocked speeds?

Coming from my previous 780 AMP! to the 970, I can see that there are improvements in the Graphics Score in 3DMark; however, in Unigine it doesn't perform as well. sweat.gif
*
purely stock, 863/1502MHz

have yet to test it with a 100MHz OC

wait, the 970 drops score in Unigine compared to the 780? hmm.gif
marfccy
post Jan 9 2015, 09:22 PM



QUOTE(rav3n82 @ Jan 9 2015, 05:57 PM)
Yup, perhaps the GM204 doesn't handle Unigine very well. tongue.gif CPU running at stock speeds, monitor at 1080p.

GTX 780 (clockspeed 1006MHz) - FPS 58.5, Score 1473

GTX 970 (clockspeed 1152MHz) - FPS 55.2, Score 1391
*
a mere 3FPS though; i suppose new drivers might fix it

and i thought only the AMD Omega driver caused drops in synthetic benchmarks hmm.gif

QUOTE(alfiejr @ Jan 9 2015, 06:16 PM)
overclocking the gpu increases the score by quite a lot  brows.gif
*
i'll do that when i'm back; just bought it on sale this morning, only tested Sky Diver and Fire Strike

even on Fire Strike normal, my system is chugging. 4K Fire Strike will kill me sweat.gif
marfccy
post Jan 10 2015, 05:21 AM



pumped the GPU to 963/1602MHz and ran Fire Strike again
[screenshot in spoiler: Fire Strike result]

with Shadowplay turned off, i gained a couple more points
[screenshot in spoiler: Fire Strike result]

marfccy
post Jan 10 2015, 02:53 PM



QUOTE(alfiejr @ Jan 10 2015, 01:06 PM)
nice increase, mine is around 92xx. I think having a beefy CPU like a Core i7 will boost the score even more sweat.gif
*
yeah, as the pictures above show, the Physics test turns my CPU upside down laugh.gif

EDIT: though, even with a Sandy Bridge Core i5, i'm still surprised at its performance level

QUOTE(shikimori @ Jan 10 2015, 01:59 PM)
nice bro, for some reason if I OC and add more voltage my PC starts to turn off (a complete shutdown, no restart).

I suspect I will be needing a new PSU soon. Temps look okay though.

This is the last push I managed before:
[screenshot in spoiler: benchmark result]

*
yeah man, could be a PSU problem. i suggest you get a new one if you really want to push it to the maximum

This post has been edited by marfccy: Jan 10 2015, 02:56 PM
marfccy
post Jan 10 2015, 09:43 PM



QUOTE(shikimori @ Jan 10 2015, 09:17 PM)
is this any good, or can it go further?

[screenshot: benchmark result]

LOL, right after the benchmark ended my NVIDIA driver stopped working... Dayum, I really need to amp up the voltage, but then again that will cause a shutdown cry.gif
for a single card I suppose it serves me well, but now I have 2 970s; despite the low power consumption I still have issues. Maybe, just like you said, this cheapo AcBel is a no-go

can you give me some PSU recommendations? I got plenty of HDDs, 2 SSDs + a soundcard T_T OCed my proc as well.
*
oh hey, that benchmark laugh.gif

i tried it too
[screenshot in spoiler: benchmark result]


for dual cards, you can't go wrong with the Seasonic G/X series, CM V series, and Corsair AX/HX series
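Before picking one of those units, a rough wattage estimate helps. This is only a back-of-the-envelope sketch; the TDP figures below are ballpark assumptions for a dual-970 build, not numbers from this thread:

```python
# Rough PSU sizing sketch for a dual-GTX-970 build.
# All wattage figures are ballpark TDP assumptions, not measured draws.
def estimate_psu_watts(parts, headroom=0.4):
    """Sum component TDPs, then add headroom for OC and capacitor aging."""
    total = sum(parts.values())
    return total * (1 + headroom)

parts = {
    "gtx_970_x2":  2 * 145,  # ~145W TDP each
    "cpu_oc":      120,      # overclocked quad-core estimate
    "board_ram":   60,
    "drives_fans": 50,       # several HDDs, 2 SSDs, sound card, fans
}

print(round(estimate_psu_watts(parts)))  # ~728 -> a quality 750W unit fits
```

The 40% headroom is a conservative choice so the PSU sits near its efficiency sweet spot instead of running flat out.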
marfccy
post Jan 15 2015, 09:05 PM



QUOTE(ssxcool @ Jan 15 2015, 08:27 PM)
then why do i not get above 60 fps using very high settings?

i feel so cheated by the AnandTech GPU bench cry.gif cry.gif
*
try checking what CPU AnandTech uses first sweat.gif


marfccy
post Jan 17 2015, 09:21 PM



QUOTE(chocobo7779 @ Jan 17 2015, 04:21 PM)
Let people buy up all their 970s until there's a shortage... then use the opportunity to raise prices... brows.gif brows.gif

Then people are forced to get the weaker 960s... brows.gif

Note: this is speculation tongue.gif
*
and then Nvidia trolls later when the true Maxwell cards are released whistling.gif
marfccy
post Jan 18 2015, 08:33 PM



QUOTE(SSJBen @ Jan 18 2015, 12:59 PM)
Tested a couple of games yesterday, namely DA:I and Alien Isolation.

As we all know, DA:I is a bit hit and miss when it comes to PC optimisation; it's not entirely solid, but for the most part it runs decently. My 970 topped out at 3.7GB (3702MB to be specific) of VRAM at 1440p, max settings. Switching down to 1080p, max settings, it hovered around 3680MB. However, I did get stuttering once VRAM usage went above 3.5GB in DA:I, which is what many others on the internet have already reported.
I wouldn't say the stuttering is terrible to the point of being unplayable, but it can get annoying. Fortunately DA:I isn't a game where you need twitch reflexes, but if this issue crops up in other games it'll make for quite a bad gaming experience.

Alien Isolation, on the other hand, runs on pretty much any machine and is an example of great optimisation in PC games. Obviously the game isn't stressful enough to use 4GB of VRAM, but what I did was downsample from 4K to 1080p. VRAM usage was only around 3GB. Obviously no stuttering, but I just wanted to try a game where everything is well and dandy.

Will test a few other games later.
The 960 is expected to be near 770 performance. Given Maxwell's track record of OC ability, it's quite a given that the 960 will surpass the 770. It's an assumption of course, but I think at this point it's a safe one.
*
it's always weird how some older-gen games look way better than current ones, yet eat up less VRAM sweat.gif

for example, Crysis 3. it looks superb and beautiful, and it doesn't even chug on <3GB VRAM cards
marfccy
post Jan 18 2015, 09:50 PM



QUOTE(terradrive @ Jan 18 2015, 09:43 PM)
I actually love the newer memory-chugging games, to be honest.

For example, Far Cry 4's textures looked superb and detailed compared to Far Cry 3 on the same engine. And both run at about the same speed on the same hardware.
*
i like it as well, but it's sort of an unnecessary use of GPU resources imo

texture-wise, there are things we don't notice much in the heat of gaming; unless the textures are really bad or i'm actively looking for flaws, it's not noticeable to me
marfccy
post Jan 19 2015, 01:06 AM



QUOTE(SSJBen @ Jan 19 2015, 12:48 AM)
For years, developers have been finding ways to compress textures and optimise fillrate for games. Suddenly, devs just said "oh wadaheck" when consoles got 8GB of RAM, and now we're getting all these games which take a gigantic step up in VRAM just because, "why not?".
*
which gave GPU makers the idea of "oh hey, let's make a card with high VRAM just because we can, and people will still pay for it. then we can release newer cards with slightly more VRAM just because 'consoles demand it'"

the idea of utilising more VRAM for graphics is good, but let's face it: high-resolution texture packs aren't that important while 4K is still in its infancy and yet to be widely adopted. so extremely high VRAM is not required for now

not to mention, devs will also go "ahh, why the hell compress?" and then there's a massive influx of uncompressed textures that make GPUs suffer from lack of VRAM despite being rather powerful


TLDR version: unless 8GB VRAM GPUs are the standard, this feels like another scheme to make us upgrade parts too often even though the GPU itself is more than capable; the only issue is VRAM throttling performance

This post has been edited by marfccy: Jan 19 2015, 01:08 AM
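The compression point above is easy to put in numbers. A sketch of what one 4096x4096 texture costs in VRAM, uncompressed versus block-compressed (the bytes-per-pixel figures are the standard GPU format sizes; the single-texture scenario is illustrative, not from any game in this thread):

```python
# VRAM cost of a single 4096x4096 texture.
# Bytes per pixel for common GPU formats: RGBA8 = 4, BC7/DXT5 = 1,
# BC1/DXT1 = 0.5. A full mipmap chain adds roughly 1/3 on top.
def texture_mib(width, height, bytes_per_pixel, mipmaps=True):
    size = width * height * bytes_per_pixel
    if mipmaps:
        size = size * 4 / 3
    return size / (1024 ** 2)

raw = texture_mib(4096, 4096, 4)    # uncompressed RGBA8
bc1 = texture_mib(4096, 4096, 0.5)  # BC1/DXT1, 8:1 versus RGBA8
print(f"{raw:.1f} MiB vs {bc1:.1f} MiB")  # 85.3 MiB vs 10.7 MiB
```

At an 8:1 ratio, a few dozen uncompressed 4K textures are enough to overflow a 3-4GB card, which is exactly the "why the hell compress?" problem described above.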
marfccy
post Jan 19 2015, 10:01 PM



QUOTE(amxpayne67 @ Jan 19 2015, 09:55 AM)
I thought only multi-monitor setups use a lot of VRAM? First time I've heard of VRAM throttling
*
it's due to the massive spam of uncompressed textures by devs, which overloads the VRAM and leaves it struggling

not so long ago, even 2GB was more than enough for 1440p

but with the introduction of 8GB of unified memory in the PS4, devs are now taking the opportunity to use more VRAM

so despite the GPU having more than enough power to deal with the game, you get frame drops due to lack of VRAM
marfccy
post Jan 21 2015, 03:37 PM



QUOTE(terradrive @ Jan 21 2015, 09:05 AM)
If you have a lower VRAM size, just set the game to a lower texture setting, that's all...

But we are in 2015 already. AMD or Nvidia should push 8GB VRAM for their top-of-the-line cards now, since VRAM isn't that expensive anymore.

[screenshot in spoiler]

The GTX 960 is a dud by Nvidia; it's 2015, yet the performance is like a GTX 760...

The performance increase from OC is not much compared to the GTX 780 or GTX 970. And it's interesting that the supposedly lousy overclocker R9 290 (as everybody has been saying) had a performance increase as good as the 780/970.

Mid-range gamers are still stuck for years with lousy performance lol. The most worthwhile GPUs are still over RM1k laugh.gif
*
nah, my negativity about higher VRAM is more about how older games can still have awesome textures yet don't max out the VRAM on their GPUs
marfccy
post Jan 21 2015, 09:21 PM



QUOTE(terradrive @ Jan 21 2015, 04:46 PM)
Older games' textures aren't that awesome compared to the latest games. Most of them are so blurry, including those in Titanfall.
*
well, SSJBen sort of answered what i wanted to answer sweat.gif

MP-focused games should be avoided as examples, as they were never meant for "high end graphics" anyway; they don't cater to that

even my Skyrim with a 4K-res texture pack, filled with a bucketful of other graphical mods, barely taxes 3GB VRAM cards. and it still looks superb. their stock textures were nasty as fuk btw, 128x128 reso max doh.gif

and then suddenly Shadow Of Mordor comes along with its Ultra Texture DLC. i tried it, and the game kept crashing due to insufficient memory (GPU-Z showed it maxing out at 3+GB VRAM and being throttled by that)

and at visual-checking time, i see no discernible differences (granted, i only downsampled from 2880x1620), as their High setting is plenty superb already

the trend of constantly spamming uncompressed textures just because "muh next gen" is getting out of hand these days. Watch Dogs looks like crap, yet requires >3GB VRAM. what? how???

if this trend continues just because of the "more is better" marketing tactic, GPU manufacturers cannot catch up, as 8GB of GDDR5 is not cheap or easy to mass produce yet (there's a reason workstation cards with 12GB VRAM are obnoxiously priced)


in short, i like the direction the market is going, but only if the higher VRAM is utilised efficiently.
marfccy
post Jan 24 2015, 05:12 PM



QUOTE(THE ZUL @ Jan 24 2015, 03:39 PM)
hi,

my Leadtek GTX 670 has shown some strange behaviour lately.
the temperature spikes to 90-100C in games (Watch Dogs, Far Cry 4, FIFA 15 and Dragon Age)

i thought my normal load temp was around 75-80C.

idle temp is 40-45C.

i already took it apart, cleared the dust and applied new thermal paste,
but it's still the same.

i don't OC it.

did my graphics card go bad?
or is the PSU faulty?

i bought it in 2012. cry.gif rclxub.gif
*
get one of those GPU monitoring tools, set the fan curve as you like, then run a game and check if the temperature goes over 80C again

i think this is one of those cases where the card just isn't being cooled properly
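The "fan curve" those monitoring tools expose is just a mapping from GPU temperature to fan duty cycle, interpolated between a few breakpoints. A minimal sketch of the idea; the breakpoints below are made-up examples, not tuned recommendations:

```python
# Minimal linear fan-curve sketch: map GPU temperature (C) to fan duty
# cycle (%), the same idea tools like MSI Afterburner expose as a curve.
# Breakpoints are illustrative examples only.
CURVE = [(40, 30), (60, 50), (80, 80), (90, 100)]  # (temp C, fan %)

def fan_percent(temp_c):
    """Interpolate fan % linearly between curve points; clamp at the ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

print(fan_percent(70))  # -> 65.0, halfway between 50% @ 60C and 80% @ 80C
```

A steeper curve trades noise for headroom: ramping the fan harder before 80C is exactly the fix suggested above for a card that isn't being cooled properly.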
marfccy
post Jan 25 2015, 05:22 PM



QUOTE(chocobo7779 @ Jan 25 2015, 04:55 PM)
So do we still recommend the 970? Since the VRAM issue seems to be turning out to be a hardware issue. sweat.gif
*
advise them not to go for the GTX 970 for now, until the problem is resolved

otherwise we'll get bombed back later :3
marfccy
post Jan 25 2015, 05:28 PM



QUOTE(chocobo7779 @ Jan 25 2015, 05:26 PM)
So what alternatives are available? sweat.gif
*
the R9 290 biggrin.gif

i think we can still recommend the GTX 970, but let the fella know about the possible VRAM issue? hmm.gif
marfccy
post Jan 25 2015, 11:23 PM



QUOTE(chocobo7779 @ Jan 25 2015, 05:32 PM)
Though wasn't this pretty much confirmed, unless nVidia issues a fix/recall? hmm.gif
*
Nvidia is taking a look at the issue

so we'll have to wait and see what their verdict is

QUOTE(terradrive @ Jan 25 2015, 10:16 PM)
Since when are Haswell-E and GeForce considered workstation parts? doh.gif

A workstation uses Xeon and Quadro
*
a home workstation biggrin.gif

it's called the poor man's workstation afaik
marfccy
post Jan 26 2015, 02:08 AM



QUOTE(SSJBen @ Jan 26 2015, 01:34 AM)
They already have a verdict; the 970 has a gimped 0.5GB of VRAM, and in comparison to the 980, the performance drop according to their "testing" is only between 1-4%. doh.gif

Yeah. yawn.gif
*
i still have a feeling they accidentally cut off too much when they were trimming down the GM204 chip laugh.gif
