 NVIDIA GeForce Community V13

marfccy
post Sep 20 2014, 10:13 PM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(rav3n82 @ Sep 20 2014, 10:06 PM)
Yes, this is true. If you watch the Nvidia promotional video on the GTX 980, towards the end where they do an exploded view of the 980 piece by piece, you can see the triple-heatpipe design. It is significantly cheaper to go with a standard copper heatpipe design than with a vapor chamber baseplate, which is expensive to produce. And the GM204 core doesn't run as hot as the earlier GK110 anyway. Smart of Nvidia to do this so they can keep the cost of the GTX 980 as low as possible.
*
This is where I'm skeptical: reviews did mention that despite the lower TDP, the GTX 980 still hits the thermal limit for GPU Boost 2.0.

I know cost is important, since they're now offering a cheaper flagship compared to when the GTX 780/Ti launched,

but sadly the non-vapor-chamber cooler just won't cut it for better cooling performance
marfccy
post Sep 21 2014, 03:59 AM

QUOTE(KHOdin @ Sep 21 2014, 01:00 AM)
Anyone know if my CPU (the one in my siggy) will bottleneck a 970?
*
My i5-2500 (2nd-gen Sandy Bridge) barely bottlenecks my 780, so why should you be afraid?

Games are only now starting to utilise more cores; an i5 can last pretty long if you ask me
marfccy
post Sep 22 2014, 09:15 PM

QUOTE(Skidd Chung @ Sep 22 2014, 05:54 PM)
Want to ask if the latest 344 drivers are screwing with anyone's GTX 760?
*
I can only report this, though:

while idling on the desktop after gaming, the GPU suddenly ramps its clock up to 600MHz,

resulting in the GPU fan running at a constant ~40% speed.

Can't do anything to revert it except a PC restart

QUOTE(pspslim007 @ Sep 22 2014, 09:08 PM)
Hi guys, is there any news on a reference 970 coming to Malaysia? I prefer a reference cooler over an aftermarket one. Thanks!
*
Reference-design PCB, or reference cooler?

AFAIK, Nvidia said the 970 won't ship with a reference cooler
marfccy
post Sep 25 2014, 12:19 AM

QUOTE(rav3n82 @ Sep 24 2014, 11:15 PM)
The problem with most reference-based cards (assuming the base-level Zotac 970 is similarly based on one, if such a reference design for the 970 even exists) is that they tend to exhibit a certain level of coil whine under load (usually when running benchmarks). And what SSJBen says is true; my personal experience with Asus and MSI cards is that they really don't have such issues. I believe it's because they use superior-quality chokes (SAP for Asus, SFC for MSI). Fortunately, my reference-based 780 AMP! Edition does not have any coil whine. But most reference-based cards I've tested sadly have this problem.
*
The only brand I know releasing a reference-cooler GTX 970 is Manli:
http://www.manli.com/en/product/geforce-gtx970
marfccy
post Sep 25 2014, 02:27 AM

Anyone have any idea why the Zotac GTX 980/970 Omega has that huge piece mounted on top of the fans?

Does it actually do anything?
marfccy
post Sep 26 2014, 01:13 PM

QUOTE(feekle @ Sep 26 2014, 12:41 PM)
Only Gigabyte has a side LED on its 970, is that right? I know this may sound silly, but the side LED illuminating the logo is a huge attraction for me.
*
Unless you take the reference card or Gigabyte's Windforce, I don't think the others have bling.

MSI and ASUS are all just huge heatsinks.

Or wait till ASUS releases a GTX 980 Matrix?
marfccy
post Sep 27 2014, 04:37 PM

QUOTE(devillightning1 @ Sep 27 2014, 04:30 PM)
Just sold one of my R9 290s to buy a GTX 970... since there is no reference GTX 970 (sad, I wanted to do SLI), which one is best, MSI or Gigabyte? I'm aiming for either one. I prefer the Gigabyte for looks, but I'm not sure about its cooling against the MSI Gaming.
*
From reviews, both are damn good coolers. The only differences I noticed:

Windforce
- long card
- no 0dB fan-stop feature (if that matters to you)
- has blue bling

Twin Frozr
- respectable length
- 0dB fan-stop feature
- black and red theme
- no bling
marfccy
post Sep 27 2014, 04:56 PM

QUOTE(devillightning1 @ Sep 27 2014, 04:41 PM)
Dunno if it's longer than the Tri-X; this Tri-X is the LONGEST GPU I've ever owned... Anyway, the Twin Frozr is RM100 more than the Gigabyte, hmm...
*
The GTX 970 WF is slightly longer than 1 foot AFAIK (1 foot is about 30.5cm),

should be more or less similar to the Tri-X.

Sorry, didn't bother to double-check with Google
marfccy
post Sep 27 2014, 08:23 PM

QUOTE(law1777 @ Sep 27 2014, 07:41 PM)
How much can you sell a 2nd-hand 290 for now? How much do you lose from selling it off? The 970's price plus the loss from selling the 290 is already a lot.
*
See his siggy: RM1.1k for one card, estimated sale price around RM900-1k after nego.

Give or take, he spends an extra ~RM500 for each 970?
marfccy
post Sep 28 2014, 09:15 AM

QUOTE(devillightning1 @ Sep 27 2014, 11:05 PM)
900 wadafak?
*
Failed price estimation, don't mind me
marfccy
post Oct 1 2014, 02:30 PM

QUOTE(empire23 @ Oct 1 2014, 11:57 AM)

Got bored this morning. Went out and bought this while waiting for its twin brother to fly in from the States.
*
Nice!
marfccy
post Oct 1 2014, 09:04 PM

QUOTE(sai86 @ Oct 1 2014, 02:49 PM)
Hmm, planning to disassemble my GPU (an Asus). One of the screws has a white round sticker on it; I think it's an indicator for warranty purposes, to show whether the screw has been removed.

So, um, is it safe to remove and re-stick it?
*
Got any ASUS reps in here?

DO IT
marfccy
post Oct 3 2014, 01:51 PM

QUOTE(sai86 @ Oct 3 2014, 10:19 AM)
47C idle in a Silverstone case. Hence I found it strange that running it naked in open air dropped temps by just 3-4C, even with a stand fan blowing fresh air on it from outside the window.
*
Load temps?

And you want to go for a custom cooler on top of that? Their coolers don't work well in cramped cases, since the hot air is mostly expelled back inside the casing.

#Teamrefdesign
marfccy
post Oct 3 2014, 08:22 PM

QUOTE(empire23 @ Oct 3 2014, 04:02 PM)
Just tried it with the R9 290. There is an obvious increase in image quality. Playing it at 2560x1600 is quite OK.

There's a decent hit to performance, but the game is still playable at around 30 to 40-ish FPS.
*
I find Shadow of Mordor quite well optimised.

Running downsampled from 2560x1440 with High textures (the rest maxed), I get a solid 60FPS even during massive battles in a stronghold (like 20-30 orcs rushing me). I tried Ultra too; it can average ~60FPS, but the framerate goes roller-coaster even during simple things like looking around.

My only qualm is that Vsync locks at 30FPS.

Wish they had AA settings too; the jaggies are quite unbearable
marfccy
post Oct 4 2014, 02:01 PM

QUOTE(rurushu @ Oct 3 2014, 11:26 PM)
Just wanna ask, will there be a significant difference when gaming at 1080p vs 4K?

Because IMHO, when you game, especially in fast-paced games, will your eyes even have time to really scrutinise all those details?
*
Yeah they do: jaggies won't be as apparent,

and textures look sharper and less pixellated.

In fast-paced games, not so much, since you'll be more focused on fragging (say in an FPS),

but in campaigns and SP modes I take my sweet time enjoying the graphics
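The jaggies point can be put in numbers: pixel density. A quick sketch below compares the two resolutions on a hypothetical 28" panel (the panel size and the `ppi` helper are my own illustration, not anything from the thread):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution on a given panel diagonal."""
    diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
    return diagonal_px / diagonal_in

# Same hypothetical 28" panel, two resolutions
density_1080p = ppi(1920, 1080, 28)
density_4k = ppi(3840, 2160, 28)
print(f"1080p: {density_1080p:.1f} PPI, 4K: {density_4k:.1f} PPI")
```

4K packs exactly twice the linear pixel density of 1080p on the same panel, so each stair-step on an edge is half the size, which is why jaggies are less apparent even without AA.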
marfccy
post Oct 5 2014, 06:45 PM

QUOTE(SSJBen @ Oct 5 2014, 06:34 PM)
I think unless you really only play the latest games at 4K, it's still quite a bad time to jump into 4K monitors.

Throughout the next year we're going to see a huge influx of 1440p monitors (some of which we can expect to be G-Sync 2.0 compatible at a much lower price), 4K monitors will drop a lot in price, and Windows 10 will have proper support for 4K resolution, as will many other programs.

So no, unless it's mostly for playing the latest games in 4K (for which you'll need a multi-GPU setup as well), 4K isn't worth the investment at this point in time.
*
I really hope a 2560x1440 monitor gets down to around RM1.2k (or cheaper);

most 1440p monitors are >RM1.5k
marfccy
post Oct 5 2014, 08:39 PM

QUOTE(SSJBen @ Oct 5 2014, 07:17 PM)
I second that, bro. It's been a while now that 1440p monitors have remained stagnant at above RM1.5k.

Where are the 1440p, native 120Hz (non-LightBoost) monitors from Dell, LG, Samsung or even, dare I say it, Acer? I mean, Qnix has been doing it for how long now (though theirs need to be overclocked and results aren't guaranteed)?

Also, we need G-SYNC 2.0 here ASAP and at a much lower price point; $200 for the kit alone is still out of reach for the majority of gamers. Running on a single DP 1.2 port isn't much fun, even though it is, in theory, what most people really use.
*
Prices like that are giving me second thoughts about getting a second monitor.

I don't mind settling for 60Hz and no G-Sync; I just want an affordable 1440p
marfccy
post Oct 7 2014, 11:01 PM

QUOTE(Najmods @ Oct 7 2014, 10:35 PM)
How hard is it to pull the damn sticker off? One stupid thing MSI did is glue the middle bit, where the sticker meets the fan centre. That's where people damage the fan. If they didn't do that, this wouldn't happen.
*
They don't know how to use alcohol to dissolve the adhesive?

Or a hair dryer to "weaken" the glue?

marfccy
post Oct 20 2014, 08:31 PM

QUOTE(defaultname365 @ Oct 20 2014, 12:45 PM)
Anyone with a GTX 980/970 or a 6GB VRAM card played Shadow of Mordor with the HD Texture Pack + Ultra textures turned on?

I tried it on my now merely mortal GTX 680 2GB, and it's a slideshow fest at 1080p about 40-50% of the time.

Well, I've read how some gamers are freaking out and calling it BS without actually downloading the HD Texture Pack in the first place, since it recommends 6GB of VRAM for Ultra.

That said, they have discovered that Shadow of Mordor will use at most 3.7GB of VRAM (at 1080p) on Ultra.

I am waiting patiently for the rumoured 8GB GTX 980... no idea if it will ever benefit a mere 1920x1080 display, or whether my i7-870 (OC'd to 4GHz) will bottleneck.

Ultra:
[screenshot]

High:
[screenshot]

Open the images in new windows and compare them side by side; there's a minimal but somewhat noticeable difference.

Source:
http://forums.guru3d.com/showpost.php?p=4926657&postcount=85
*
You can try asking cstkl1 for input; AFAIK he has a Titan Black, so he should be able to help with the texture issue
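On the VRAM point in that quote: render targets scale with resolution, but textures don't, which is why an Ultra texture pack demands roughly the same VRAM at 1080p as at higher resolutions. A rough back-of-envelope sketch (the buffer count and RGBA8 format are illustrative assumptions, not the game's actual allocations):

```python
def framebuffer_mib(width, height, bytes_per_pixel=4, buffer_count=3):
    """Rough size of the colour buffers alone (e.g. triple-buffered RGBA8)."""
    return width * height * bytes_per_pixel * buffer_count / 2 ** 20

fb = framebuffer_mib(1920, 1080)
print(f"1080p colour buffers: ~{fb:.0f} MiB")
# Well under 100 MiB -- the bulk of the reported ~3.7GB is texture data,
# which stays the same size regardless of display resolution.
```

So dropping from 4K to 1080p saves surprisingly little VRAM; the texture pack, not the framebuffer, is what eats the card's memory.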
marfccy
post Oct 20 2014, 10:32 PM

QUOTE(trent666 @ Oct 20 2014, 10:17 PM)
Guys, what do you think of the Gainward Phantom 970? Idealtech is selling it at RM1199. The clock speed reaches 1304MHz with "boost".

What is this "boost"? Do I only turn it on when gaming?
*
Boost is where the GPU increases its clock speed to give better performance. Granted, under GPU Boost 2.0 it only applies as long as:

- you don't reach the thermal limit (the standard target is 80C)

It's useful when you're playing a graphically demanding game and your superior cooling (say, a watercooled GPU) keeps the core cool enough to sustain the highest boost clock. You don't switch it on manually; it engages automatically whenever there's headroom.

EDIT: it mostly matters for gaming performance AFAIK, unless you use GeForce GPUs for purposes other than gaming
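To make the thermal-limit behaviour concrete, here's a toy model of how boost backs the clock off once the core hits the thermal target. This is only a sketch, not Nvidia's actual algorithm: the 1304MHz figure is the Gainward card's advertised boost from the quote, while the base clock and the 13MHz-per-degree step-down are invented for illustration (real boost also weighs power limits and voltage bins):

```python
def boost_clock_mhz(core_temp_c, base=1050, max_boost=1304, thermal_limit=80):
    """Toy GPU Boost 2.0 model: run the top boost bin below the thermal
    limit, then step the clock down towards the base clock as the core
    heats up. Figures are illustrative, not real firmware behaviour."""
    if core_temp_c < thermal_limit:
        return max_boost
    step_down = 13 * (core_temp_c - thermal_limit)  # invented rate
    return max(base, max_boost - step_down)

for temp in (65, 79, 85, 95):
    print(f"{temp}C -> {boost_clock_mhz(temp)} MHz")
```

This is why better cooling (like the watercooling example above) translates directly into higher sustained clocks: the card simply spends more time below the thermal target.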

This post has been edited by marfccy: Oct 20 2014, 10:34 PM
