 NVIDIA GeForce Community V13

marfccy
post Nov 23 2014, 07:57 PM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(shikimori @ Nov 23 2014, 05:53 PM)
Huh, you got an AMD-endorsed game instead of one from Nvidia-in-bed-with-Ubisoft? I'm contemplating whether to get another GTX 970 for SLI, but my case can't fit another G1 Gaming GTX 970. Is it true that even if I use a different brand of GTX 970, the cards will run at the lower card's clocks to match speeds?

Will that be much of an issue in terms of performance? Looking at where PC gaming is heading, it seems a single card just won't cut it, even at 1080p with all the eye candy on. I'm a graphics whore and a 60fps addict.

Will combining a Zotac GTX 970, MSI GTX 970, or Asus Strix 970 with a G1 Gaming GTX 970 be a problem?
*
Get a new case.

Provided they're the same GPU model, there's no issue regardless of cooler type. The driver detects the different cards and matches them to the slower one's clocks.

But of course, it's always better to pair identical models (think of it like RAM).
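To make the clock-matching point concrete, here's a minimal sketch (an illustration of the behaviour, not actual driver logic; the boost-clock figures are just example numbers for a factory-OC G1 Gaming versus a reference-clocked card):

```python
# Sketch: in SLI, same-model cards with different factory clocks all run
# at the slowest card's clock. The clock figures below are example values.

def effective_sli_clock_mhz(card_clocks_mhz):
    """All cards in the SLI group are matched to the slowest one."""
    return min(card_clocks_mhz)

clocks = {
    "Gigabyte G1 Gaming GTX 970 (factory OC)": 1329,
    "Reference-clocked GTX 970": 1178,
}
print(effective_sli_clock_mhz(clocks.values()))  # -> 1178
```

So a mixed pair works; you just give up the faster card's factory overclock (though you can usually overclock it back manually).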

QUOTE(jamilselamat @ Nov 23 2014, 07:07 PM)
I'd like to get a 970 too, what with its 4GB of memory (I play Skyrim, and VRAM is crucial for mods).

Then again, my concern is bus speed. Only 256bit seems like it'll be a bottleneck. Can it truly use all 4GB of that VRAM? What about the rumored 8GB versions of 970/980? I imagine that much VRAM means those cards are targeted at people who do SLI. Would it mean anything at all for single-GPU?
*
The Maxwell architecture addresses memory more efficiently than Kepler-based cards. Although 256-bit looks narrow next to the 384-bit bus on the 780 and above, it's still more than capable.

QUOTE
NVIDIA has also been focused on memory efficiency, for both performance and power reasons, and has greatly increased the L2 cache size: from 256KB on GK107 to 2MB on GM107, and from 512KB on GK104 to the same 2MB on GM204. The larger cache reduces the amount of traffic that needs to cross the memory bus, cutting the power spent on the bus and improving overall performance.


http://anandtech.com/show/8526/nvidia-gefo...tx-980-review/2

EDIT: With my 3GB GTX 780, I've hit the limit several times in games like Shadow of Mordor, COD: Advanced Warfare, and Watch Dogs; GPU-Z shows them eating up the full 3GB of VRAM.
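For a rough sense of scale on the bus-width point, here's a back-of-the-envelope bandwidth calculation (using the commonly quoted reference memory specs; treat the exact figures as approximate):

```python
# Peak memory bandwidth ~= bus width in bytes * effective memory data rate.
# Commonly quoted reference specs:
#   GTX 970: 256-bit bus @ 7.0 GT/s effective
#   GTX 780: 384-bit bus @ 6.0 GT/s effective

def bandwidth_gb_s(bus_width_bits, data_rate_gtps):
    return bus_width_bits / 8 * data_rate_gtps

gtx970 = bandwidth_gb_s(256, 7.0)
gtx780 = bandwidth_gb_s(384, 6.0)
print(f"GTX 970: {gtx970:.0f} GB/s, GTX 780: {gtx780:.0f} GB/s")
# -> GTX 970: 224 GB/s, GTX 780: 288 GB/s
```

The 780 still has more raw bandwidth, but as the AnandTech quote explains, Maxwell's much larger L2 cache means far less traffic has to cross the bus in the first place.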

This post has been edited by marfccy: Nov 23 2014, 07:59 PM
marfccy
post Nov 23 2014, 08:29 PM

QUOTE(jamilselamat @ Nov 23 2014, 08:21 PM)
Whoa...

Based on your input, I guess I'm picking up a 970 after all.

Goddamnit, January just can't come soon enough.
*
Just hold out until AMD releases its new cards; there's a good chance Nvidia will counter with "real" 20nm Maxwell parts.

But if your current card is more than two years old, that alone justifies the upgrade.
marfccy
post Nov 25 2014, 03:19 AM

QUOTE(wilsonz92 @ Nov 24 2014, 08:53 PM)
Hmmm, since you talked about CPUs I also went and checked online. I'm currently using an old i5-2400. Some say it won't bottleneck, some say it depends on the game. I'm confused.
*
Little to no bottleneck; my i5-2500 still isn't struggling in most games.

Unless you play Battlefield 4, Watch Dogs, or Assassin's Creed Unity, you don't really need the latest CPU.
marfccy
post Nov 25 2014, 04:49 AM

QUOTE(shikimori @ Nov 25 2014, 04:29 AM)
Even my shitty i7-2600K doesn't show much usage when playing new games. I just don't see the point of upgrading to a newer CPU unless you're doing video editing, compression, or other CPU-intensive work.
*
Aye, and even for the workloads you mentioned, it's still not really worth it.

Comparing Sandy Bridge to Haswell in video rendering, the improvement is only a couple of seconds.

Intel, what are you doing?
marfccy
post Dec 8 2014, 08:28 PM

QUOTE(stringfellow @ Dec 8 2014, 09:23 AM)
Purchased this from Newegg. Hopefully arriving before my birthday this 16th.



Acer XB280HK 28" 3840x2160 4K monitor with Nvidia G-Sync.

I have the ROG Swift 144Hz 2560x1440 G-Sync monitor, but pushing framerates up to 144fps to take full advantage of it requires a serious downgrade in quality settings, compared to keeping settings maxed at 4K and 60Hz.
*
It's the very same one APES promoted during the convention at KLCC, right?

G-Sync is indeed good (I was there to test the monitor as well), but yeah, 144Hz is really hard to reach even with one or two cards.

Do give us a review when it arrives.
marfccy
post Dec 9 2014, 12:46 AM

QUOTE(stringfellow @ Dec 8 2014, 10:15 PM)
That one is the Asus ROG Swift, 27" 2560x1440 at 144Hz with G-Sync. This one is the Acer XB280HK, 28" 3840x2160 at 60Hz with G-Sync. Both are priced at USD799, so it depends on which you prefer: 1440p at 144 frames per second or 4K at 60. I find my usage fits 4k60 better than 1440p144. Like I mentioned earlier, it's hard to push a game to 144fps to get the full benefit of the ROG Swift panel WITHOUT dropping quality settings. With 4k60, I can opt out of AA entirely and still stay above 45fps. And from Linus's observation in the video I linked above, framerates around that figure get smoothed out by G-Sync, eliminating both tearing and lag.
*
Which is exactly what G-Sync is built for: removing tearing and input lag.

I'm on your side as well; I think 144Hz is unrealistic, considering some games cap your FPS (Assassin's Creed, Shadow of Mordor, etc.).

Personally, though, I wish there were more 75-90Hz monitors. Even if the extra smoothness isn't hugely noticeable, it's still better than 60Hz.
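The frame-time budgets show why 144Hz is so demanding. A quick arithmetic sketch:

```python
# Frame-time budget: to keep a display saturated at a given refresh rate,
# each frame must be finished within 1000 / Hz milliseconds.

def frame_budget_ms(refresh_hz):
    return 1000.0 / refresh_hz

for hz in (60, 75, 90, 144):
    print(f"{hz:>3} Hz -> {frame_budget_ms(hz):5.2f} ms per frame")
# 144 Hz leaves ~6.94 ms per frame versus ~16.67 ms at 60 Hz --
# less than half the time to render each frame.
```

That budget gap is why hitting 144fps usually means dropping quality settings, while 75-90Hz would only shave a few milliseconds off the 60Hz budget.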
marfccy
post Dec 9 2014, 04:12 PM

QUOTE(TristanX @ Dec 9 2014, 01:51 PM)
Where got RM1299? It's RM1399.

Already got 240GB SSD. Don't need another.
*
You can always sell it for some extra bucks.
marfccy
post Dec 10 2014, 03:23 AM

QUOTE(TristanX @ Dec 9 2014, 04:52 PM)
Too small for my time.

Maxed out at 74C on my Leadtek GTX970 Hurricane. Haven't tested a higher OC yet.
*
Pretty darn good, but tested with what?

Unigine?
marfccy
post Dec 21 2014, 12:14 AM

QUOTE(Ojil @ Dec 20 2014, 10:50 PM)
Going to buy a Gainward GTX 970 reference card.
Are there any forumers here using it?
What's the max temp of the card at full load?
*
Someone commented elsewhere that his temps reached 80C under load, which sounds about normal to me.
marfccy
post Dec 24 2014, 02:07 AM

QUOTE(ngkhanmein @ Dec 23 2014, 10:08 PM)
Use ReClock?

The 347.09 driver is out! Anyone tested it?

http://www.nvidia.com/download/driverResul...spx/80913/en-us
*
That's the shortest set of release highlights I've seen from Nvidia so far.
marfccy
post Dec 26 2014, 02:34 AM

QUOTE(shikimori @ Dec 26 2014, 12:06 AM)
Just went mental and bought another G1 GTX 970 after failing to hunt down a 4K monitor.

Now I've got three GTX 970s. Looking to sell the Leadtek.
*
Crazy impulse purchase.
marfccy
post Dec 27 2014, 05:13 PM

QUOTE(TristanX @ Dec 23 2014, 09:13 PM)
Is there a way to make my refresh rate 60hz? I can only see 59.9hz in game for fullscreen.
*
Random question: does the 0.1Hz difference even matter?
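For what it's worth, a quick calculation of the gap (the 59.9Hz readout is most likely the NTSC-derived 59.94Hz display mode, but taking 59.9 at face value):

```python
# Per-frame time at a true 60 Hz versus the reported 59.9 Hz.
t60 = 1000.0 / 60.0               # ~16.667 ms per frame
t599 = 1000.0 / 59.9              # ~16.694 ms per frame
delta_us = (t599 - t60) * 1000.0  # difference in microseconds
print(f"{delta_us:.1f} us longer per frame")  # -> 27.8 us longer per frame
```

That's about 28 microseconds per frame, far below anything you could perceive.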
